Jobs
Interviews

5 ELT Pipelines Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Jaipur, Rajasthan

On-site

You are a Sr. Data Engineer with a strong background in building ELT pipelines and expertise in modern data engineering practices. You are experienced with Databricks and DBT, proficient in SQL and Python, and have a solid understanding of data warehousing methodologies such as Kimball or Data Vault. You are comfortable working with DevOps tools, particularly within AWS, Databricks, and GitLab. Your role involves collaborating with cross-functional teams to design, develop, and maintain scalable data infrastructure and pipelines using Databricks and DBT.

Your responsibilities include designing, building, and maintaining scalable ELT pipelines for processing and transforming large datasets efficiently in Databricks. You will implement Kimball data warehousing methodologies or other multi-dimensional modeling approaches using DBT. Leveraging AWS, Databricks, and GitLab, you will implement CI/CD practices for data engineering workflows. Additionally, you will optimize SQL queries and database performance, monitor and fine-tune data pipelines and queries, and ensure compliance with data security, privacy, and governance standards.

Key qualifications for this role include 6+ years of data engineering experience, hands-on experience with Databricks and DBT, proficiency in SQL and Python, experience with Kimball data warehousing or Data Vault methodologies, familiarity with DevOps tools and practices, strong problem-solving skills, and the ability to work in a fast-paced, agile environment. Preferred qualifications include experience with Apache Spark for large-scale data processing, familiarity with CI/CD pipelines for data engineering workflows, understanding of orchestration tools like Apache Airflow, and certifications in AWS, Databricks, or DBT.

In return, you will receive benefits such as medical insurance for employees, spouse, and children, accidental life insurance, provident fund, paid vacation time, paid holidays, employee referral bonuses, reimbursement for high-speed internet at home, a one-month free stay for employees moving from other cities, tax-free benefits, and other bonuses as determined by management.
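
To give a concrete, purely illustrative picture of the kind of work this listing describes, here is a minimal PySpark sketch of a Databricks-style ELT step that loads a Kimball-style dimension table. All table and column names are hypothetical; this is a sketch, not the employer's code.

```python
# Illustrative sketch only: a simplified Databricks-style ELT step loading a
# Kimball-style dimension. Table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("elt_dim_customer").getOrCreate()

# Extract/Load: read raw customer records already landed in the lakehouse.
raw = spark.read.table("raw.customers")

# Transform: deduplicate on the business key and add an audit column,
# a much-simplified dimension load.
dim_customer = (
    raw.dropDuplicates(["customer_id"])
       .withColumn("load_ts", F.current_timestamp())
       .select("customer_id", "first_name", "last_name", "country", "load_ts")
)

# Write the dimension table back to the warehouse layer.
dim_customer.write.mode("overwrite").saveAsTable("warehouse.dim_customer")
```

In a real pipeline of the kind described above, the transform layer would more likely live in DBT models and be deployed through GitLab CI/CD rather than in an ad hoc script.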

Posted 4 days ago

Apply

5.0 - 7.0 years

4 - 6 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:

Data Engineering & Development:
- Design, build, and maintain ETL/ELT pipelines using Hadoop ecosystem tools
- Write complex Hive queries for data transformation and analysis
- Work with HDFS for storage and efficient data access
- Develop UNIX shell scripts for automation of jobs and workflows
- Optimize SQL queries for performance and scalability

Data Processing & Integration:
- Process large volumes of structured and semi-structured data
- Integrate data from various sources into Hadoop-based data lakes
- Work with cross-functional teams to understand data requirements and deliver solutions

Monitoring, Maintenance & Quality:
- Monitor and troubleshoot production data pipelines
- Ensure data quality, integrity, and consistency across systems
- Support data ingestion, batch processing, and job scheduling (e.g., Oozie, Airflow)

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, IT, or related field
- 3-7 years of hands-on experience with Big Data tools
- Strong expertise in:
  - Hadoop Distributed File System (HDFS)
  - Hive (querying, optimization, partitioning)
  - SQL (advanced queries, joins, aggregations)
  - UNIX/Linux shell scripting
- Good understanding of data warehousing concepts and large-scale data processing
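
For illustration only, here is a minimal Python sketch of the kind of Hive job automation this listing describes: running a partitioned Hive load from a script and failing loudly on error. The database, table, and partition names are hypothetical, and the `hive` command-line client is assumed to be available on the host.

```python
#!/usr/bin/env python3
"""Illustrative sketch only: automating a partitioned Hive load from Python.
Database, table, and column names are hypothetical."""
import subprocess
import sys
from datetime import date

def run_hive(query: str) -> None:
    """Run a Hive query via the CLI and abort the job if it fails."""
    result = subprocess.run(["hive", "-e", query], capture_output=True, text=True)
    if result.returncode != 0:
        sys.exit(f"Hive job failed:\n{result.stderr}")

if __name__ == "__main__":
    ds = date.today().isoformat()
    # Load today's raw events into a date-partitioned table.
    run_hive(f"""
        INSERT OVERWRITE TABLE analytics.events PARTITION (ds='{ds}')
        SELECT user_id, event_type, event_ts
        FROM raw.events_staging
        WHERE to_date(event_ts) = '{ds}'
    """)
```

In practice, a wrapper like this would typically be invoked from a scheduler such as Oozie or Airflow, as the responsibilities above mention.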

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Software Engineer - Operational Support Systems at Barclays, you will embark on a transformative journey, spearheading the evolution of the digital landscape and driving innovation and excellence. Your primary responsibility will be the design, build, and maintenance of the underlying OSS infrastructure and toolchains necessary to run the Barclays Global Network, deployed across cloud and on-prem environments.

To excel in this role, you should possess expertise in both front-end and back-end development. Proficiency in Java (Java 17+) and the Spring ecosystem (Spring MVC, Data JPA, Security, etc.), along with strong SQL and NoSQL integration skills, is essential. Additionally, you should have experience with React.js, JavaScript UI frameworks such as Material UI and Ant Design, and state management tools like Redux, Zustand, or the Context API. Your role will also involve working with runtime technologies such as virtualization, containers, and Kubernetes, and implementing test-driven development using frameworks like Cypress, Playwright, or Selenium. Proficiency in CI/CD pipelines and tools like GitHub Actions, Jenkins, or GitLab CI, as well as knowledge of monitoring and observability tools like Grafana/ELK, is crucial for success in this position.

Highly valued skills for this role include expertise in building ELT pipelines, cloud/storage integrations, security practices (OAuth2, CSRF/XSS protection), performance optimization, and familiarity with Public, Private, and Hybrid Cloud technologies and Network domains.

In this role, you will collaborate with product managers, designers, and fellow engineers to develop high-quality software solutions that are scalable, maintainable, and optimized for performance. You will actively contribute to the organization's technology communities, stay informed of industry trends, and promote a culture of technical excellence and growth. As a Software Engineer at Barclays, you will be expected to adhere to secure coding practices, implement effective unit testing, and actively contribute to a culture of code quality and knowledge sharing. The role is based in the Pune office, where you will play a crucial part in designing, developing, and improving software that enhances business, platform, and technology capabilities for customers and colleagues.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Software Engineer - Operational Support Systems at Barclays, you will lead the evolution of the digital landscape by utilizing cutting-edge technology to enhance the digital offerings and ensure unmatched customer experiences. Your primary responsibility will be the design, construction, and maintenance of the OSS infrastructure and toolchains necessary to operate the Barclays Global Network across cloud and on-prem environments.

To excel in this role, you must demonstrate proficiency in front-end and back-end technologies, including Java (Java 17+), the Spring ecosystem, SQL, NoSQL, React.js, JavaScript, and state management. Additionally, you should have expertise in virtualization, containers, Kubernetes, test-driven development, CI/CD pipelines, monitoring, observability, ELT pipelines, cloud/storage integrations, security practices, and performance optimization.

Your role will involve collaborating with product managers, designers, and other engineers to define software requirements, devise solutions, and ensure alignment with business objectives. You will also participate in code reviews, promote a culture of code quality, and stay updated on industry technology trends to contribute to the organization's technology communities.

As a Vice President, your responsibilities may include setting strategies, driving requirements, managing resources, policies, and budgets, delivering continuous improvements, and advising key stakeholders on functional areas of impact and alignment. You will also demonstrate leadership, accountability for risk management, and collaboration with other business areas to achieve organizational goals.

All colleagues at Barclays are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

20 - 30 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

Looking for Data Engineers (immediate joiners only) for the Hyderabad, Bengaluru, and Noida locations. Must have experience in Python, Kafka Streams, PySpark, and Azure Databricks.

Role and responsibilities:
- Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks
- Architect scalable data streaming and processing solutions to support healthcare data workflows
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data
- Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.)
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions
- Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows
- Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering
- Stay updated with the latest cloud technologies, big data frameworks, and industry trends

Preferred candidate profile:
- 5+ years of experience in data engineering, with strong proficiency in Kafka and Python
- Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing
- Experience with Azure Databricks (or willingness to learn and adopt it quickly)
- Hands-on experience with cloud platforms (Azure preferred; AWS or GCP is a plus)
- Proficiency in SQL, NoSQL databases, and data modeling for big data processing
- Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications
- Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus
- Strong analytical skills, problem-solving mindset, and ability to lead complex data projects
- Excellent communication and stakeholder management skills

Interested? Call Rose (9873538143 / WA: 8595800635) or email rose2hiresquad@gmail.com
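
Purely as an illustration of the stack this listing names, here is a minimal Python sketch of a Kafka consumer using the kafka-python package. The topic name, broker address, and record fields are hypothetical; this is not the employer's code.

```python
# Illustrative sketch only: a minimal Kafka consumer for a real-time pipeline.
# Requires the kafka-python package and a reachable broker (both assumed).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "patient-events",                       # hypothetical topic
    bootstrap_servers="localhost:9092",     # hypothetical broker
    group_id="healthcare-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

# Consume records and apply a trivial transform before handing them to a
# downstream sink (in practice, a Databricks/Delta write or similar).
for message in consumer:
    record = message.value
    record["ingested"] = True
    print(record)  # stand-in for the actual sink
```

A production pipeline of the kind described above would more likely land these records via Spark Structured Streaming on Azure Databricks rather than printing them, but the consumer shape is the same.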

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
