1578 Data Pipeline Jobs - Page 22

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

25 - 30 Lacs

Hyderabad

Hybrid

Job Title: Senior Data Engineer AWS | Python | Data Pipelines
Experience: 6 - 10 Years
Location: Hyderabad
Employment Type: Full-time | Permanent
Notice Period: Immediate
About the Role: We are seeking a highly skilled Senior Data Engineer with deep expertise in AWS cloud services and Python-based data engineering. In this role, you will architect and build scalable, automated, and high-performance data platforms that power analytics, AI, and business intelligence across the organization.
Key Responsibilities: Architect and implement end-to-end data pipelines using AWS Glue, Lambda, EMR, Step Functions, and Redshift. Design and manage data lakes and warehouses on Amazon S3, Redshift, and A...
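For orientation, a minimal PySpark batch job of the kind such a Glue/EMR pipeline runs might look like the sketch below. This is an illustrative example only; the bucket names, paths, and columns are hypothetical and not taken from this posting.

# Minimal PySpark batch job of the sort a Glue/EMR pipeline runs (illustrative only;
# bucket names, paths, and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Ingest raw CSV files landed in the data lake (S3).
raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/orders/2024-01-01/")

# Basic wrangling: type casting, de-duplication, and a derived partition column.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write curated output back to S3 as partitioned Parquet, ready for Redshift Spectrum
# or a subsequent COPY into Redshift.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-curated-bucket/orders/")

In practice a job like this would be scheduled by Step Functions or Glue workflows, with the curated output loaded into Redshift for analytics.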

Posted 2 months ago


12.0 - 22.0 years

20 - 35 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Work from Office

Role & responsibilities
• Serve as the Subject Matter Expert (SME) for data engineering solutions within a high-stakes production support context.
• Provide expert-level troubleshooting, diagnosis, and resolution for complex incidents affecting data pipelines built on Azure Databricks, PySpark, SQL, Azure Data Factory (ADF), and other related Azure data services (e.g., Azure Data Lake Storage, Azure Synapse Analytics, Azure SQL Database).
• Conduct thorough root cause analysis (RCA) for recurring data issues, implementing sustainable long-term solutions to minimize downtime and prevent data discrepancies.
• Proactively monitor, optimize, and enhance existing data pipelines for performance, s...

Posted 2 months ago


3.0 - 5.0 years

8 - 17 Lacs

Pune

Hybrid

Company Introduction
Coditas is a new-age, offshore product development organization offering services spanning the entire software development life cycle. Headquartered in Pune, Coditas works with clients across the globe. We attribute our organic growth to an engineering-driven culture and steadfast philosophies around writing clean code, designing intuitive user experiences, and letting the work speak for itself.
Job Description
We are looking for data engineers who have the right attitude, aptitude, skills, empathy, compassion, and hunger for learning to build products in the data analytics space. A passion for shipping high-quality data products and an interest in the data products space;...

Posted 2 months ago


3.0 - 8.0 years

15 - 30 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Hiring an ETL Developer to design, develop, and maintain robust data integration pipelines. The ideal candidate will have strong SQL skills and hands-on experience with ETL tools like SSIS, Informatica, or Talend. Apply now: recruiter2@rankskills.in
Perks and benefits: Mediclaim Policy & Personal Accident Policy

Posted 2 months ago


5.0 - 7.0 years

5 - 5 Lacs

Bengaluru

Work from Office

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.
Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development mainten...
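As a rough sketch of the ingest-wrangle-join work this proficiency describes (source paths, schemas, and the Delta output are invented for illustration, not prescribed by the role):

# Illustrative PySpark join of two ingested sources (all names hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer_orders_join").getOrCreate()

customers = spark.read.parquet("/data/raw/customers")  # e.g. extracted from an RDBMS
orders = spark.read.json("/data/raw/orders")            # e.g. landed from an event stream

# Wrangle: normalise keys and filter out malformed rows before joining.
orders_clean = (
    orders.filter(F.col("order_id").isNotNull())
          .withColumn("customer_id", F.col("customer_id").cast("long"))
)

# Join and aggregate into a curated table suitable for a warehouse or lakehouse layer.
summary = (
    orders_clean.join(customers, on="customer_id", how="inner")
                .groupBy("customer_id", "country")
                .agg(F.count("order_id").alias("order_count"),
                     F.sum("amount").alias("total_amount"))
)

# Delta Lake output assumes the Delta connector is available on the cluster.
summary.write.mode("overwrite").format("delta").save("/data/curated/customer_order_summary")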

Posted 2 months ago


3.0 - 8.0 years

0 - 0 Lacs

Bengaluru

Work from Office

SUMMARY: Wissen Technology is hiring for .Net + AWS Developer
About Wissen Technology: At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia. Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-ed...

Posted 2 months ago


6.0 - 11.0 years

15 - 30 Lacs

Bengaluru

Hybrid

About the Role
This role is part of the MarTech Data Engineering POD supporting a large-scale ESP (Email Service Provider) migration program from Oracle Responsys to Adobe Journey Optimizer. The initiative aims to build a scalable, real-time marketing data foundation integrating customer journeys, preference orchestration, and omnichannel personalization across email, mobile, and direct channels. The offshore team will focus on designing, building, and maintaining data pipelines across SAS, OES, Wunderkind, and AEP (Adobe Experience Platform), enabling real-time campaign execution and advanced targeting capabilities.
Key Responsibilities
Design and develop data ingestion, transformation, ...

Posted 2 months ago


5.0 - 8.0 years

20 - 35 Lacs

Noida

Work from Office

• Provide technical leadership and mentorship to a team of data engineers.
• Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience with GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus.
• Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing.
• Conduct code reviews and design reviews, and provide constructive feedback to team members.
• Stay up-to-date with the latest technologies and trends in data engineering.
Data Pipeline Development:
• Develop and maintain robust and efficient data pipelines to ingest, process, and transform large volumes of structured and...
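Since Cloud Composer is managed Apache Airflow, orchestration in this stack usually means authoring DAGs. A bare-bones, hypothetical example is sketched below; the DAG id, task logic, and schedule are placeholders rather than anything from this posting.

# Hypothetical Airflow DAG of the kind run on Cloud Composer (illustrative sketch only).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events(**context):
    # Placeholder: pull a day's worth of events from a source system.
    print("extracting events for", context["ds"])

def load_to_bigquery(**context):
    # Placeholder: load the transformed batch into a BigQuery table.
    print("loading partition", context["ds"])

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
    extract >> load

In a CI/CD setup, DAG files like this are typically validated in automated tests and deployed to the Composer environment from version control.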

Posted 2 months ago


7.0 - 8.0 years

2 - 5 Lacs

Hyderabad

Work from Office

We are looking for a skilled and experienced Databricks Engineer to join our team at our Hyderabad (Madhapur) office. The ideal candidate will have a strong background in data engineering, hands-on experience with Databricks, and a solid understanding of cloud platforms (AWS, Azure, or GCP). This is a great opportunity to work on cutting-edge data projects in a collaborative and fast-paced environment.
Key Responsibilities:
- Design and implement scalable and efficient data pipelines, ETL/ELT processes, and data integration solutions using Databricks.
- Work with large-scale structured and unstructured datasets using Python, PySpark, SQL, and NoSQL technologies.
- Collaborate with cross-fun...

Posted 2 months ago


5.0 - 8.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Role: Sr. Data Engineer - Snowflake
Experience: 5-8 years
Location: Hyderabad
Job Overview: We are seeking a highly skilled and experienced Snowflake Data Engineer with a strong background in SQL, Python, and a minimum of 3 years of hands-on Snowflake development experience. The ideal candidate will be certified, with a proven track record in data warehousing, data modeling, performance optimization, and implementing scalable ETL/ELT pipelines using industry-standard tools. Experience in leading small delivery tracks or mentoring junior engineers is highly desirable.
Primary Roles & Responsibilities:
• Design, develop, and optimize scalable data pipelines and ETL/ELT processes in Snowflake. ...
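As a minimal illustration of the kind of ELT step such a role covers (the connection parameters, stage, and table names below are placeholders, not details from this posting):

# Minimal ELT step against Snowflake using the Python connector
# (connection parameters, stage, and table names are hypothetical).
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account identifier
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Stage raw files into a landing table, then merge into the curated model.
    cur.execute("COPY INTO STAGING.ORDERS_RAW FROM @orders_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
    cur.execute("""
        MERGE INTO CURATED.ORDERS t
        USING STAGING.ORDERS_RAW s ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, AMOUNT) VALUES (s.ORDER_ID, s.AMOUNT)
    """)
finally:
    conn.close()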

Posted 2 months ago


6.0 - 11.0 years

5 - 15 Lacs

Ahmedabad

Remote

Must-Have Skills:
• Strong hands-on experience with Python, ETL processes, and Snowflake
• Expertise in PySpark and Big Data Engineering
• Proven ability to create and manage data pipelines and data systems

Posted 2 months ago


5.0 - 10.0 years

7 - 12 Lacs

Baheri

Work from Office

The AI Engineer plays a critical role in advancing LabVantage's AI capabilities by designing, evaluating, and integrating state-of-the-art AI and generative models into our product suite. This includes selecting and benchmarking large language models (LLMs), developing applications using Retrieval-Augmented Generation (RAG) and agentic AI, and collaborating with engineering teams to deliver innovative, scalable, and robust AI-driven features. Additional responsibilities include building and maintaining data pipelines, developing APIs for model access, and staying at the forefront of AI innovation to ensure LabVantage remains a leader in applied AI. The ideal candidate has a strong understandi...
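To make the RAG responsibility concrete: the core retrieval step reduces to embedding a query, ranking stored document embeddings by similarity, and passing the top matches to the LLM as grounding context. The toy sketch below illustrates only that shape; embed() is a stand-in for a real embedding model, and the documents are invented.

# Toy sketch of the retrieval step in a RAG application (illustrative only).
# embed() stands in for a real embedding model; a production system would call one.
import math

def embed(text: str) -> list[float]:
    # Placeholder embedding: character-frequency vector.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

documents = [
    "Sample retention policy for lab results.",
    "Instrument calibration procedure overview.",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(query: str, k: int = 1) -> list[str]:
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

# The retrieved passages would then be placed into the LLM prompt as context.
print(retrieve("how are instruments calibrated?"))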

Posted 2 months ago


4.0 - 8.0 years

3 - 24 Lacs

Pune

Work from Office

Responsibilities: Design, develop, and maintain ETL/ELT pipelines to collect, process, and store large volumes of structured and unstructured data. Build and optimize data models to support analytics, business intelligence, and machine learning.

Posted 2 months ago


9.0 - 13.0 years

20 - 27 Lacs

Noida

Work from Office

Role & responsibilities
I came across your profile for the Databricks Architect role with us and would like to check with you regarding the below full-time/permanent job opportunity. Please check the details below and let me know if you would be interested in applying.
Position: Databricks Architect
Location: Noida
Role: Full-Time
Company: Stefanini Group (https://stefanini.com/en/)
Role Overview: A Data Architect designs, implements, and manages scalable data solutions on the IICS and Databricks platforms. This role involves collaborating with business stakeholders to develop data pipelines, optimize performance, and ensure best practices in data hand...

Posted 2 months ago


5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

As a Senior Data Engineer in the Engineering team, join us in:
- Building and optimizing data architectures to handle large-scale data from various sources, including cloud and on-premise systems.
- Architecting scalable ETL and data workflows to support advanced analytics and machine learning models.
- Collaborating with cross-functional teams to deliver end-to-end data solutions that drive business insights.
- Owning and leading projects that focus on data platform modernization, ensuring high performance, security, and scalability.
- Working on complex data engineering challenges that push the boundaries of cloud data infrastructure.
You'll have: A Bachelor's or Master's degree in Computer Science, ...

Posted 2 months ago


15.0 - 19.0 years

0 Lacs

Maharashtra

On-site

Role Overview: As a Project Manager/Senior Project Manager at GormalOne LLP, you will play a crucial role in driving data-centric product initiatives for the DairyTech ecosystem. Your focus will be on building scalable data platforms, enabling advanced analytics, and delivering actionable insights through intuitive dashboards and visualization tools to empower decision-making across supply chain, operations, sales, and customer engagement.
Key Responsibilities:
- Define and own the product strategy for the Data & Analytics platform to support business growth in the DairyTech domain.
- Identify data-driven opportunities in collaboration with business stakeholders, ensuring alignment with orga...

Posted 2 months ago


5.0 - 10.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Job Purpose and Impact
The Professional, Data Engineering job designs, builds, and maintains moderately complex data systems that enable data analysis and reporting. With limited supervision, this job collaborates to ensure that large sets of data are efficiently processed and made accessible for decision making.
Key Accountabilities
DATA & ANALYTICAL SOLUTIONS: Develops moderately complex data products and solutions using advanced data engineering and cloud-based technologies, ensuring they are designed and built to be scalable, sustainable, and robust.
DATA PIPELINES: Maintains and supports the development of streaming and batch data pipelines that facilitate the seamless ingestion of data fr...

Posted 2 months ago


4.0 - 9.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Job Purpose and Impact
The Senior Professional, Data Engineering job designs, builds, and maintains complex data systems that enable data analysis and reporting. With minimal supervision, this job ensures that large sets of data are efficiently processed and made accessible for trading decision making.
Key Accountabilities
DATA INFRASTRUCTURE: Prepares data infrastructure to support the efficient storage and retrieval of data.
DATA FORMATS: Examines and resolves appropriate data formats to improve data usability and accessibility across the organization.
DATA & ANALYTICAL SOLUTIONS: Develops complex data products and solutions using advanced engineering and cloud-based technologies, ensuring t...

Posted 2 months ago


5.0 - 10.0 years

3 - 6 Lacs

Pune

Work from Office

What We Expect
Data Engineering & Pipeline Management: Design, manage, monitor, and maintain data pipelines and ETL processes using PySpark, Azure Databricks, SQL Server Studio, and Azure Data Factory. Ensure data flows are reliable, scalable, and optimized for performance. Validate data processes through automated testing and monitoring.
Continuous Integration, Deployment & Reliability: Perform deployments using CI/CD pipelines in Azure DevOps, aligned with team practices. Ensure continuous availability, reliability, and scalability of connected systems. Support performance, uptime, disaster recovery, and capacity growth.
Operational Support & Collaboration: Serve as Level 3 support for digital...
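On the automated-testing point above, pipeline validation often boils down to a handful of assertions over the curated output that can run in a scheduled job or CI/CD step. The sketch below shows the idea in PySpark; the Delta path, column names, and rules are hypothetical, and the Delta reader assumes a Databricks-style cluster with Delta Lake available.

# Illustrative automated quality check for a curated table (all names hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline_validation").getOrCreate()

df = spark.read.format("delta").load("/mnt/curated/invoices")  # hypothetical path

# Fail the run if key quality rules are violated, so the scheduler or CI/CD can alert.
total = df.count()
null_keys = df.filter(F.col("invoice_id").isNull()).count()
duplicates = total - df.dropDuplicates(["invoice_id"]).count()

assert total > 0, "curated table is empty"
assert null_keys == 0, f"{null_keys} rows with null invoice_id"
assert duplicates == 0, f"{duplicates} duplicate invoice_id values"
print("validation passed:", total, "rows")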

Posted 2 months ago


5.0 - 10.0 years

8 - 18 Lacs

Pune

Hybrid

Join General Mills' Digital & Technology team shaping the future of data engineering for one of the world's most loved food companies! We're looking for experienced Data Engineers (5–8 yrs) skilled in SQL, Python, and Cloud Data Warehousing (GCP preferred) to drive enterprise-scale digital transformation.
Location: Pune (Kalyani Nagar)
Walk-In Drive: 1st November, Friday
Step 1: Complete a 60-min online technical test before Monday to qualify for the interview!
What You'll Work On: Design and build scalable data pipelines & warehouse solutions on Cloud (GCP). Integrate business data for actionable insights across global systems. Collaborate with global teams using Agile methods.
You Bring: St...
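For context on the GCP warehousing stack mentioned above, an incremental load into BigQuery can be expressed with the google-cloud-bigquery client. The snippet below is a generic, hypothetical example; the project, dataset, table, and query are placeholders and not from this posting.

# Hypothetical warehouse load step on GCP using the BigQuery client library.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Incrementally build a reporting table from a raw landing table.
sql = """
    SELECT order_date, region, SUM(amount) AS revenue
    FROM `example-project.raw.orders`
    WHERE order_date = @run_date
    GROUP BY order_date, region
"""
job_config = bigquery.QueryJobConfig(
    destination="example-project.reporting.daily_revenue",
    write_disposition="WRITE_APPEND",
    query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01")],
)
client.query(sql, job_config=job_config).result()  # wait for the load job to finish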

Posted 2 months ago


4.0 - 8.0 years

0 - 1 Lacs

Hyderabad

Work from Office

Job Title: Senior Data Engineer & Power BI Developer (AWS Glue, Redshift, Python, PySpark, Power BI)
Location: Hyderabad (On-site)
Experience: 4+ Years
Job Type: Full-Time
About the Role
We are seeking a Senior Data Engineer & Power BI Developer who possesses strong experience in both cloud-based data engineering and business intelligence (BI) development. The ideal candidate will be adept at designing and managing ETL pipelines using AWS technologies and transforming data into actionable insights through Power BI dashboards and reports. This role demands a professional who can work across data integration, modeling, visualization, and performance optimization, collaborating closely with te...

Posted 2 months ago


4.0 - 7.0 years

15 - 27 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 15 to 25 LPA
Exp: 4 to 7 years
Location: Gurgaon/Pune/Bengaluru
Notice: Immediate to 30 days
Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units. As a Dat...

Posted 2 months ago


7.0 - 12.0 years

15 - 27 Lacs

Ahmedabad

Work from Office

Job Title: Data Engineer
Shift: UK Shift (5 days working)
Shift Timing: 12:30 PM to 9:30 PM | 1:30 PM to 10:30 PM
Job Location: Ahmedabad (Work from Office)
Key Responsibilities: Design, build, and maintain scalable data pipelines and transformation processes using Microsoft Fabric components including Data Factory, OneLake, Dataflows, and Notebooks. Develop and manage data models and analytics solutions. Develop ETL processes and integrate data from diverse sources (cloud and on-premise) into centralised and governed environments using Fabric's capabilities. Ensure data quality, integrity, consistency, security, governance, and compliance with industry standards (Uniclass & SFG20) across all...

Posted 2 months ago


9.0 - 12.0 years

22 - 35 Lacs

Hyderabad

Hybrid

Capgemini is hiring a Data Engineer to design, build, and maintain data architecture, databases, and ETL processes for a leading U.S. insurance client. The role involves working with SQL/NoSQL databases, optimizing ETL pipelines, and collaborating with analysts and stakeholders to deliver reliable data solutions. Candidates must have strong experience in Informatica, Snowflake, and SQL, along with good communication and analytical skills.
Key Skills: AWS Glue, PySpark, AWS Lambda, Informatica, Snowflake, SQL

Posted 2 months ago


2.0 - 6.0 years

5 - 15 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office

Russell Investments is actively hiring a Data Platform Engineer for its Mumbai, Goregaon (E) location.
Years of Experience: 2+ years required, preferably within the financial services domain (investment bank, asset management, etc.).
Role & responsibilities: Strong proficiency in Snowflake and related cloud-based platforms like Fivetran, dbt, etc. Hands-on experience with SQL, Python, or other scripting languages. Expertise in data modelling, data pipeline design, ETL tools, and MDM platforms, preferably in the financial services/asset management domain. Architect and administer Snowflake data warehouses, including performance tuning, resource management, and security confi...

Posted 2 months ago
