661 Data Engineer Jobs - Page 22

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

15 - 27 Lacs

Mumbai, Pune, Bengaluru

Work from Office

Role & responsibilities: Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions such as AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend. Requirements: • Candidate must be experienced working in projects involving • Other ideal qualifications include experience in • Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions such as AWS EMR, Databricks, Cloudera, etc. • Should be very proficient in large-scale data operations using Databricks and overall very comfortable using Python • Familiarity...

Posted 5 months ago

5.0 - 10.0 years

0 Lacs

Hyderabad

Work from Office

Dear all, greetings of the day! We have an opening with one of the top MNCs. Experience: 5-15 years. Notice period: immediate to 45 days. Job description: Design, build, and maintain data pipelines (ETL/ELT) using BigQuery, Python, and SQL. Optimize data flow, automate processes, and scale infrastructure. Develop and manage workflows in Airflow/Cloud Composer and Ascend (or similar ETL tools). Implement data quality checks and testing strategies. Support CI/CD (DevSecOps) processes, conduct code reviews, and mentor junior engineers. Collaborate with QA/business teams and troubleshoot issues across environments. DBT for transformation. Collibra for data quality. Working with unstructured datasets. Stro...
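Editor's note: several of these listings ask for "data quality checks" in ETL pipelines. Purely as an illustration (not part of any posting), here is a minimal pure-Python sketch of the kind of row-level checks such a pipeline might run; all column names and the `run_quality_checks` helper are hypothetical:

```python
def run_quality_checks(rows, required_columns):
    """Run basic data-quality checks on a batch of dict-shaped rows.

    Returns a dict mapping check name -> list of failing row indices.
    (Illustrative only; real pipelines would run equivalent checks in
    the warehouse or a tool such as dbt tests or Collibra rules.)
    """
    failures = {"missing_column": [], "null_value": [], "negative_amount": []}
    for i, row in enumerate(rows):
        # Completeness: every required column must be present.
        if not all(col in row for col in required_columns):
            failures["missing_column"].append(i)
            continue
        # Validity: required columns must not be None/empty.
        if any(row[col] in (None, "") for col in required_columns):
            failures["null_value"].append(i)
        # Domain check: amounts, when numeric, must be non-negative.
        if isinstance(row.get("amount"), (int, float)) and row["amount"] < 0:
            failures["negative_amount"].append(i)
    return failures


batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},    # fails the domain check
    {"id": None, "amount": 3.0},  # fails the null check
    {"amount": 7.0},              # fails the completeness check
]
report = run_quality_checks(batch, required_columns=["id", "amount"])
```

The same three-way split (completeness, validity, domain rules) is how most listings here frame "testing strategies" for pipelines, whatever the tooling.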

Posted 5 months ago

7.0 - 12.0 years

25 - 40 Lacs

Pune

Work from Office

Role: Consultant/Sr. Consultant. Mandatory skills: GCP & Hadoop. Location: Pune. Budget: 7-10 years - up to 30 LPA; 10-12 years - up to 40 LPA (non-negotiable). Interested candidates can share resumes at Kashif@d2nsolutions.com.

Posted 5 months ago

7.0 - 12.0 years

25 - 40 Lacs

Pune

Work from Office

Experience as a Data Analyst with GCP & Hadoop is mandatory.

Posted 5 months ago

5.0 - 10.0 years

20 - 35 Lacs

Chennai

Work from Office

5+ Years of experience in ETL development with strong proficiency in Informatica BDM . Hands-on experience with big data platforms like Hadoop, Hive, HDFS, Spark . Proficiency in SQL and working knowledge of Unix/Linux shell scripting. Experience in performance tuning of ETL jobs in a big data environment. Familiarity with data modeling concepts and working with large datasets. Strong problem-solving skills and attention to detail. Experience with job scheduling tools (e.g., Autosys, Control-M) is a plus.

Posted 5 months ago

4.0 - 7.0 years

6 - 9 Lacs

Pune

Work from Office

Perydot is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the qua...

Posted 5 months ago

6.0 - 10.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & responsibilities: As a Senior Data Engineer, you will work to solve some of the organization's data management problems and enable it to operate as a data-driven organization; seamlessly switch between the roles of individual contributor, team member, and Data Modeling lead as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might: engage with clients and understand the business requirements to translate them into data models. Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights. Contribute to Data Modeling accelerators. Create and maintain the Source to Target Data...

Posted 5 months ago

8.0 - 12.0 years

20 - 35 Lacs

Kolkata, Pune, Chennai

Work from Office

Senior Data Engineer. Job description: - Demonstrate hands-on expertise in Ab Initio GDE, Metadata Hub, Co>Operating System & Control>Center. - Must demonstrate high proficiency in SQL. - Develop and implement solutions for metadata management and data quality assurance. - Able to identify, analyze, and resolve technical issues related to the Ab Initio solution. - Perform unit testing and ensure the quality of developed solutions. - Provide Level 3 support and troubleshoot issues with Ab Initio applications deployed in production. - Working knowledge of Azure Databricks & Python will be an advantage. - Any past experience working on the SAP HANA data layer would be good to have. Other traits - Pro...

Posted 5 months ago

5.0 - 7.0 years

15 - 25 Lacs

Pune, Bengaluru

Work from Office

Job role & responsibilities: Responsible for architecting, designing, building, and deploying data systems, pipelines, etc. Responsible for designing and implementing agile, scalable, and cost-efficient solutions on cloud data services. Responsible for design, implementation, development & migration: migrate data from traditional database systems to the cloud environment; architect and implement ETL and data movement solutions. Technical skills, qualification & experience required: 4.5-7 years of experience in Data Engineering, Azure Cloud Data Engineering, Azure Databricks, Data Factory, PySpark, SQL, Python. Hands-on experience in Azure Databricks, Data Factory, PySpark, SQL. Proficient in Cloud S...

Posted 5 months ago

4.0 - 9.0 years

3 - 8 Lacs

Pune

Work from Office

Design, develop, and maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into EDW systems and the data lake. Optimize and troubleshoot complex SQL queries and ETL jobs to ensure efficient data processing and high performance. Technologies: SQL, Informatica PowerCenter, Talend, Big Data, Hive.

Posted 5 months ago

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Remote

Lead Data Engineer with Health Care Domain. Role & responsibilities. Position: Lead Data Engineer. Experience: 7+ years. Location: Hyderabad | Chennai | Remote. SUMMARY: The Data Engineer will be responsible for ETL and documentation in building data warehouse and analytics capabilities. Additionally, maintain existing systems/processes and develop new features, along with reviewing, presenting, and implementing performance improvements. Duties and responsibilities: Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies. Monitoring active ETL jobs in production. Build out...

Posted 5 months ago

6.0 - 9.0 years

15 - 20 Lacs

Chennai

Work from Office

Skills required: Should have a minimum of 6+ years on Data Engineering / Data Analytics platforms. Should have a strong hands-on design and engineering background in AWS, across a wide range of AWS services, with the ability to demonstrate working on large engagements. Should be involved in requirements gathering and transforming them into functional and technical designs. Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources. Design, build, and maintain batch or real-time data pipelines in production. Develop ETL/ELT data pipeline (extract, transform, load) processes to help extract and manipulate d...

Posted 5 months ago

3.0 - 5.0 years

4 - 9 Lacs

Chennai

Work from Office

Skills required: Should have a minimum of 3+ years on Data Engineering / Data Analytics platforms. Should have a strong hands-on design and engineering background in AWS, across a wide range of AWS services, with the ability to demonstrate working on large engagements. Should be involved in requirements gathering and transforming them into functional and technical designs. Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources. Design, build, and maintain batch or real-time data pipelines in production. Develop ETL/ELT data pipeline (extract, transform, load) processes to help extract and manipulate d...

Posted 5 months ago

8.0 - 13.0 years

15 - 25 Lacs

Hyderabad, Bengaluru

Hybrid

Looking for a Snowflake developer for a US client. The candidate should be strong in Snowflake & DBT and should be able to do impact analysis on the current ETLs (Informatica/DataStage) and provide solutions based on that analysis. Experience: 7-12 years.

Posted 5 months ago

2.0 - 7.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Design, develop, and deploy ETL workflows and mappings using Informatica PowerCenter. Extract data from various source systems and transform/load it into target systems. Troubleshoot ETL job failures and resolve data issues promptly. Optimize and tune complex SQL queries. Required candidate profile: Maintain detailed documentation of ETL design, mapping logic, and processes. Ensure data quality and integrity through validation and testing. Experience with Informatica PowerCenter; strong SQL knowledge. Perks and benefits.

Posted 5 months ago

5.0 - 10.0 years

11 - 21 Lacs

Kochi, Bengaluru

Work from Office

Hiring for a Senior Python Developer (Data Engineering) with Mage.ai; Mage.ai experience is mandatory. Location: Remote/Bangalore/Kochi. Experience: 6+ years | Role type: Full-time. About the role: We are looking for a Senior Python Developer with strong data engineering expertise to help us build and optimize data workflows, manage large-scale pipelines, and enable efficient data operations across the organization. This role requires hands-on experience with Mage.AI, PySpark, and cloud-based data engineering workflows, and will play a critical part in our data infrastructure modernization efforts. Required skills & experience: 6+ years of hands-on Python development with strong data engineering fo...

Posted 5 months ago

5.0 - 10.0 years

22 - 35 Lacs

Kochi, Bengaluru

Hybrid

Greetings from Trinity! We are looking for a Python Developer with proficiency in Mage.AI. Senior Python Developer, Data Engineering focus. Location: Bangalore/Kochi/Remote. Budget: 15-25 LPA for 5-7 years and 24-32 LPA for 7-9 years. Mode of hiring: FTE. About the role: We are looking for a Senior Python Developer with strong data engineering expertise to help us build and optimize data workflows, manage large-scale pipelines, and enable efficient data operations across the organization. This role requires hands-on experience with Mage.AI and PySpark. Required skills & experience: 6+ years of hands-on Python development with strong data engineering focus. Deep experience in Mage.AI for building and m...

Posted 5 months ago

6.0 - 11.0 years

5 - 15 Lacs

Hyderabad

Hybrid

Dear candidates, we are conducting a face-to-face drive on 7th June 2025. Anyone interested in the F2F drive, kindly share your updated resume ASAP. Here are the JD details: Role: Data Engineer with Python, Apache Spark, HDFS. Experience: 6 to 12 years. Location: Hyderabad. Shift timings: general shift. Job overview / key responsibilities: • Design, develop, and maintain scalable data pipelines using Python and Spark. • Ingest, process, and transform large datasets from various sources into usable formats. • Manage and optimize data storage using HDFS and MongoDB. • Ensure high availability and performance of data infrastructure. • Implement data quality checks, validations, and monitoring processes. • ...
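Editor's note: the ingest/transform/validate responsibilities listed above can be sketched, purely for illustration, as a small staged batch pipeline. A real implementation at the scale this posting describes would use PySpark over HDFS, but the staged structure is the same; all function and field names here are hypothetical:

```python
def ingest(raw_lines):
    """Parse raw CSV-like lines into records (ingestion stage)."""
    records = []
    for line in raw_lines:
        name, value = line.strip().split(",")
        records.append({"name": name, "value": value})
    return records


def transform(records):
    """Normalize fields into a usable format (transform stage)."""
    return [
        {"name": r["name"].lower(), "value": int(r["value"])}
        for r in records
    ]


def validate(records):
    """Quality gate: drop rows with non-positive values (validation stage)."""
    return [r for r in records if r["value"] > 0]


raw = ["Alice,10", "BOB,-3", "Carol,7"]
pipeline_output = validate(transform(ingest(raw)))
```

In PySpark the same stages would become DataFrame reads, column transformations, and filters, but keeping each stage a separate, testable function is the pattern most such roles expect.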

Posted 5 months ago

3.0 - 8.0 years

8 - 16 Lacs

Bengaluru

Work from Office

Role & responsibilities / qualifications: Experience: 3-6 years. Education: B.E/B.Tech/MCA/M.Tech. Minimum qualifications: • Bachelor's degree in Computer Science, CIS, or a related field (or equivalent work experience in a related field) • 3 years of experience in software development or a related field • 2 years of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC). You will be responsible for designing, building, and maintaining our data infrastructure, ensuring data quality, and enabling data-driven decision-making across the organization. The ideal candidate will have a strong background in data engineering, excellent problem-solvin...

Posted 5 months ago

8.0 - 13.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Job title: Data Engineer (Java + Hadoop/Spark). Location: Bangalore, WFO. Type: Full time. Experience: 8-12 years. Notice period: immediate joiners to 30 days. Virtual drive on 1st June '25. Job description: We are looking for a skilled Data Engineer with strong expertise in Java and hands-on experience with Hadoop or Spark. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and processing systems. Key responsibilities: • Develop and maintain data pipelines using Java. • Work with big data technologies such as Hadoop or Spark to process large datasets. • Optimize data workflows and ensure high performance and reliability. • Collaborate with data ...

Posted 5 months ago

3.0 - 5.0 years

10 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Data pipelines: Proven experience in building scalable and reliable data pipelines. BigQuery: Expertise in writing complex SQL transformations; hands-on with indexing and performance optimization. Ingestion: Skilled in data scraping and ingestion through RESTful APIs and file-based sources. Orchestration: Familiarity with orchestration tools like Prefect and Apache Airflow (nice to have). Tech stack: Proficient in Python, FastAPI, and PostgreSQL. End-to-end workflows: Capable of owning ingestion, transformation, and delivery processes. Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad.

Posted 5 months ago

5.0 - 10.0 years

30 - 35 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Good hands-on experience working as a GCP Data Engineer, with very strong experience in SQL and PySpark, as well as BigQuery, Dataform, Dataplex, etc. Looking only for immediate joiners or candidates currently serving their notice period.

Posted 5 months ago

2.0 - 7.0 years

40 - 45 Lacs

Chandigarh

Work from Office

Responsibilities: Design and Develop complex data processes in coordination with business stakeholders to solve critical financial and operational processes. Design and Develop ETL/ELT pipelines against traditional databases and distributed systems and to flexibly produce data back to the business and analytics teams for analysis. Work in an agile, fail fast environment directly with business stakeholders and analysts, while recognising data reconciliation and validation requirements. Develop data solutions in coordination with development teams across a variety of products and technologies. Build processes that analyse and monitor data to help maintain controls - correctness, completeness a...

Posted 5 months ago

2.0 - 7.0 years

3 - 7 Lacs

Thane, Navi Mumbai, Mumbai (All Areas)

Work from Office

Job Title: Data Analyst/Engineer Location: Mumbai Experience: 3-4 Years Job Summary: We are seeking a skilled Data Analyst/Engineer with expertise in AWS S3 and Python to manage and process large datasets in a cloud environment. The ideal candidate will be responsible for developing efficient data pipelines, managing data storage, and optimizing data workflows in AWS. Your role will involve using your Python skills to automate data tasks. Key Responsibilities: Python Scripting and Automation: • Develop Python scripts for automating data collection, transformation, and loading into cloud storage systems. • Create robust ETL pipelines to move data between systems and perform data transformatio...
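Editor's note: as a generic illustration of the "Python scripting and automation" portion of this kind of role (not taken from the posting), here is a sketch of a single automated CSV transform step. In practice the read/write sides would target AWS S3 objects via boto3 (`get_object`/`put_object`); plain strings stand in for them here so the sketch stays self-contained, and `transform_csv` and its column names are hypothetical:

```python
import csv
import io


def transform_csv(raw_csv, column, factor):
    """Read CSV text, scale one numeric column, return new CSV text.

    In a real pipeline the input and output would be S3 objects;
    in-memory strings are used here to keep the example runnable.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    rows = []
    for row in reader:
        # Scale the target column; assumes it parses as a float.
        row[column] = str(float(row[column]) * factor)
        rows.append(row)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()


source = "id,price\n1,10.0\n2,2.5\n"
result = transform_csv(source, column="price", factor=2)
```

Wrapping each transform as a pure function of text in, text out is what makes such scripts easy to schedule and test before pointing them at cloud storage.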

Posted 5 months ago

3.0 - 5.0 years

0 - 0 Lacs

Hyderabad, Pune, Bangalore Rural

Work from Office

Role & responsibilities A day in the life of an Infoscion As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adhe...

Posted 5 months ago
