Lead Data Engineer - 10+ Yrs. - Python, AWS Glue, ETL, API - Immediate Joiners

10 - 20 years

20 - 35 Lacs

Posted: 2 weeks ago | Platform: Naukri

Work Mode

Hybrid

Job Type

Full Time

Job Description

Role & responsibilities

Skills - Lead Data Engineer, Python, AWS Glue, API, FastAPI, REST API, ETL, SQL

Location - Gurugram

Exp - 10+ Yrs.

Notice period - Immediate joiners only

Job Summary:

We are seeking an experienced Lead Data Engineer with strong expertise in Python, AWS cloud services, ETL pipelines, and system integrations. The ideal candidate will lead the design, development, and optimization of scalable data solutions and ensure seamless API and data integrations across systems. You will collaborate with cross-functional teams to implement robust DataOps and CI/CD pipelines.

Key Responsibilities:

Lead the implementation of scalable, secure, and high-performance data pipelines.

Design and develop ETL processes using AWS services (Lambda, S3, Glue, Step Functions, etc.), as illustrated in the sketch after this list.

Own and enhance API design and integrations for internal and external data systems.

Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions.

Drive DataOps practices for automation, monitoring, logging, testing, and continuous deployment.

Develop CI/CD pipelines for automated deployment of data solutions.

Conduct code reviews and mentor junior engineers in best practices for data engineering and cloud development.

Ensure compliance with data governance, security, and privacy policies.
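
To illustrate the kind of ETL work described above, here is a minimal sketch of an AWS Glue job in Python (PySpark). The database, table, and S3 bucket names are hypothetical placeholders, not details taken from this posting.

# A minimal sketch of a Glue ETL job of the kind described above.
# Database, table, and bucket names below are hypothetical placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Resolve the job name that Glue passes in at run time.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw records cataloged in the Glue Data Catalog (hypothetical names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",
    table_name="orders",
)

# Drop records missing a primary key before loading downstream.
cleaned = source.filter(f=lambda record: record["order_id"] is not None)

# Write the transformed data back to S3 as Parquet (hypothetical bucket).
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)

job.commit()

A job like this would typically be triggered on a schedule or orchestrated through Step Functions as part of the broader pipeline the role owns.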

Required Skills & Experience:

10+ years of experience in data engineering, software development, or related fields.

Strong programming skills in Python for building robust data applications.

Expert knowledge of AWS services, particularly Lambda, S3, Glue, CloudWatch, and Step Functions.

Proven experience designing and managing ETL pipelines for large-scale data processing.

Experience with API design, RESTful services, and API integration workflows (see the FastAPI sketch after this list).

Deep understanding of DataOps practices and principles.

Hands-on experience implementing CI/CD pipelines (e.g., using CodePipeline, Jenkins, GitHub Actions).

Familiarity with containerization tools like Docker and orchestration tools like ECS/EKS (optional but preferred).

Strong understanding of data modeling, data warehousing concepts, and performance optimization.
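
For the API design and integration skills above, the following is a minimal sketch of a FastAPI service exposing REST endpoints for pipeline metadata. The model fields, route paths, and in-memory store are hypothetical and only illustrate the style of API work involved.

# A minimal sketch of a FastAPI service for data-pipeline metadata.
# Model fields and the in-memory store are hypothetical placeholders.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Data Integration API")


class PipelineRun(BaseModel):
    pipeline_name: str
    status: str


# Hypothetical in-memory store standing in for a metadata database.
RUNS: dict[int, PipelineRun] = {}


@app.post("/runs/{run_id}", status_code=201)
def register_run(run_id: int, run: PipelineRun) -> PipelineRun:
    """Record a pipeline run so downstream systems can query its status."""
    RUNS[run_id] = run
    return run


@app.get("/runs/{run_id}")
def get_run(run_id: int) -> PipelineRun:
    """Return the status of a previously registered pipeline run."""
    if run_id not in RUNS:
        raise HTTPException(status_code=404, detail="Run not found")
    return RUNS[run_id]

A service like this would typically be run with uvicorn (for example, uvicorn example_module:app, where example_module is a placeholder) and deployed through the CI/CD pipelines mentioned above.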

Bounteous x Accolite

Digital Transformation, Technology Consulting
