5–7 years

3–6 Lacs

Posted: 1 hour ago | Platform: GlassDoor


Work Mode

On-site

Job Type

Part Time

Job Description

Position: Data Engineer Lead
Experience: 5–7 Years
Location: Udaipur / Gurugram
Work Mode: On-site (5 Days a Week)


About the Role


We are looking for an experienced Data Engineer Lead to design, build, and optimize scalable data platforms and pipelines. In this leadership role, you will take full ownership of data engineering initiatives, ensure high-quality data delivery, and work closely with cross-functional teams to drive data strategy and innovation across the organization.


You will lead end-to-end data engineering efforts—from ingestion and transformation to orchestration and infrastructure—ensuring reliability, performance, and scalability of the data ecosystem.


Key Responsibilities


  • Lead the design, development, and maintenance of scalable, secure, and automated data pipelines (batch and streaming).


  • Build, optimize, and manage data lakes / data warehouses on platforms such as Redshift, Snowflake, BigQuery, or Delta Lake.


  • Develop and deploy data workflows using Airflow, DBT, Spark, Kafka, and other modern data engineering tools.


  • Implement and optimize ETL/ELT processes, ensuring data quality, governance, lineage, and observability.


  • Work closely with analytics, product, and engineering teams to define data strategy, architecture, and KPIs.


  • Lead and mentor junior and mid-level data engineers, ensuring adherence to best practices and coding standards.


  • Build and maintain CI/CD pipelines for data engineering processes using tools like GitHub Actions, Jenkins, or similar.


  • Deploy and manage container environments using Docker and Kubernetes.


  • Ensure performance optimization of queries, storage, pipelines, and cloud infrastructure.


  • Drive innovation by evaluating new tools, technologies, and architectural improvements.
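Purely as an illustration of the batch-pipeline work listed above, the sketch below shows a minimal extract–transform–load step with a built-in data-quality gate. All names are hypothetical and only the Python standard library is used, standing in for the Airflow/Spark tooling the role actually involves:

```python
import json
from datetime import datetime

def extract(raw_records):
    """Parse raw JSON event strings into dicts (ingestion step)."""
    return [json.loads(r) for r in raw_records]

def transform(events):
    """Normalize timestamps and drop malformed rows (quality gate)."""
    clean = []
    for e in events:
        if "user_id" not in e or "ts" not in e:
            continue  # a production pipeline would log this for observability
        e["day"] = datetime.fromisoformat(e["ts"]).date().isoformat()
        clean.append(e)
    return clean

def load(events, sink):
    """Append validated events to a sink (stand-in for a warehouse table)."""
    sink.extend(events)
    return len(events)

# Tiny end-to-end run
raw = ['{"user_id": 1, "ts": "2024-01-15T10:00:00"}',
       '{"ts": "2024-01-15T11:00:00"}']  # missing user_id -> rejected
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded)  # 1 row survives the quality gate
```

In a real deployment each function would map to an orchestrated task (e.g. an Airflow operator), with the rejected-row count exported as a data-quality metric.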


Required Skills & Qualifications

Core Technical Skills


  • Strong expertise in data processing & workflow automation.


  • Expert-level proficiency in Python and SQL.


  • Hands-on experience with Cloud Platforms (AWS / GCP / Azure) for data engineering services.


  • Proven experience building and managing pipelines using Airflow, DBT, Spark, Kafka, etc.


  • Strong understanding of modern data warehousing technologies:


    • Redshift


    • Snowflake


    • BigQuery


    • Delta Lake


  • Experience with CI/CD tools (GitHub Actions, Jenkins, Bitbucket Pipelines, etc.).


  • Strong working knowledge of Docker, Kubernetes, and containerized deployments.
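As a hedged sketch of the Python + SQL pairing these skills describe, the snippet below runs a dbt-style in-warehouse transformation, with sqlite3 standing in for a real warehouse such as Snowflake or BigQuery (table and column names are made up):

```python
import sqlite3

# sqlite3 stands in for a cloud warehouse; an ELT job loads raw data
# first, then transforms it with SQL inside the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "paid"), (2, 80.0, "cancelled"), (3, 200.0, "paid")],
)

# Transform step: materialize a cleaned, aggregated model
conn.execute("""
    CREATE TABLE fct_revenue AS
    SELECT status, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM raw_orders
    WHERE status = 'paid'
    GROUP BY status
""")

row = conn.execute("SELECT orders, revenue FROM fct_revenue").fetchone()
print(row)  # (2, 320.0)
```

In practice the `CREATE TABLE ... AS SELECT` would live in a dbt model and be scheduled by an orchestrator, but the load-then-transform shape is the same.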


Preferred Skills


  • Knowledge of data governance, lineage, metadata management, and observability frameworks.


  • Experience with distributed systems and performance tuning.


  • Familiarity with version control (Git), DevOps practices, and cloud cost optimization.


Soft Skills


  • Strong leadership and mentoring capabilities.


  • Excellent communication and stakeholder management skills.


  • Analytical mindset with a strong problem-solving approach.


  • Ability to work in a fast-paced, data-driven environment.

