Lead Data Engineer (Snowflake+dbt+Airflow)

7 - 10 years

3 - 7 Lacs

Posted: 2 days ago | Platform: GlassDoor

Work Mode

On-site

Job Type

Part Time

Job Description

Location: Ahmedabad, Pune

Required Experience: 7 to 10 Years

Immediate joiners preferred


We are seeking a Lead Data Engineer with deep expertise in Snowflake, dbt, and Apache Airflow to design, implement, and optimize scalable data solutions. This role involves working on complex datasets, building robust data pipelines, ensuring data quality, and collaborating closely with analytics and business teams to deliver actionable insights.

If you are passionate about data architecture, ELT best practices, and modern cloud data stack, we’d like to meet you.

Key Responsibilities:

  • Pipeline Design & Orchestration: Build and maintain robust, scalable data pipelines using Apache Airflow, including incremental and full-load strategies, retries, and logging.
  • Data Modelling & Transformation: Develop modular, tested, and documented transformations in dbt, ensuring scalability and maintainability.
  • Snowflake Development: Design and maintain the warehouse in Snowflake; optimize schemas, implement performance tuning (clustering keys, warehouse scaling, materialized views), manage access control, and use streams and tasks for automation.
  • Data Quality & Monitoring: Implement validation frameworks (null checks, type checks, threshold alerts) and automated testing for data integrity and reliability.
  • Collaboration: Partner with data analysts, data scientists, and business stakeholders to translate reporting and analytics requirements into scalable technical solutions.
  • Performance Optimization: Tune query performance and job execution efficiency, with continuous monitoring of pipeline runs.
  • Infrastructure Automation: Use Terraform or similar IaC tools to provision and manage Snowflake, Airflow, and related environments.
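To give candidates a concrete sense of the Data Quality & Monitoring responsibility above, here is a minimal, stdlib-only Python sketch of the kind of validation framework described (null checks, type checks, threshold alerts with logging). Function and column names are hypothetical, not part of the role's actual codebase:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq_checks")

def check_not_null(rows, column):
    """Fail if any row has a NULL/None in the given column."""
    bad = sum(1 for r in rows if r.get(column) is None)
    return bad == 0, f"{bad} NULL values in '{column}'"

def check_type(rows, column, expected_type):
    """Fail if any non-null value has an unexpected type."""
    bad = sum(
        1 for r in rows
        if r.get(column) is not None and not isinstance(r[column], expected_type)
    )
    return bad == 0, f"{bad} values in '{column}' not of type {expected_type.__name__}"

def check_threshold(rows, column, max_null_ratio):
    """Alert if the NULL ratio of a column exceeds a tolerated threshold."""
    if not rows:
        return True, "no rows"
    ratio = sum(1 for r in rows if r.get(column) is None) / len(rows)
    return ratio <= max_null_ratio, f"NULL ratio {ratio:.2f} vs limit {max_null_ratio}"

def run_checks(rows, checks):
    """Run named checks, log each PASS/FAIL, and return overall pass/fail."""
    ok = True
    for name, fn in checks:
        passed, detail = fn(rows)
        level = log.info if passed else log.error
        level("%s: %s (%s)", name, "PASS" if passed else "FAIL", detail)
        ok = ok and passed
    return ok
```

In practice these checks would run as dbt tests or as a dedicated Airflow task downstream of each load, failing the run (or alerting) when a check does not pass.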

Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
  • 7–10 years of experience in data engineering, with strong hands-on expertise in:
    • Snowflake (data modelling, performance tuning, access control, streams & tasks, external tables)
    • Apache Airflow (DAG design, task dependencies, dynamic tasks, error handling)
    • dbt (modular SQL development, Jinja templating, testing, documentation)
  • Proficiency in SQL and Python (Spark experience is a plus).
  • Experience building and managing pipelines on AWS, GCP, or Azure.
  • Strong understanding of data warehousing concepts and ELT best practices.
  • Familiarity with version control (Git) and CI/CD.
  • Exposure to infrastructure-as-code tools like Terraform for provisioning Snowflake or Airflow environments.
  • Excellent problem-solving, collaboration, and communication skills, with the ability to lead technical projects.
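The "DAG design, task dependencies, error handling" expertise listed above can be illustrated without Airflow itself. The following stdlib-only sketch (hypothetical task names, using Python's `graphlib`) mirrors how Airflow executes tasks in dependency order and skips downstream tasks when an upstream one fails:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: extract -> validate -> transform -> load,
# written as node: {upstream dependencies}, analogous to Airflow's
# `extract >> validate >> transform >> load` dependency declarations.
DAG = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
}

def run_dag(dag, tasks):
    """Execute callables in dependency order; stop on the first failure."""
    executed = []
    for name in TopologicalSorter(dag).static_order():
        try:
            tasks[name]()
            executed.append(name)
        except Exception:
            break  # downstream tasks are skipped, as Airflow would skip them
    return executed
```

A real Airflow DAG adds scheduling, retries, and logging on top of this ordering; the sketch only shows the dependency-resolution core an interviewer might probe.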

Good To Have:

  • Experience with streaming data pipelines (Kafka, Kinesis, Pub/Sub).
  • Exposure to BI/analytics tools (Looker, Tableau, Power BI).
  • Knowledge of data governance and security best practices.

Perks:

  • Flexible Timings
  • 5-Day Work Week
  • Healthy Work Environment
  • Celebrations
  • Learning and Growth Opportunities
  • Community Building
  • Medical Insurance Benefit
