7–12 years
14–27 Lacs
Ahmedabad, Gujarat, India
On-site
Job Overview:
We are seeking a Lead Data Engineer with deep expertise in Snowflake, dbt, and Apache Airflow to design, implement, and optimize scalable data solutions. This role involves working on complex datasets, building robust data pipelines, ensuring data quality, and collaborating closely with analytics and business teams to deliver actionable insights. If you are passionate about data architecture, ELT best practices, and the modern cloud data stack, we'd like to meet you.

Key Responsibilities:
- Pipeline Design & Orchestration: Build and maintain robust, scalable data pipelines using Apache Airflow, including incremental and full-load strategies, retries, and logging.
- Data Modelling & Transformation: Develop modular, tested, and documented transformations in dbt, ensuring scalability and maintainability.
- Snowflake Development: Design and maintain the warehouse in Snowflake; optimize schemas; implement performance tuning (clustering keys, warehouse scaling, materialized views); manage access control; and use streams and tasks for automation.
- Data Quality & Monitoring: Implement validation frameworks (null checks, type checks, threshold alerts) and automated testing for data integrity and reliability.
- Collaboration: Partner with data analysts, data scientists, and business stakeholders to translate reporting and analytics requirements into scalable technical solutions and specifications.
- Performance Optimization: Develop incremental and full-load strategies with continuous monitoring, retries, and logging, and tune query performance and job execution efficiency.
- Infrastructure Automation: Use Terraform or similar IaC tools to provision and manage Snowflake, Airflow, and related environments.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- 7–10 years of experience in data engineering, with strong hands-on expertise in:
  - Snowflake (data modelling, performance tuning, access control, streams and tasks, external tables)
  - Apache Airflow (DAG design, task dependencies, dynamic tasks, error handling)
  - dbt (modular SQL development, Jinja templating, testing, documentation)
- Proficiency in SQL and Python (Spark experience is a plus).
- Experience building and managing pipelines on AWS, GCP, or Azure.
- Strong understanding of data warehousing concepts and ELT best practices.
- Familiarity with version control (Git) and CI/CD workflows.
- Exposure to infrastructure-as-code tools such as Terraform for provisioning Snowflake or Airflow environments.
- Excellent problem-solving, collaboration, and communication skills, with the ability to lead technical projects.

Good to have:
- Experience with streaming data pipelines (Kafka, Kinesis, Pub/Sub).
- Exposure to BI/analytics tools (Looker, Tableau, Power BI).
- Knowledge of data governance and security best practices.
Posted 6 days ago