Snowflake - Lead Data Engineer

8 - 12 years

Posted: 3 days ago | Platform: Foundit

Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities

  • Lead the design, development, and optimization of data pipelines and data warehouse solutions on Snowflake.
  • Snowflake: table types, storage integrations, internal and external stages, streams, tasks, views, materialized views, Time Travel, Fail-safe, micro-partitions, warehouses, RBAC, the COPY command, file formats (CSV, JSON, and XML), Snowpipe, and stored procedures (SQL, JavaScript, or Python); a minimal ingestion sketch appears after this list.
  • Develop and maintain dbt models for data transformation, testing, and documentation.
  • dbt: creating, running, and building models; scheduling; running dependent models; macros; Jinja templating (optional); see the dbt CLI sketch after this list.
  • Collaborate with cross-functional teams including data architects, analysts, and business stakeholders to deliver robust data solutions.
  • Ensure high standards of data quality, governance, and security across pipelines and platforms.
  • Leverage Airflow (or other orchestration tools) to schedule and monitor workflows; a sample DAG sketch appears after this list.
  • Integrate data from multiple sources using tools such as Fivetran, Qlik Replicate, or IDMC (at least one).
  • Provide technical leadership, mentoring, and guidance to junior engineers in the team.
  • Optimize costs, performance, and scalability of cloud-based data environments.
  • Contribute to architectural decisions, code reviews, and best practices.
  • CI/CD: Bitbucket or GitHub (at least one).
  • Data modeling: entity-based models (sub-dimensions, dimensions, facts) and Data Vault (hubs, links, satellites); a DDL sketch follows this list.
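A minimal sketch of the Snowflake ingestion pattern named in the Snowflake bullet above, using the snowflake-connector-python package: a CSV file format, an external stage over a storage integration, a COPY load, and a stream/task pair. Every object name and connection setting here (my_db, orders, my_s3_int, load_wh, and so on) is a hypothetical placeholder, not something prescribed by the role.

```python
# Sketch only: assumes snowflake-connector-python is installed, the storage
# integration my_s3_int exists, and the tables orders / orders_clean exist.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="load_wh", database="my_db", schema="raw",
)

statements = [
    # CSV file format with a header row to skip.
    """CREATE FILE FORMAT IF NOT EXISTS csv_fmt
       TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1""",
    # External stage pointing at S3 through a storage integration.
    """CREATE STAGE IF NOT EXISTS orders_stage
       URL = 's3://my-bucket/orders/'
       STORAGE_INTEGRATION = my_s3_int
       FILE_FORMAT = (FORMAT_NAME = 'csv_fmt')""",
    # Bulk-load staged files into the raw table.
    "COPY INTO orders FROM @orders_stage",
    # Stream to capture new rows on the raw table ...
    "CREATE STREAM IF NOT EXISTS orders_stream ON TABLE orders",
    # ... and a task that folds them into a clean table every 5 minutes.
    # (A new task is created suspended; resume it with ALTER TASK ... RESUME.)
    """CREATE TASK IF NOT EXISTS merge_orders
       WAREHOUSE = load_wh
       SCHEDULE = '5 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
       AS INSERT INTO orders_clean SELECT * FROM orders_stream""",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```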
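For the dbt bullet, a sketch of the run/test/build workflow driven through the dbt CLI from Python. The model name stg_orders is a hypothetical placeholder, and the "+" suffix on a selector picks the model plus its downstream dependents.

```python
# Sketch only: assumes a dbt project on disk and the dbt CLI on PATH.
import subprocess

def dbt(*args: str) -> None:
    """Run a dbt CLI command, raising if it exits non-zero."""
    subprocess.run(["dbt", *args], check=True)

# Run one model plus every model that depends on it ("+" selector).
dbt("run", "--select", "stg_orders+")
# Execute the tests declared for those models in the project's YAML.
dbt("test", "--select", "stg_orders+")
# Or run models, tests, seeds, and snapshots together in dependency order.
dbt("build", "--select", "stg_orders+")
```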
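A minimal Airflow DAG sketch for the orchestration bullet: a daily pipeline in which the dbt build runs only after the raw load succeeds. The dag_id, commands, and paths are assumptions for illustration.

```python
# Sketch only, assuming Airflow 2.x; task commands and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Land raw files in Snowflake (e.g. the COPY sketch above).
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_to_snowflake.py",
    )
    # Transform with dbt once the load has finished.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt",
    )
    load_raw >> dbt_build  # dependency: dbt waits for the raw load
```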
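Finally, a short DDL sketch of the Data Vault shapes from the data-model bullet: the hub carries the business key, the satellite carries descriptive attributes over time, and the link records a relationship between hubs. All table and column names are illustrative; the statements could be executed with the connector shown in the first sketch.

```python
# Sketch only: illustrative Data Vault DDL (hub / satellite / link).
DATA_VAULT_DDL = [
    # Hub: one row per business key, hashed for joins.
    """CREATE TABLE IF NOT EXISTS hub_customer (
         customer_hk   BINARY(16)    PRIMARY KEY,  -- hash of business key
         customer_id   VARCHAR       NOT NULL,     -- source business key
         load_ts       TIMESTAMP_NTZ NOT NULL,
         record_source VARCHAR       NOT NULL
       )""",
    # Satellite: descriptive attributes, versioned by load timestamp.
    """CREATE TABLE IF NOT EXISTS sat_customer (
         customer_hk   BINARY(16)    NOT NULL,     -- points at hub_customer
         load_ts       TIMESTAMP_NTZ NOT NULL,
         name          VARCHAR,
         email         VARCHAR,
         hash_diff     BINARY(16),                 -- change detection
         PRIMARY KEY (customer_hk, load_ts)
       )""",
    # Link: relationship between two hubs (customer placed order).
    """CREATE TABLE IF NOT EXISTS link_customer_order (
         link_hk       BINARY(16)    PRIMARY KEY,  -- hash of both parent keys
         customer_hk   BINARY(16)    NOT NULL,
         order_hk      BINARY(16)    NOT NULL,
         load_ts       TIMESTAMP_NTZ NOT NULL,
         record_source VARCHAR       NOT NULL
       )""",
]
```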

Required Skills & Experience

  • 8-12 years of overall experience in Data Engineering, with at least 3-4 years in a lead role.
  • Strong hands-on expertise in Snowflake (data modeling, performance tuning, query optimization, security, and cost management).
  • Proficiency in dbt (core concepts, macros, testing, documentation, and deployment).
  • Solid programming skills in Python (for data processing, automation, and integrations).
  • Experience with workflow orchestration tools such as Apache Airflow.
  • Exposure to ELT/ETL tools.
  • Strong understanding of modern data warehouse architectures, data governance, and cloud-native environments.
  • Excellent problem-solving, communication, and leadership skills.

Good to Have

  • Hands-on experience with Databricks (PySpark, Delta Lake, MLflow); a brief PySpark sketch follows this section.
  • Exposure to other cloud platforms (AWS, Azure, or GCP).
  • Experience in building CI/CD pipelines for data workflows.
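
For the Databricks item, a brief PySpark sketch: read raw CSV, apply a simple transformation, and write a Delta table. The paths and table name are hypothetical, and Delta Lake support is assumed (Databricks clusters provide it by default).

```python
# Sketch only: paths and table names are placeholders; assumes Delta Lake
# support in the Spark session, as on a Databricks cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_to_delta").getOrCreate()

orders = (
    spark.read.option("header", True).csv("/mnt/raw/orders/")
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)   # drop refunds/bad rows for illustration
)

# Overwrite a managed Delta table that downstream tools can query.
orders.write.format("delta").mode("overwrite").saveAsTable("analytics.orders")
```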
