Senior Data Engineer

7 - 11 years

10 - 20 Lacs

Posted: 7 hours ago | Platform: Naukri


Work Mode

Remote

Job Type

Full Time

Job Description

Responsibilities

Assessment & Planning

  • Analyze the client's current GCP setup (BigQuery datasets, Airflow DAGs, lineage).
  • Design a migration roadmap to Azure Data Lake, Snowflake, and Fabric.

Data Migration & Engineering

  • Migrate data from BigQuery into Snowflake and/or Azure Data Lake.
  • Develop scalable ETL/ELT pipelines leveraging Spark, ADF, and Fabric Data Factory.
  • Standardize schemas, enforce governance/security, and ensure high data quality.
  • Consolidate data into a single Snowflake database for simplified integration.

Pipeline Modernization

  • Re-engineer Airflow DAGs into ADF pipelines and migrate those into Fabric pipelines where applicable.
  • Build and optimize Spark scripts for transformations within Azure Notebooks, Fabric, or Snowflake.
  • Implement CI/CD workflows for pipeline deployment using GitHub or Azure DevOps.

Fabric Enablement

  • Configure and optimize Microsoft Fabric environments for pipeline execution.
  • Ensure seamless Fabric-Snowflake integration for data workflows.
  • Assist with migration of existing ADF pipelines into Fabric.

Snowflake & Performance Optimization

  • Configure Snowflake warehouses, security roles, and RBAC.
  • Tune query and warehouse performance for scalability and cost efficiency.
  • Monitor and enforce governance policies.

BI & Reporting Integration

  • Collaborate with BI teams to integrate Power BI with Snowflake/Fabric backends.
  • Validate existing reports and dashboards post-migration.
  • Ensure report performance, security, and reliability align with client expectations.

Testing, Documentation & Knowledge Transfer

  • Conduct unit, integration, and UAT testing for pipelines and data flows.
  • Deliver architecture diagrams, playbooks, and runbooks.
  • Lead knowledge transfer and training sessions for client technical teams.

Required Skills & Experience

  • Strong experience in data engineering within GCP and Azure environments.
  • Hands-on expertise with:
    • BigQuery (datasets, SQL, exports)
    • Airflow (DAGs, orchestration)
    • Azure Data Lake & ADF (data ingestion, pipeline design)
    • Microsoft Fabric (Data Factory pipelines, Snowflake integration, notebooks)
    • Snowflake (database design, optimization, governance)
    • Spark (script development, transformations, optimization)
  • Skilled in SQL and Python development.
  • Experience migrating pipelines and workloads across cloud platforms.
  • Familiarity with CI/CD tools (GitHub, Azure DevOps).
  • Strong collaboration experience with BI teams (especially Power BI).

Nice to Have

  • Direct experience migrating ADF pipelines into Fabric.
  • Strong knowledge of Power BI and DAX.
  • Prior consulting or client-facing experience in large-scale data migrations.

Soft Skills

  • Strong problem-solving and critical thinking skills.
  • Clear communicator able to collaborate across engineering, BI, and business stakeholders.
  • Comfortable navigating ambiguity and evolving technical landscapes.
  • Detail-oriented with a proactive approach to risk and dependency management.
