4 - 8 years

0 - 18 Lacs

Posted: 2 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Microsoft Fabric

Key Responsibilities

  • Build data pipelines using PySpark on Databricks / Fabric Notebooks, ingesting data from APIs, files, and DBs into the Fabric Lakehouse / Azure Data Lake (see the ingestion sketch after this list).
  • Develop ELT/ETL workflows: transform and curate bronze → silver → gold layers; implement partitioning, Delta tables, and performance tuning (a curation sketch follows this list).
  • Data modeling & warehousing: design dimensional models (star/snowflake), views, and semantic layers for BI & analytics in Synapse / Fabric.
  • SQL development: write complex queries, stored procedures, and optimize execution plans; enforce coding standards.
  • Orchestration & CI/CD: schedule jobs with Fabric pipelines / Azure Data Factory; implement CI/CD for notebooks, SQL, and infrastructure (Git, YAML pipelines).
  • Data quality & governance: implement validation checks, lineage, cataloging, and security (RBAC, row-level/column-level security); a validation-check sketch follows this list.
  • Performance & cost optimization: tune clusters, caching, autoscaling, job concurrency; monitor and manage workspace costs.
  • Collaboration: partner with analysts, data scientists, and product teams to deliver reliable datasets and SLAs.
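
As a rough illustration of the pipeline-building responsibility, here is a minimal PySpark ingestion sketch, assuming a Databricks / Fabric notebook environment; the storage path and the bronze.orders table name are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    # Databricks / Fabric notebooks already provide `spark`; getOrCreate()
    # keeps the sketch runnable elsewhere too.
    spark = SparkSession.builder.getOrCreate()

    # Read raw files landed in the lake (path and header handling are illustrative).
    raw_orders = (
        spark.read
        .option("header", "true")
        .csv("abfss://landing@examplelake.dfs.core.windows.net/orders/")
    )

    # Stamp ingestion metadata before persisting to the bronze layer.
    bronze_orders = (
        raw_orders
        .withColumn("_ingested_at", F.current_timestamp())
        .withColumn("_source_file", F.input_file_name())
    )

    # Append into a Delta table registered in the Lakehouse.
    bronze_orders.write.format("delta").mode("append").saveAsTable("bronze.orders")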
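
For the ELT/ETL workflow item, a bronze → silver → gold curation sketch with partitioned Delta tables; the table names, columns, and partition key are assumptions made for illustration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Silver: clean and deduplicate bronze records.
    silver_orders = (
        spark.table("bronze.orders")
        .dropDuplicates(["order_id"])
        .filter(F.col("order_id").isNotNull())
        .withColumn("order_date", F.to_date("order_ts"))
    )

    (silver_orders.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")   # partitioning enables pruning on date filters
        .saveAsTable("silver.orders"))

    # Gold: business-level aggregate ready for BI consumption.
    gold_daily_revenue = (
        silver_orders
        .groupBy("order_date")
        .agg(F.sum("amount").alias("daily_revenue"),
             F.countDistinct("customer_id").alias("active_customers"))
    )

    gold_daily_revenue.write.format("delta").mode("overwrite").saveAsTable("gold.daily_revenue")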
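
For the data quality item, a lightweight validation-check sketch in plain PySpark (a library such as Great Expectations could fill the same role); the checks and column names are assumptions.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    df = spark.table("silver.orders")

    checks = {
        "order_id_not_null": df.filter(F.col("order_id").isNull()).count() == 0,
        "amount_non_negative": df.filter(F.col("amount") < 0).count() == 0,
        "has_rows": df.count() > 0,
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        # Failing fast lets the orchestrating pipeline (Fabric / ADF) mark the run as failed.
        raise ValueError(f"Data quality checks failed: {failed}")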

Must-Have Skills

  • Microsoft Fabric (Lakehouse, Data Engineering, Pipelines) or strong Azure Synapse experience with readiness to work in Fabric.
  • Databricks (Delta Lake, notebooks, cluster management).
  • PySpark (RDD/DataFrame APIs, UDFs, joins, windowing, checkpointing); a window-function sketch follows this list.
  • SQL (T-SQL/ANSI SQL, query optimization, indexing).
  • Azure data services: Azure Data Lake Storage, Key Vault, Azure DevOps (repos, pipelines).
  • Data modeling: dimensional modeling, data vault basics, semantic modeling for BI.
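
To give a concrete flavour of the PySpark skill set above, a small window-function sketch; the sample data and column names are made up for the example.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.getOrCreate()

    orders = spark.createDataFrame(
        [("c1", "2024-01-01", 120.0), ("c1", "2024-01-05", 80.0), ("c2", "2024-01-02", 200.0)],
        ["customer_id", "order_date", "amount"],
    )

    # Rank each customer's orders by recency and keep the latest one.
    w = Window.partitionBy("customer_id").orderBy(F.col("order_date").desc())

    latest_orders = (
        orders
        .withColumn("rn", F.row_number().over(w))
        .filter(F.col("rn") == 1)
        .drop("rn")
    )

    latest_orders.show()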

Nice-to-Have

  • Orchestration: Azure Data Factory, Airflow.
  • BI: Power BI (Direct Lake/Import).
  • Streaming: Structured Streaming, Event Hub, Kafka.
  • Automation/IaC: Terraform/Bicep for Azure resources.
  • Testing: Great Expectations, dbt tests, unit tests for PySpark (a unit-test sketch follows this list).
  • Security & Governance: Purview (catalog, lineage, classification).
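
For the testing item, a minimal pytest sketch for a PySpark transformation using a local SparkSession; the add_order_year helper is hypothetical and only illustrates the pattern.

    import pytest
    from pyspark.sql import SparkSession, functions as F


    def add_order_year(df):
        """Derive an order_year column from order_date."""
        return df.withColumn("order_year", F.year("order_date"))


    @pytest.fixture(scope="module")
    def spark():
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


    def test_add_order_year(spark):
        df = (spark.createDataFrame([("2024-03-15",)], ["order_date"])
                   .withColumn("order_date", F.to_date("order_date")))
        result = add_order_year(df).collect()[0]
        assert result["order_year"] == 2024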
