Data Engineer (Microsoft Fabric)

JobPe aggregates results for easy application access, but you apply directly on the job portal.

Experience: 2.0 - 5.0 years
Salary: 4 - 9 Lacs
Location: Noida
Work mode: Work from Office

We are seeking a skilled Data Engineer to design, build, and maintain high-performance data pipelines within the Microsoft Fabric ecosystem. The role involves transforming raw data into analytics-ready assets, optimising data performance across both modern and legacy platforms, and collaborating closely with Data Analysts to deliver reliable, business-ready gold tables. You will also coordinate with external vendors during build projects to ensure adherence to standards.

Key Responsibilities

Pipeline Development & Integration
- Design and develop end-to-end data pipelines using Microsoft Fabric (Data Factory, Synapse, Notebooks).
- Build robust ETL/ELT processes to ingest data from both modern and legacy sources.
- Create and optimise gold tables and semantic models in collaboration with Data Analysts.
- Implement real-time and batch processing with performance optimisation.
- Build automated data validation and quality checks across Fabric and legacy environments.
- Manage integrations with SQL Server (SSIS packages, cube processing).

Data Transformation & Performance Optimisation
- Transform raw datasets into analytics-ready gold tables following dimensional modelling principles.
- Implement complex business logic and calculations within Fabric pipelines.
- Create reusable data assets and standardised metrics with Data Analysts.
- Optimise query performance across Fabric compute engines and SQL Server.
- Implement incremental loading strategies for large datasets (a minimal sketch appears after this description).
- Maintain and improve performance across both Fabric and legacy environments.

Business Collaboration & Vendor Support
- Partner with Data Analysts and stakeholders to understand requirements and deliver gold tables.
- Provide technical guidance to vendors during data product development.
- Ensure vendor-built pipelines meet performance and integration standards.
- Collaborate on data model design for both ongoing reporting and new analytics use cases.
- Support legacy reporting systems including Excel, SSRS, and Power BI.
- Resolve data quality issues across internal and vendor-built solutions.

Quality Assurance & Monitoring
- Write unit and integration tests for data pipelines.
- Implement monitoring and alerting for data quality.
- Troubleshoot pipeline failures and data inconsistencies.
- Maintain documentation and operational runbooks.
- Support deployment and change management processes.

Required Skills & Experience

Essential
- 2+ years of data engineering experience with Microsoft Fabric and SQL Server environments.
- Strong SQL expertise for complex transformations in Fabric and SQL Server.
- Proficiency in Python or PySpark for data processing.
- Integration experience with SSIS, SSRS, and cube processing.
- Proven performance optimisation skills across Fabric and SQL Server.
- Experience coordinating with vendors on technical build projects.
- Strong collaboration skills with Data Analysts for gold table creation.

Preferred
- Microsoft Fabric or Azure certifications (DP-600, DP-203).
- Experience with Git and CI/CD for data pipelines.
- Familiarity with streaming technologies and real-time processing.
- Background in BI or analytics engineering.
- Experience with data quality tools and monitoring frameworks.
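
To illustrate the kind of incremental-loading and validation work the responsibilities describe, here is a minimal PySpark sketch of a watermark-based load into a gold table, of the sort that could run in a Fabric notebook. The table names (bronze.sales_orders, gold.fact_sales), the modified_at watermark column, and the order_id/amount checks are placeholder assumptions for illustration only, not details from this listing.

```python
# Minimal sketch: watermark-based incremental load into a gold Delta table,
# with a simple data-quality gate. All names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # Fabric notebooks usually provide `spark` already

SOURCE_TABLE = "bronze.sales_orders"   # hypothetical source table
GOLD_TABLE = "gold.fact_sales"         # hypothetical gold table
WATERMARK_COL = "modified_at"          # hypothetical change-tracking column

# 1. Find the high-water mark already loaded into the gold table (None on the first run).
if spark.catalog.tableExists(GOLD_TABLE):
    last_loaded = spark.table(GOLD_TABLE).agg(F.max(WATERMARK_COL)).first()[0]
else:
    last_loaded = None

# 2. Read only rows newer than the last load.
source = spark.table(SOURCE_TABLE)
if last_loaded is not None:
    source = source.filter(F.col(WATERMARK_COL) > F.lit(last_loaded))

# 3. Apply a simple quality gate: drop rows missing a business key or amount.
valid = source.filter(F.col("order_id").isNotNull() & F.col("amount").isNotNull())
rejected = source.count() - valid.count()
if rejected > 0:
    print(f"Warning: {rejected} rows failed validation and were excluded")

# 4. Append the validated increment to the gold table (Delta by default in Fabric Lakehouse).
valid.write.mode("append").saveAsTable(GOLD_TABLE)
```

In practice a production pipeline would likely persist the watermark separately, handle late-arriving updates with a MERGE rather than a plain append, and route rejected rows to a quarantine table, but the sketch shows the basic shape of the task.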

Posted date: not available
