Data Engineer - BigQuery, Spark SQL, Microsoft Fabric - 10+ Years - Immediate joiners only

10 years

Posted: 3 weeks ago | Platform: LinkedIn

Work Mode

Remote

Job Type

Full Time

Job Description

Role: Data Engineer

Work Mode: Remote


Mandatory Skills

BigQuery, Spark SQL, Microsoft Fabric


Profile Summary

Data Engineer with over 10 years of experience designing and building large-scale data platforms across cloud and on-prem ecosystems. Proven track record of developing high-performance ETL pipelines, optimizing data infrastructure, and enabling analytics teams through reliable, well-structured, and secure data systems. Deep expertise in Microsoft Fabric and GCP, with hands-on experience across data ingestion, transformation, modeling, and delivery. Comfortable working in fast-moving environments that require ownership, adaptability, and strong collaboration with analytics and product teams.


Core Skills

Microsoft Fabric

• Designed and implemented pipelines and dataflows (Gen2) to orchestrate data ingestion from multiple structured and semi-structured sources.

• Developed notebooks using Python and Spark SQL for data wrangling, feature engineering, and model-ready transformations (a brief sketch follows this list).

• Automated deployment of pipelines, datasets, and reports using Fabric and Azure DevOps.

• Defined security and access control strategies in Fabric, ensuring compliance with enterprise data governance standards.

• Built and optimized Direct Lake models for low-latency Power BI reporting.

• Developed advanced DAX measures and tabular models to support complex analytical dashboards.
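
As a flavor of the notebook work referenced above, the sketch below shows a typical Python and Spark SQL wrangling step of the kind run in a Fabric lakehouse notebook. Table and column names (raw_sales, gold_daily_sales, order_id, and so on) are illustrative assumptions, not details from this posting.

from pyspark.sql import SparkSession, functions as F

# In a Fabric notebook the Spark session is pre-provisioned; building one here
# only keeps the sketch self-contained. Table and column names are placeholders.
spark = SparkSession.builder.appName("fabric-wrangling-sketch").getOrCreate()

raw = spark.read.table("raw_sales")

# Cleansing and derived columns ahead of modeling.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
)
clean.createOrReplaceTempView("sales_clean")

# Spark SQL step producing a model-ready aggregate.
daily = spark.sql("""
    SELECT order_date,
           product_id,
           SUM(net_amount) AS net_revenue,
           COUNT(*)        AS order_count
    FROM sales_clean
    GROUP BY order_date, product_id
""")

# Persisted as a Delta table so a Direct Lake / Power BI model can read it.
daily.write.mode("overwrite").format("delta").saveAsTable("gold_daily_sales")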

Google Cloud Platform (GCP)

• Worked extensively with BigQuery, improving performance and reducing query costs through partitioning and clustering (see the BigQuery sketch after this list).

• Automated workflows using Cloud Functions to handle event-based data transformations and notifications.

• Integrated real-time streaming data through Pub/Sub, enabling near real-time analytics pipelines (see the streaming sketch after this list).
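
The partitioning and clustering mentioned above can be sketched with the google-cloud-bigquery client. The project, dataset, and schema names (my-project.analytics.events) are placeholders, and this is a minimal illustration rather than a description of the actual workloads.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # placeholder project id

# Define a day-partitioned, clustered events table (names are placeholders).
table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("event_name", "STRING"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["user_id", "event_name"]
client.create_table(table, exists_ok=True)

# Queries that filter on the partition column only scan matching partitions.
query = """
    SELECT event_name, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY event_name
"""
for row in client.query(query).result():
    print(row.event_name, row.events)

Filtering on the partitioning column (event_ts) is what lets BigQuery prune partitions, which is where the cost and performance gains come from; clustering on user_id and event_name further reduces the bytes scanned for selective filters.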
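
For the Pub/Sub integration, a minimal streaming sketch might look like the following, assuming each message carries one JSON event and the target BigQuery table already exists. Subscription and table names are hypothetical.

from concurrent.futures import TimeoutError
import json

from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"                      # placeholder project id
SUBSCRIPTION = "events-sub"                 # placeholder subscription name
TABLE_ID = "my-project.analytics.events"    # placeholder BigQuery table

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def callback(message):
    # Assumes each Pub/Sub message body is a single JSON event row.
    row = json.loads(message.data.decode("utf-8"))
    errors = bq.insert_rows_json(TABLE_ID, [row])   # streaming insert
    if errors:
        message.nack()   # let Pub/Sub redeliver on failure
    else:
        message.ack()

future = subscriber.subscribe(sub_path, callback=callback)
print(f"Listening for messages on {sub_path} ...")
try:
    future.result(timeout=60)   # run briefly for the sake of the sketch
except TimeoutError:
    future.cancel()
    future.result()             # block until shutdown completes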


Technical Skills

• Languages: Python, SQL, Spark SQL

• Tools: Microsoft Fabric, Power BI, Azure DevOps, Git, GCP Console, Dataflow, Dataproc

• Concepts: Data Modeling (Kimball, Data Vault), ETL/ELT, CI/CD, Data Governance, Security, Performance Optimization


Experience Highlights

• Led migration of on-prem ETL workflows to Microsoft Fabric.

• Built an end-to-end analytics solution combining BigQuery with Power BI.

• Designed a CI/CD pipeline in Fabric, automating dataset promotion from dev to production with integrated testing and validation.

• Partnered with data scientists to deliver feature stores and ML-ready datasets using Fabric Notebooks and Spark SQL.

• Implemented a Fabric security model to support row-level access and data masking across multiple business domains.

• Created a unified metadata catalog and data quality monitoring framework, improving data discoverability and trust (a minimal sketch follows below).
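
A data quality monitoring framework of the kind mentioned above could start from simple rule-based checks like the sketch below; the table, columns, and rules are illustrative assumptions.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

# Table and rule names are placeholders for illustration.
df = spark.read.table("gold_daily_sales")

checks = {
    "no_null_product_id": df.filter(F.col("product_id").isNull()).count() == 0,
    "no_negative_revenue": df.filter(F.col("net_revenue") < 0).count() == 0,
    "unique_grain": (
        df.groupBy("order_date", "product_id").count()
          .filter(F.col("count") > 1).count() == 0
    ),
}

for name, passed in checks.items():
    print("PASS" if passed else "FAIL", name)

# A production framework would persist these results to a monitoring table
# and raise alerts, rather than printing them.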


Education

• Bachelor’s degree in Computer Science (or related field)
