Senior Data Engineer

Experience: 5 - 8 years

Salary: 25 - 30 Lacs

Posted: 13 hours ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

Role & Responsibilities:

  • Design, build, and optimize modern data solutions for our mid-market and enterprise clients.
  • Transform raw data into trusted, analytics-ready assets that power dashboards, advanced analytics, and AI use cases, working primarily inside the Microsoft stack (Azure, Synapse, and Microsoft Fabric).
  • Develop scalable, well-documented ETL/ELT pipelines using T-SQL, Python, Azure Data Factory/Fabric Data Pipelines, and Databricks; implement best-practice patterns for performance, security, and cost control.
  • Design relational and lakehouse models; create Fabric OneLake shortcuts, medallion-style layers, and dimensional/semantic models for Power BI; see the illustrative sketch after this list.
  • Build automated data-quality checks, lineage, and observability metrics; contribute to CI/CD workflows in Azure DevOps or GitHub.
  • Gather requirements, demo iterative deliverables, document technical designs, and translate complex concepts to non-technical audiences.
  • Research new Fabric capabilities, share findings in internal communities of practice, and contribute to reusable accelerators.
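
To give a flavor of the medallion-style pipeline work described above, the sketch below shows a minimal bronze-to-silver transformation in a PySpark notebook. It is illustrative only; the table and column names (bronze.sales_orders, silver.sales_orders, order_id, order_date, amount) are hypothetical and not part of this posting.

```python
# Illustrative sketch only: a minimal bronze-to-silver transformation.
# All table and column names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read the raw (bronze) layer as ingested from the source system.
bronze_df = spark.read.table("bronze.sales_orders")

# Clean and conform: drop duplicates, enforce types, remove invalid rows.
silver_df = (
    bronze_df
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Write the curated (silver) layer as a Delta table for downstream semantic models.
silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.sales_orders")
```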

Qualifications:

  • Bachelor's degree required.
  • Minimum of 3 years delivering production data solutions, preferably in a consulting or client-facing role.
  • Strong SQL for data transformation and performance tuning.
  • Python for data wrangling, orchestration, or notebook-based development (PySpark), and hands-on ETL/ELT with at least one Microsoft service (ADF, Synapse Pipelines, Fabric Data Pipelines). Solid grasp of Azure fundamentals: storage, networking, security, and cost management.
  • Project experience with Microsoft Fabric (OneLake, Lakehouses, Data Pipelines, Notebooks, Warehouse, Power BI Direct Lake).
  • Familiarity with Databricks, Delta Lake, or comparable Lakehouse technologies, and exposure to DevOps (YAML pipelines, Terraform/Bicep) and test automation frameworks.
  • Experience writing unit tests for data pipelines and transformation logic, understanding of or experience working with metadata frameworks for data governance and lineage, and integrating SaaS/ERP sources (e.g., Dynamics 365, Workday, Costpoint, SAP, NetSuite, Sage Intacct, IFS); see the illustrative test sketch after this list.
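
The qualifications above mention unit tests for pipeline transformation logic; a minimal pytest-style sketch is shown below. The function clean_orders and its sample schema are hypothetical stand-ins, not part of this role's actual codebase.

```python
# Illustrative sketch only: a minimal pytest unit test for a hypothetical
# PySpark transformation function (clean_orders).
import pytest
from pyspark.sql import SparkSession, functions as F


def clean_orders(df):
    """Example transformation: deduplicate on order_id and drop null keys."""
    return df.dropDuplicates(["order_id"]).filter(F.col("order_id").isNotNull())


@pytest.fixture(scope="module")
def spark():
    # Local Spark session so the test runs without a cluster.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_clean_orders_removes_duplicates_and_nulls(spark):
    data = [("A1", 100.0), ("A1", 100.0), (None, 50.0)]
    df = spark.createDataFrame(data, ["order_id", "amount"])

    result = clean_orders(df)

    # One duplicate and one null-key row should be removed, leaving a single record.
    assert result.count() == 1
    assert result.first()["order_id"] == "A1"
```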

Moss Adams

Accounting & Consulting

Seattle
