Data Engineer – Azure Databricks

5 - 10 years

4 - 8 Lacs

Posted: 1 week ago | Platform: GlassDoor


Work Mode: On-site

Job Type: Part Time

Job Description

Data Engineer – Azure Databricks
Experience: 5–10 Years
Role Overview
We are seeking an experienced Data Engineer specializing in Azure Databricks to design, build, and maintain scalable data pipelines and analytics platforms on Microsoft Azure. The role focuses on building production-grade data pipelines with a strong emphasis on performance, data quality, security, and governance.
Key Responsibilities
  • Design and implement end-to-end batch and incremental data pipelines using Azure Databricks (PySpark / Spark SQL)
  • Implement Medallion Architecture (Bronze/Silver/Gold)
  • Ingest data from REST APIs, ADLS Gen2, Azure SQL, SQL Server, and cloud file systems
  • Develop and optimize Databricks notebooks, workflows, and scheduled jobs
  • Implement Delta Lake features such as ACID transactions, schema evolution, and time travel
  • Build data quality checks, reconciliation logic, and rejected record quarantine patterns
  • Implement centralized logging, error handling, and monitoring
  • Manage security using Unity Catalog, Azure AD RBAC, and secret scopes
  • Support CI/CD pipelines using Azure DevOps or GitHub Actions
  • Collaborate with Data Architects, BI teams, and business stakeholders
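The rejected-record quarantine pattern named above can be sketched in plain Python. In Databricks this would be implemented with PySpark DataFrame filters writing valid and rejected rows to separate Delta tables; the rule names and record fields below are illustrative assumptions, not part of the role description:

```python
# Minimal sketch of a data-quality split with quarantine, assuming
# simple per-record predicate rules. On Databricks the same split is
# typically two DataFrame filters writing to separate Delta tables.

def split_valid_and_quarantined(records, rules):
    """Partition records into (valid, quarantined) by the given rules.

    rules: mapping of rule name -> predicate taking a record dict.
    Quarantined records are annotated with the names of failed rules
    so they can be reconciled or replayed later.
    """
    valid, quarantined = [], []
    for rec in records:
        failed = [name for name, check in rules.items() if not check(rec)]
        if failed:
            quarantined.append({**rec, "_failed_rules": failed})
        else:
            valid.append(rec)
    return valid, quarantined

# Hypothetical rules and records for illustration only.
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
records = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2.0},
]
valid, quarantined = split_valid_and_quarantined(records, rules)
```

Keeping the failed-rule names on each quarantined record is what makes reconciliation reports and replay-after-fix workflows possible downstream.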
Mandatory Skills (Must-Have – Shortlisting Criteria)
  • Azure Databricks – strong hands-on experience in production environments
  • Apache Spark using PySpark and Spark SQL
  • Delta Lake (MERGE, OPTIMIZE, VACUUM, schema evolution)
  • Azure Data Lake Storage (ADLS Gen2)
  • Strong SQL skills (complex joins, window functions, performance tuning)
  • Python programming for data engineering
  • Experience with data pipeline orchestration and scheduling
  • Understanding of data modeling for analytics
  • Experience handling large datasets and performance optimization
  • Delta Live Tables (DLT)
  • Databricks Asset Bundles (DAB)
  • REST API ingestion with OAuth2 authentication
  • Exposure to Unity Catalog data lineage and governance features
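The Delta Lake MERGE skill listed above refers to upserting a source batch into a target table (update on key match, insert otherwise). A hedged plain-Python sketch of those semantics, using dicts in place of Delta tables, since the real operation is `MERGE INTO` executed by Spark:

```python
# Illustration of Delta Lake MERGE (upsert) semantics on plain dicts.
# The actual operation would be Spark SQL along the lines of
# MERGE INTO target USING source ON target.id = source.id
# WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...

def merge_upsert(target, source, key="id"):
    """Emulate MERGE: update rows whose key matches, insert the rest."""
    by_key = {row[key]: dict(row) for row in target}
    for row in source:
        # Matched rows are updated in place; unmatched rows are inserted.
        by_key[row[key]] = {**by_key.get(row[key], {}), **row}
    return sorted(by_key.values(), key=lambda r: r[key])

# Hypothetical target table state and incoming batch.
target = [{"id": 1, "city": "Pune"}, {"id": 2, "city": "Delhi"}]
source = [{"id": 2, "city": "Mumbai"}, {"id": 3, "city": "Chennai"}]
merged = merge_upsert(target, source)
```

In production the same key-matching logic runs inside the Delta transaction log, which is what gives MERGE its ACID guarantees across concurrent writers.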
Optional Skills (Good-to-Have – Added Advantage)
  • Azure Data Factory (ADF)
  • Terraform / Infrastructure as Code for Databricks or Azure
  • Streaming technologies (Azure Event Hubs, Kafka, Structured Streaming)

  • Azure Monitor / Log Analytics
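REST API ingestion with OAuth2, listed among the mandatory skills, typically starts with a client-credentials token request. A minimal sketch using only the standard library, which builds the request without sending it; the endpoint, IDs, and scope below are placeholders, and in Databricks the secret would come from a secret scope rather than be hard-coded:

```python
import urllib.parse
import urllib.request

def build_token_request(token_url, client_id, client_secret, scope):
    """Build (but do not send) an OAuth2 client-credentials token request.

    All endpoint and credential values are placeholders; in Databricks
    they would be read from a secret scope, never hard-coded.
    """
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode("utf-8")
    return urllib.request.Request(
        token_url,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

# Placeholder values for illustration only.
req = build_token_request(
    "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
    client_id="<app-registration-id>",
    client_secret="<from-secret-scope>",
    scope="https://example.api/.default",
)
```

Sending `req` (e.g. via `urllib.request.urlopen`) would return a JSON body whose access token is then attached as a `Bearer` header on subsequent API calls in the ingestion notebook.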
