Databricks Data Engineer

7 years

0 Lacs

Posted: 12 hours ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Job Title:

Location:

Job Type:

Company:

Experience Required:

Send Resume To:

Job Summary

Senior Data Engineer

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines using Azure Data Factory, Databricks, PySpark, and SQL.
  • Engineer data workflows integrating structured and unstructured data from Azure Data Lake (ADLS Gen2), Azure Synapse, SQL Server, and external APIs.
  • Implement data modeling and transformation logic to support analytics, reporting, and ML workloads.
  • Collaborate with data scientists and business analysts to understand requirements and deliver clean, reliable, production-ready datasets.
  • Optimize data storage and compute performance in Azure Databricks and Delta Lake environments.
  • Develop and maintain CI/CD pipelines for data workflows using Azure DevOps or GitHub Actions.
  • Monitor, troubleshoot, and resolve pipeline performance issues to ensure reliability and high availability.
  • Apply best practices in data governance, security, and compliance across workflows.
  • Lead end-to-end data solutioning in Azure with a focus on performance, reliability, and scalability.
  • Mentor junior engineers and provide technical leadership within the team.

Required Qualifications

  • Bachelor’s in Computer Science, Engineering, or related field (or equivalent work experience).
  • 7+ years of experience in Data Engineering, including:
      • Strong hands-on expertise in Databricks (Delta Lake, orchestration, notebooks).
      • In-depth experience with Azure Cloud Services (ADF, ADLS Gen2, Synapse, Key Vault).
      • High proficiency in Python, PySpark, and SQL.
      • Experience building lakehouse architectures and designing ETL/ELT pipelines.
      • Strong background in data modeling and transformation logic.
  • Excellent communication skills with the ability to convey technical concepts to non-technical stakeholders.

Preferred Skills & Technologies

  • Experience with Snowflake for scalable data warehousing.
  • Experience with data mesh architectures and distributed data ownership.
  • Familiarity with event-driven architectures: Azure Event Hubs, Kafka.
  • Exposure to Databricks Workflows, MLflow, Unity Catalog.
  • Hands-on experience with Power BI or other BI/visualization tools.
  • Knowledge of Terraform or other Infrastructure-as-Code (IaC) tools.

Certifications (Preferred)

  • Microsoft Certified: Azure Data Engineer Associate (DP-203)
  • Databricks Certified Data Engineer

To Apply:
