Databricks Developer / Engineer

5 - 10 years

20 - 35 Lacs

Posted: 10 hours ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

We are looking for a Databricks Developer/Engineer with strong data engineering skills and deep AWS cloud expertise to design, build, and manage data products. The role requires mastery of Databricks features, Delta Lake, AWS ecosystem integration, and modern Data Engineering best practices.

Primary Responsibilities

  • Design, build, and manage data products and event-driven architecture using Databricks Lakehouse principles.
  • Design end-to-end data pipelines using PySpark and Databricks SQL.
  • Define the Data Product blueprint, including domain boundaries, ownership, SLAs, documentation, and quality rules.
  • Implement modern ingestion frameworks using Auto Loader, Delta Live Tables, and Workflows (see the ingestion sketch after this list).
  • Develop a multi-zone medallion architecture (Bronze/Silver/Gold) using Delta Lake (see the medallion sketch after this list).
  • Lead Databricks on AWS integration, including S3 access, IAM roles, and VPC networking.
  • Implement Unity Catalog governance frameworks, fine-grained permissions, and lineage tracking (see the grants sketch after this list).
  • Drive automation and DevOps practices using Git, Repos, the Databricks CLI/SDK, and CI/CD pipelines.
  • Build and optimize Spark workloads for performance and cost control (Photon/Serverless tuning).
  • Lead performance reviews and code quality checks, and provide data engineering guidance to teams.
  • Operationalize machine learning and advanced analytics with MLflow, Feature Store, and Model Serving (see the MLflow sketch after this list).
  • Monitor and enhance reliability using job monitoring, observability dashboards, and alerts.
  • Hands-on development of Power BI dashboards, DAX measures, data models, and DirectQuery/Import modes.
  • Evaluate new Databricks features and adopt innovations into the technical roadmap.
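
As a reference for the ingestion responsibility above, here is a minimal PySpark sketch of an Auto Loader stream landing raw files in a Bronze Delta table. The S3 paths, schema location, and table name are illustrative assumptions, not details from this posting.

```python
# Minimal Auto Loader ingestion sketch (illustrative; paths and table names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

raw = (
    spark.readStream.format("cloudFiles")                      # Auto Loader source
    .option("cloudFiles.format", "json")                       # format of incoming files
    .option("cloudFiles.schemaLocation", "s3://example-bucket/_schemas/orders")
    .load("s3://example-bucket/landing/orders/")
)

(
    raw.writeStream.format("delta")
    .option("checkpointLocation", "s3://example-bucket/_checkpoints/orders_bronze")
    .trigger(availableNow=True)                                 # incremental, batch-style run
    .toTable("main.bronze.orders")                              # Bronze-layer Delta table
)
```

Scheduling this as a Databricks Workflows task, or rewriting it as a Delta Live Tables pipeline, is the usual next step covered by the same responsibility.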
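
For the medallion-architecture item, a compact sketch of Bronze -> Silver -> Gold refinement with Delta Lake; table and column names are again assumptions for illustration.

```python
# Illustrative Bronze -> Silver -> Gold refinement; table and column names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("main.bronze.orders")

# Silver: deduplicate, enforce types, and apply basic quality filters
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.orders")

# Gold: business-level aggregate, e.g. for Power BI consumption
gold = (
    silver.groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_revenue"), F.count("order_id").alias("order_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("main.gold.daily_orders")
```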
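
For the Unity Catalog governance item, a small sketch of the kind of grants a data engineer issues from a notebook; the catalog, schema, table, and group names are hypothetical.

```python
# Illustrative Unity Catalog grants issued via spark.sql (names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Allow a consumer group to discover and read a governed table
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_consumers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.silver TO `data_consumers`")
spark.sql("GRANT SELECT ON TABLE main.silver.orders TO `data_consumers`")
```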
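
For the MLflow item, a minimal experiment-tracking sketch; the experiment path, parameters, and scikit-learn model are placeholders used purely for illustration.

```python
# Minimal MLflow tracking sketch (placeholder experiment path, params, and model).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

mlflow.set_experiment("/Shared/demo_experiment")            # hypothetical experiment path
with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, artifact_path="model")  # can later be registered and served
```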

Technical Skills Required

  • Strong programming experience in PySpark, SQL, and Python for production-grade data pipelines.
  • Deep mastery of Databricks, Delta Lake, Unity Catalog, Workflows, and SQL Warehouses.
  • Expert knowledge of AWS services: S3, Glue, Lambda, EMR, Step Functions, CloudWatch, and VPC networking.
  • Experience building data products with versioning, discoverability, contracts, metadata, and lineage (see the contract-check sketch after this list).
  • Good understanding of Infrastructure-as-Code with Terraform (workspaces, clusters, UC policies, jobs, service principals).
  • Strong foundation in data modeling, schema design, and Lakehouse and Data Mesh principles.
  • Familiar with governance & security frameworks (encryption, tokenization, row/column controls).
  • Experience with enterprise integrations (Power BI).
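
For the data-product item above, a small sketch of the kind of contract check a published data product might run before consumers read it; the expected schema, quality rule, and table name are assumptions for illustration.

```python
# Illustrative data-contract check for a published data product (names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

EXPECTED_COLUMNS = {"order_id": "bigint", "order_ts": "timestamp", "amount": "double"}

df = spark.read.table("main.silver.orders")

# 1) Schema check: every contracted column must exist with the agreed type
actual = {f.name: f.dataType.simpleString() for f in df.schema.fields}
violations = {c: t for c, t in EXPECTED_COLUMNS.items() if actual.get(c) != t}
if violations:
    raise ValueError(f"Contract violation - missing or mistyped columns: {violations}")

# 2) Quality rule: primary key must never be null
null_keys = df.filter(F.col("order_id").isNull()).count()
if null_keys:
    raise ValueError(f"Contract violation - {null_keys} rows with null order_id")
```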

Qualifications

  • 5-10 years of professional Data Engineering experience.
  • Minimum 3 years of hands-on Databricks experience at production scale.
  • Proven experience delivering Data Products on cloud Lakehouse platforms.

Soft Skills

  • Strong ownership mindset with Data Product thinking.
  • Excellent communication and ability to translate architecture for business stakeholders.
  • Ability to lead engineering teams and influence architecture decisions.
  • Continuous innovation mindset with focus on automation, performance & reusability.

CGI

Information Technology and Consulting

Montreal
