Posted: 5 days ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Databricks Developer | 100% Remote | Long Term Contract

We are seeking a highly skilled Databricks Developer. The ideal candidate will have hands-on experience with Slowly Changing Dimensions (SCD2), data deduplication strategies, and cross-environment data sharing within Databricks. This is a critical, client-facing position.

Key Responsibilities

  • Design, develop, and maintain end-to-end ETL/ELT pipelines using Databricks (PySpark, Delta Lake, DLT, and Autoloader).
  • Implement Slowly Changing Dimensions (SCD2) and other complex data modeling techniques to manage historical and incremental data efficiently.
  • Develop and apply deduplication frameworks to ensure clean, consistent, and accurate datasets across multiple layers.
  • Enable cross-environment data sharing within Databricks workspaces while ensuring data security and governance compliance.
  • Optimize data ingestion, transformation, and storage processes for high-volume and real-time environments.
  • Collaborate with data architects and business analysts to translate requirements into scalable data models and pipelines.
  • Implement and maintain Delta Live Tables (DLT), Autoloader, and Vacuum for incremental ingestion, metadata management, and storage optimization.
  • Troubleshoot performance issues, identify bottlenecks, and propose improvements in pipeline efficiency and data flow.
  • Work closely with cross-functional teams (Data Engineering, Analytics, Cloud, and Product) to deliver data solutions aligned with business outcomes.
  • Develop CI/CD workflows for Databricks jobs using Git-based versioning, automation, and DevOps best practices.
  • Prepare and maintain technical documentation, ensuring smooth handover and project continuity.
  • Participate in client discussions, design reviews, and technical demos to represent the data engineering practice confidently.
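For candidates preparing for this role: the SCD2 responsibility above is typically implemented in Databricks with a Delta Lake MERGE, but the core close-and-insert logic can be sketched in plain Python without a Spark cluster. The record layout (`effective_from`, `effective_to`, `is_current`) is a common convention, not something specified in this posting:

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key, tracked, today=None):
    """SCD Type 2: expire the current version of any changed row,
    then insert the new version with fresh validity dates.

    dim_rows: existing dimension rows, each a dict carrying the
              business key, tracked attributes, and SCD2 metadata.
    incoming: new source rows (business key + tracked attributes).
    """
    today = today or date.today().isoformat()
    # Index the currently active version of each business key.
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for rec in incoming:
        old = current.get(rec[key])
        if old and all(old[c] == rec[c] for c in tracked):
            continue  # unchanged: keep the current version as-is
        if old:
            old["effective_to"] = today  # close out the superseded version
            old["is_current"] = False
        out.append({**rec, "effective_from": today,
                    "effective_to": None, "is_current": True})
    return out
```

In Databricks the same pattern runs as a `MERGE INTO` on a Delta table (update the matched current row, insert the new version), with the loop replaced by set-based matching on the business key.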

Required Skills and Expertise

  • Strong hands-on experience in Databricks (Workspace, Delta Lake, PySpark, SQL, and DLT).
  • Deep understanding of SCD2 implementation, deduplication techniques, and incremental data processing.
  • Experience with Autoloader, Vacuum, and data lakehouse optimization strategies.
  • Proficiency in Apache Spark, Python, and SQL for large-scale data transformations.
  • Knowledge of Delta Live Tables and Data Quality (Expectations) within Databricks pipelines.
  • Experience with AWS/Azure/GCP Databricks environments, including data ingestion from multiple cloud sources.
  • Familiarity with data governance, lineage, and cataloging (Unity Catalog preferred).
  • Strong knowledge of ETL performance tuning and distributed data architecture.
  • Understanding of data warehousing concepts, dimensional modeling, and schema design.
  • Excellent problem-solving, communication, and client-facing skills, with the ability to lead technical discussions.
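The deduplication skill listed above is usually exercised in Spark as a `row_number()` over a window partitioned by the business key, keeping the latest record per key. A plain-Python equivalent of that keep-the-latest rule (field names here are illustrative, not from the posting) looks like:

```python
def dedupe_latest(rows, key_fields, order_field):
    """Keep one row per business key: the one with the greatest
    order_field value (e.g. an ingestion timestamp). Mirrors a
    row_number()-over-window dedup that filters to row 1."""
    best = {}
    for row in rows:
        k = tuple(row[f] for f in key_fields)
        # Later arrivals replace earlier ones only if strictly newer.
        if k not in best or row[order_field] > best[k][order_field]:
            best[k] = row
    return list(best.values())
```

In PySpark the same result typically comes from `Window.partitionBy(*key_fields).orderBy(col(order_field).desc())` followed by a filter on `row_number() == 1`.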

Preferred Qualifications

  • Certification in Databricks Data Engineer Professional or AWS/Azure Data Engineering.
  • Experience integrating Databricks with visualization tools such as Power BI or Tableau.
  • Exposure to CI/CD pipelines using GitHub Actions, Jenkins, or Azure DevOps.
  • Familiarity with Apache Airflow or other orchestration frameworks.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.

Why Join Us

  • Opportunity to work on high-impact enterprise data transformation projects.
  • Collaborate with global clients and cutting-edge technologies.
  • Be part of a dynamic and growing data engineering practice where innovation is valued.
  • Competitive compensation and growth opportunities for top performers.
