MLOps Engineer — Databricks

Experience: 0 years

Salary: 0 Lacs

Posted: 4 days ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Contractual

Job Description

MLOps Engineer — Databricks


Client:

Location:

Work Model:

Contract:

Start Date:

Engagement:


Role Overview

Databricks MLOps Developer


Key Responsibilities

1. Develop Scalable MLOps Pipelines

  • Build automated ML pipelines for training, validation, deployment, and batch/real-time inference.
  • Use Databricks Workflows, Jobs, Repos, and Delta Live Tables where applicable.
  • Implement distributed training and inference pipelines using MLflow + PySpark.

2. Model Lifecycle Management

  • Manage model versioning and promotion across dev → staging → production using MLflow Model Registry.
  • Create reproducible workflows for model packaging, deployment, and rollback.
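The dev → staging → production promotion described above usually hinges on a metric gate. A dependency-free sketch follows; the function names, metrics, and thresholds are all illustrative, and in a real Databricks setup the stage transition itself would be performed through the MLflow Model Registry client rather than a local list:

```python
from typing import Optional

# Illustrative promotion gate for a dev → staging → production flow.
# In practice the metrics would come from MLflow Tracking and the actual
# transition would go through the MLflow Model Registry API.

STAGES = ["dev", "staging", "production"]

def next_stage(current: str) -> Optional[str]:
    """Return the stage a model would be promoted into, or None if at the top."""
    i = STAGES.index(current)
    return STAGES[i + 1] if i + 1 < len(STAGES) else None

def should_promote(candidate_metric: float,
                   incumbent_metric: Optional[float],
                   min_improvement: float = 0.0) -> bool:
    """Promote if there is no incumbent, or the candidate beats it by a margin."""
    if incumbent_metric is None:
        return True
    return candidate_metric >= incumbent_metric + min_improvement

# Example: a staging candidate with AUC 0.91 vs a production incumbent at 0.88
if should_promote(0.91, 0.88, min_improvement=0.01):
    target = next_stage("staging")  # "production"
```

The same gate, run in reverse against a stored previous version, gives the rollback path the posting asks for.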

3. CI/CD Integration

  • Build and integrate ML pipelines with CI/CD using Azure DevOps, GitHub Actions, or Jenkins.
  • Automate testing, validation, and deployment for ML artifacts, notebooks, and infrastructure.

4. Feature Engineering & Data Pipelines

  • Collaborate with Data Engineering teams to build optimized Delta Lake pipelines (Bronze/Silver/Gold architecture).
  • Implement feature engineering workflows and support feature reuse at scale.
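The Bronze/Silver/Gold (medallion) layering mentioned above is normally built as PySpark transformations over Delta tables; the shape of the three layers can be sketched with plain Python records instead, so it runs anywhere (all field names here are invented for illustration):

```python
# Conceptual sketch of Bronze/Silver/Gold (medallion) layering.
# In a real pipeline these would be Delta tables transformed with PySpark;
# plain lists of dicts stand in here, and all field names are invented.

bronze = [  # Bronze: raw ingested events, kept as-is
    {"user": "a", "amount": "10.5", "ts": "2024-01-01"},
    {"user": "b", "amount": "bad",  "ts": "2024-01-01"},
    {"user": "a", "amount": "4.5",  "ts": "2024-01-02"},
]

def to_silver(rows):
    """Silver: cleaned and typed records; rows failing validation are dropped."""
    out = []
    for r in rows:
        try:
            out.append({"user": r["user"], "amount": float(r["amount"]), "ts": r["ts"]})
        except (KeyError, ValueError):
            continue  # e.g. the malformed "bad" amount above
    return out

def to_gold(rows):
    """Gold: business-level aggregate, e.g. total amount per user."""
    totals = {}
    for r in rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)  # 2 valid rows survive
gold = to_gold(silver)      # {"a": 15.0}
```

Each layer only reads from the one below it, which is what keeps the feature-engineering workflows reusable across models.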

5. Monitoring & Governance

  • Set up model monitoring for performance, drift, data quality, and lineage.
  • Use Databricks-native tools, MLflow metrics, and cloud monitoring services (Azure/AWS).
  • Ensure compliance through logging, auditing, permissions, and environment governance.
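Drift monitoring of the kind listed above often starts from a distribution-distance statistic; one common choice is the Population Stability Index (PSI). A minimal sketch follows; the binning scheme and the 0.1/0.25 thresholds are rule-of-thumb assumptions, and in Databricks this would typically run as a scheduled job logging the score as an MLflow metric:

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline and a current sample.
    Rule of thumb (not a standard): < 0.1 ≈ no drift, > 0.25 ≈ significant drift."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # avoid zero width for constant samples

    def fractions(sample):
        counts = [0] * bins
        for x in sample:
            counts[min(int((x - lo) / width), bins - 1)] += 1
        n = len(sample)
        # small epsilon keeps log() finite for empty bins
        return [max(c / n, 1e-6) for c in counts]

    e, a = fractions(expected), fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [float(i % 10) for i in range(1000)]     # stand-in training distribution
no_drift = psi(baseline, baseline)                  # ~0.0
drifted = psi(baseline, [x + 5 for x in baseline])  # well above 0.25
```

In production the baseline histogram would be snapshotted at training time, and the score compared against it per feature on each scoring batch.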

6. Cross-Functional Collaboration

  • Work closely with Data Scientists, Data Engineers, Cloud teams, and Product teams.
  • Document workflows, best practices, and reusable MLOps components.


Required Skills & Qualifications

  • Strong hands-on experience with Databricks (Workflows, Repos, Jobs, Compute)
  • Proficiency with MLflow (Tracking, Registry, Model Deployment)
  • Expertise in Delta Lake, PySpark, and distributed data pipelines
  • Solid programming skills in Python and SQL
  • Experience with CI/CD tools: Azure DevOps, GitHub Actions, Jenkins
  • Familiarity with cloud platforms: Azure, AWS, or GCP
  • Understanding of containerization (Docker) and orchestration (Kubernetes)
  • Background in ML model training, serving, and observability


Preferred Qualifications

  • Databricks certifications:
      • Databricks Certified Machine Learning Professional
      • Databricks Certified Data Engineer Associate/Professional
  • Experience with Unity Catalog for governance
  • Experience implementing feature stores
  • Knowledge of ML observability tools (WhyLabs, Monte Carlo, Arize AI, etc.)
