Databricks Administrator

Experience: 5 years
Posted: 3 days ago | Platform: LinkedIn
Work Mode: Remote
Job Type: Full Time

Job Description

Job Title: Databricks Administrator

Experience Required: 5+ Years

Location: Remote

Employment Type: Full-time


About the Role

We are looking for an experienced Databricks Administrator to manage and optimize our Databricks platform across cloud environments. You'll play a key role in ensuring high availability, cost efficiency, and data security for enterprise-scale analytics and AI workloads.


Key Responsibilities

  • Manage and administer Databricks workspaces, clusters, pools, and jobs across environments (Dev/QA/Prod).
  • Configure and maintain user access control, groups, and permissions using RBAC, SCIM, and IAM integrations.
  • Implement and manage Unity Catalog for data governance, lineage, and access control.
  • Collaborate with data engineers and analysts to optimize Spark cluster performance and job execution.
  • Integrate Databricks with cloud services such as Azure AD, AWS IAM, S3, ADLS, Key Vault, Secrets Manager, and Snowflake.
  • Automate provisioning and configuration using Terraform, Databricks CLI, or REST APIs.
  • Monitor platform performance, set up alerts, and manage audit logs and cost-optimization dashboards.
  • Ensure compliance with data security, encryption, and networking best practices (VNet/VPC, Private Link, IP Access Lists).
  • Provide troubleshooting and support for cluster failures, job issues, and workspace connectivity.
  • Collaborate with DevOps, Cloud, and Data teams to improve CI/CD pipelines and deployment automation.


Required Skills & Qualifications

  • 5+ years of experience administering Databricks environments in Azure, AWS, or GCP.
  • Hands-on experience with cluster policies, pools, instance profiles, and job scheduling.
  • Expertise in cloud identity management (Azure AD, AWS IAM, Okta) and SCIM provisioning.
  • Solid understanding of Spark architecture, data pipelines, and Delta Lake concepts.
  • Experience with Terraform, Databricks CLI, or REST APIs for automation.
  • Knowledge of networking, VPC/VNet security, and private endpoints.
  • Familiarity with cost-monitoring tools and cluster optimization strategies.
  • Proficiency in Python or Bash scripting for automation and operational tasks.
  • Strong analytical and problem-solving skills, with attention to detail.
  • Excellent communication and collaboration skills in cross-functional environments.


Preferred Qualifications

  • Experience with Unity Catalog and Delta Sharing.
  • Knowledge of CI/CD tools (Azure DevOps, Jenkins, GitHub Actions).
  • Familiarity with data governance frameworks and compliance standards (GDPR, SOC 2).
  • Certification in Databricks, AWS, or Azure (e.g., Azure Data Engineer Associate) is a plus.
