Posted: 3 days ago | Platform: SimplyHired

Work Mode: Remote

Job Type: Full Time

Job Description

**Must be a Databricks SME**

Location: Offshore (anywhere in India, remote). Must work in EST time (US shift).

Requires 12+ years of experience.

5 Must-Haves:

1. Data expertise: hands-on work with Azure Databricks and pipelines, including shutting down clusters (a sketch follows this list); 2 or more years of experience.

2. Unity Catalog migration: well versed; has done Terraform scripting in DevOps; able to write and understand code, grasp the logic behind the scenes, and automate functionality.

3. Terraform expertise: building infrastructure code; 3 or more years.

4. Clear understanding of data mesh architecture: decoupling applications and enabling workloads to run in parallel; 2+ years of experience on the Microsoft Azure cloud platform.

5. Strong problem solver.
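
To make the cluster-shutdown expectation in item 1 concrete, here is a minimal sketch that terminates running Azure Databricks clusters through the Databricks REST API. The environment variable names and the "terminate everything that is RUNNING" policy are illustrative assumptions, not requirements from this posting.

```python
import os

import requests

# Illustrative sketch only; variable names and the shutdown policy are assumptions.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.<region>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]  # PAT or Azure AD token for the workspace
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def shut_down_running_clusters() -> None:
    """Terminate every cluster currently in the RUNNING state."""
    resp = requests.get(f"{HOST}/api/2.0/clusters/list", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    for cluster in resp.json().get("clusters", []):
        if cluster.get("state") == "RUNNING":
            requests.post(
                f"{HOST}/api/2.0/clusters/delete",  # "delete" terminates; it does not permanently remove
                headers=HEADERS,
                json={"cluster_id": cluster["cluster_id"]},
                timeout=30,
            ).raise_for_status()


if __name__ == "__main__":
    shut_down_running_clusters()
```

In practice the same loop is usually scheduled (or replaced with cluster auto-termination settings) rather than run by hand; the REST calls above are shown only to illustrate the kind of automation the role expects.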

Key Responsibilities:

  • Architect, configure, & optimize Databricks Pipelines for large-scale data processing within an Azure Data Lakehouse environment.
  • Set up & manage Azure infrastructure components including Databricks Workspaces, Azure Containers (AKS/ACI), Storage Accounts, & Networking.
  • Design & implement a monitoring & observability framework using tools like Azure Monitor, Log Analytics, & Prometheus / Grafana.
  • Collaborate with platform & data engineering teams to enable microservices-based architecture for scalable & modular data solutions.
  • Drive automation & CI / CD practices using Terraform, ARM templates, & GitHub Actions/Azure DevOps.

Required Skills & Experience:

  • Strong hands-on experience with Azure Databricks, Delta Lake, & Apache Spark.
  • Deep understanding of Azure services: Resource Manager, AKS, ACR, Key Vault, & Networking.
  • Proven experience in microservices architecture & container orchestration.
  • Expertise in infrastructure-as-code, scripting (Python, Bash), & DevOps tooling.
  • Familiarity with data governance, security, & cost optimization in cloud environments.

Bonus:

  • Experience with event-driven architectures (Kafka/Event Grid).
  • Knowledge of data mesh principles & distributed data ownership.

Interview: Two rounds (first with the manager, second with the team)

Job Type: Full-time

Pay: ₹3,400,000.00 - ₹4,500,000.00 per year

Schedule:

  • US shift
