Databricks (DLT, Unity Catalog, Workflows, SQL Warehouse)

8 years

0 Lacs

Posted: 1 month ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Contractual

Job Description

Detailed JD (Roles and Responsibilities)

  1. Lead the design and implementation of advanced data solutions on Databricks.
  2. Architect scalable, secure, and efficient data platforms using Spark, Python, and SQL.
  3. Optimize ETL processes and data workflows for performance and reliability.
  4. Collaborate with data engineers, data scientists, and business stakeholders to understand requirements and deliver robust solutions.
  5. Set up and manage CI/CD pipelines and enforce data governance standards.
  6. Monitor and troubleshoot data pipelines to ensure data quality and integrity.
  7. Mentor junior team members and foster a culture of continuous learning.
  8. Stay updated with emerging technologies and promote innovation in data practices.

Total Experience

8 years

Relevant Experience

5 years

Mandatory skills

Deep expertise in Databricks (DLT, Unity Catalog, Workflows, SQL Warehouse).

Strong programming skills in Python, PySpark, and SQL.

Experience with cloud platforms (AWS, Azure, GCP).

Familiarity with data governance, security policies (RBAC), and performance tuning.

Excellent problem-solving, communication, and leadership skills.

Domain (Industry)

Manufacturing (MFG)
