Databricks Data Engineer

5 - 7 years


Posted: 3 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Title: Databricks Data Engineer

Experience Level: 5-7 years

Location:

Employment Type: Full Time

Certifications Required:

  • Databricks Certified Data Engineer Associate
  • Databricks Certified Data Engineer Professional
  • Cloud Certifications (Preferred): Azure, AWS, GCP

Job Summary:

We are seeking a highly skilled and certified Databricks Data Engineer to join our dynamic data engineering team. The ideal candidate will have hands-on experience in implementing Lakehouse architectures, upgrading to Unity Catalog, and building robust data ingestion pipelines. This role demands proficiency in Python, PySpark, SQL, and Scala, along with a strong understanding of big data technologies, streaming workflows, and multi-cloud environments.

Key Responsibilities:

Lakehouse Implementation:

  • Design and implement scalable Lakehouse architecture using Databricks.
  • Optimize data storage and retrieval strategies for performance and cost-efficiency.

Unity Catalog Upgrade:

  • Lead and execute Unity Catalog upgrades across Databricks workspaces.
  • Ensure secure and compliant data governance and access control.

Data Ingestion & Migration:

  • Develop and maintain data ingestion pipelines from various sources (structured, semi-structured, unstructured).
  • Perform large-scale data migrations across cloud platforms and environments.

Pipeline Development:

  • Build and manage ETL/ELT pipelines using PySpark and SQL.
  • Ensure data quality, reliability, and performance across workflows.

Big Data Streaming & Workflows:

  • Implement real-time data streaming solutions using Spark Structured Streaming or similar technologies.
  • Design workflow orchestration using tools like Airflow, Databricks Workflows, or equivalent.

Multi-Cloud Expertise (Preferred):

  • Work across Azure, AWS, and GCP environments.
  • Understand cloud-native services and integration patterns for data engineering.

Collaboration & Documentation:

  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
  • Document technical designs, data flows, and operational procedures.

Required Skills & Qualifications:

  • 5-7 years of hands-on experience in data engineering roles.
  • Strong expertise in Databricks platform and ecosystem.
  • Proficiency in Python, PySpark, SQL, and Scala.
  • Experience with Lakehouse architecture and Unity Catalog.
  • Proven track record in building scalable data ingestion pipelines and performing data migrations.
  • Familiarity with big data technologies such as Delta Lake, Apache Spark, Kafka, etc.
  • Understanding of data lake concepts and best practices.
  • Experience with streaming data and workflow orchestration.
  • Certified as Databricks Data Engineer Associate and Professional (mandatory).
  • Cloud certifications in Azure, AWS, or GCP are a strong plus.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration abilities.

Nice to Have:

  • Experience with CI/CD pipelines and DevOps practices in data engineering.
  • Exposure to data cataloging tools and metadata management.
  • Knowledge of data security, privacy, and compliance standards.
