Databricks Certified Developer (DE)

5 - 10 years

7 - 12 Lacs

Posted: 6 days ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Azure Databricks + SQL - Neev Systems

Role: Azure Databricks + SQL Developer / Big Data Engineer

Key Responsibilities

  • Develop, maintain, and optimize ETL/ELT pipelines using Azure Databricks (PySpark/Spark SQL).
  • Write and optimize complex SQL queries, stored procedures, triggers, and functions in Microsoft SQL Server.
  • Design and build scalable, metadata-driven ingestion pipelines for both batch and streaming datasets.
  • Perform data integration and harmonization across multiple structured and unstructured data sources.
  • Implement orchestration, scheduling, exception handling, and log monitoring for robust pipeline management.
  • Collaborate with peers to evaluate and select appropriate tech stack and tools.
  • Work closely with business, consulting, data science, and application development teams to deliver analytical solutions within timelines.
  • Support performance tuning, troubleshooting, and debugging of Databricks jobs and SQL queries.
  • Work with other Azure services such as Azure Data Factory, Azure Data Lake, Synapse Analytics, Event Hub, Cosmos DB, Streaming Analytics, and Purview when required.
  • Support BI and Data Science teams in consuming data securely and in compliance with governance standards.
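The "metadata-driven ingestion" responsibility above refers to pipelines whose sources are described by configuration rather than hard-coded per source. A minimal stdlib-Python sketch of the pattern (the metadata schema and source names here are illustrative; in the actual role this would be implemented with PySpark on Azure Databricks):

```python
import csv
import io
import json

# Hypothetical metadata describing each source: its name, format, and the
# columns the pipeline should keep. Adding a new source means adding an
# entry here, not writing new parsing code.
PIPELINE_METADATA = [
    {"name": "sales", "format": "csv", "columns": ["id", "amount"]},
    {"name": "events", "format": "json", "columns": ["id", "type"]},
]

def ingest(source_meta, raw_text):
    """Parse one source according to its metadata entry, projecting to the declared columns."""
    if source_meta["format"] == "csv":
        rows = list(csv.DictReader(io.StringIO(raw_text)))
    elif source_meta["format"] == "json":
        rows = json.loads(raw_text)
    else:
        raise ValueError(f"unsupported format: {source_meta['format']}")
    # Keep only the columns declared in the metadata.
    return [{col: row.get(col) for col in source_meta["columns"]} for row in rows]

sales = ingest(PIPELINE_METADATA[0], "id,amount\n1,9.50\n2,3.25\n")
events = ingest(PIPELINE_METADATA[1], '[{"id": 1, "type": "click"}]')
print(sales)
print(events)
```

The same driver loop can then iterate over the metadata list for both batch and streaming sources, which is what makes the pipeline scale to many sources without per-source code.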

Required Skills & Experience

  • 5-9 years of overall IT experience with at least 4+ years in Big Data Engineering on Microsoft Azure.
  • Proficiency in Microsoft SQL Server (T-SQL): stored procedures, indexing, optimization, and performance tuning.
  • Strong experience with Azure Data Factory (ADF), Databricks, ADLS, PySpark, and Azure SQL Database.
  • Working knowledge of Azure Synapse Analytics, Event Hub, Streaming Analytics, Cosmos DB, and Purview.
  • Proficiency in SQL, Python, and either Scala or Java with debugging and performance optimization skills.
  • Hands-on experience with big data technologies such as Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4J, and Elastic Search.
  • Strong understanding of file formats such as Delta Lake, Avro, Parquet, JSON, and CSV.
  • Solid background in data modeling, data transformation, and data governance best practices.
  • Experience designing and building REST APIs with practical exposure to Data Lake or Lakehouse projects.
  • Ability to work with large and complex datasets, ensuring data quality, governance, and security standards.
  • Certifications such as DP-203: Data Engineering on Microsoft Azure or Databricks Certified Developer (DE) are a plus.
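The skills list stresses SQL indexing and performance tuning. A minimal, runnable illustration of why indexing matters, using Python's built-in sqlite3 rather than SQL Server (the table and index names are made up for the example; the same principle applies to T-SQL):

```python
import sqlite3

# In-memory database with a small orders table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [(f"cust{i % 100}", i * 1.5) for i in range(1000)],
)

# Without an index on `customer`, the filter forces a full table scan.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'cust7'"
).fetchall()

# With an index, the engine can seek directly to the matching rows.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'cust7'"
).fetchall()

print(plan_before)  # plan detail typically reports a SCAN of orders
print(plan_after)   # plan detail typically reports a SEARCH USING INDEX
```

In SQL Server the equivalent check is done with the execution plan viewer or `SET SHOWPLAN_ALL ON`, comparing a clustered/table scan against an index seek.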
Neev Systems

Information Technology

Bangalore
