Posted: 1 day ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Job Summary:


We are seeking a skilled and experienced Azure Databricks Engineer to join our growing data engineering team. The ideal candidate will have deep hands-on expertise in building scalable data pipelines and streaming architectures using Azure-native technologies. Prior experience in the banking or financial services domain is highly desirable, as you will be working with critical data assets and supporting regulatory, risk, and operational reporting use cases.


Key Responsibilities:


  • Design, develop, and optimize data pipelines using Databricks (PySpark) for batch and real-time data processing.
  • Implement CDC (Change Data Capture) and Delta Live Tables/Autoloader to support near-real-time ingestion (an illustrative Autoloader sketch follows this list).
  • Integrate various structured and semi-structured data sources using ADF, ADLS, and Kafka (Confluent).
  • Develop CI/CD pipelines for data engineering workflows using GitHub Actions or Azure DevOps.
  • Write efficient and reusable SQL and Python code for data transformations and validations.
  • Ensure data quality, lineage, governance, and security across all ingestion and transformation layers.
  • Collaborate closely with business analysts, data scientists, and data stewards to support use cases in risk, finance, compliance, and operations.
  • Participate in code reviews, architectural discussions, and documentation efforts.
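
For illustration only, the sketch below shows one way a Databricks Autoloader stream from ADLS Gen2 into a Delta table might look. All storage paths, container names, and table names are hypothetical placeholders, not details taken from this role.

```python
# Minimal sketch, assuming a JSON landing zone in ADLS Gen2 and a bronze Delta table.
# Paths and names below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://raw@<storage_account>.dfs.core.windows.net/transactions/"          # hypothetical
checkpoint = "abfss://meta@<storage_account>.dfs.core.windows.net/_chk/transactions/"  # hypothetical

stream = (
    spark.readStream
    .format("cloudFiles")                         # Auto Loader source
    .option("cloudFiles.format", "json")          # format of incoming files
    .option("cloudFiles.schemaLocation", checkpoint)
    .load(raw_path)
)

(
    stream.writeStream
    .format("delta")
    .option("checkpointLocation", checkpoint)
    .trigger(availableNow=True)                   # process available files, then stop
    .toTable("bronze.transactions")               # hypothetical bronze target table
)
```

In practice the trigger mode, file format, and schema evolution settings would follow the team's own ingestion conventions.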


Required Skills & Qualifications:


  • Strong proficiency in SQL, Python, and PySpark.
  • Proven experience with Azure Databricks, including notebooks, jobs, clusters, and Delta Lake.
  • Experience with Azure Data Lake Storage (ADLS Gen2) and Azure Data Factory (ADF).
  • Hands-on with Confluent Kafka for streaming data integration.
  • Strong understanding of Autoloader, CDC mechanisms, and Delta Lake-based architecture (a CDC merge sketch follows this list).
  • Experience implementing CI/CD pipelines using GitHub and/or Azure DevOps.
  • Knowledge of data modeling, data warehousing, and data security best practices.
  • Exposure to regulatory and risk data use cases in the banking/financial sector is a strong plus.
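
As a rough illustration of the CDC and Delta Lake skills above, the sketch below applies hypothetical change records to a Delta table with a MERGE. The table names, key column, and `op` flag convention are assumptions for the example, not part of this posting.

```python
# Minimal sketch, assuming a CDC feed with an 'op' column ('I'/'U'/'D') and a
# customer_id business key. All table names are hypothetical.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

updates = spark.table("bronze.customer_cdc")           # hypothetical CDC feed
target = DeltaTable.forName(spark, "silver.customer")  # hypothetical curated table

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedDelete(condition="s.op = 'D'")         # apply deletes
    .whenMatchedUpdateAll(condition="s.op = 'U'")      # apply updates
    .whenNotMatchedInsertAll(condition="s.op = 'I'")   # apply inserts
    .execute()
)
```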


Preferred Qualifications:


  • Azure certifications (e.g., Azure Data Engineer Associate).
  • Experience with tools such as Delta Live Tables, Unity Catalog, and Lakehouse architecture.
  • Familiarity with business glossaries, data lineage tools, and data governance frameworks.
  • Understanding of financial data including GL, loan, customer, transaction, or market risk domains.
