Data Engineer with Databricks

Experience: 7 years

Posted: 2 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Contact: kumar@zscalellc.com


Role: Data Engineer with Databricks

Total Experience: 5 – 7 years

Relevant Experience: 2+ years

Notice Period: Immediate to 30 Days

Location: Hyderabad, Telangana, India

Mandatory core skills:


* Azure Cloud, Databricks Workflows, DLT, Notebooks, Python, SQL, PySpark, ADLS Gen2, ADF, GitHub, PL/SQL, Oracle, ETL experience, and an understanding of Delta Lake architecture, CDC patterns, and the Lakehouse.



* Ability to design and orchestrate data pipelines using Databricks Workflows and Delta Live Tables (DLT), with a strong understanding of the Medallion Architecture (see the DLT sketch after this list).

* Expertise in developing Databricks notebooks for scalable solutions using Python, SQL, and PySpark.

* Understanding of Delta Lake architecture, CDC patterns, and Lakehouse.

* Strong understanding of key Delta table features such as ACID transactions, time travel, schema enforcement, and deep and shallow clones (see the Delta Lake sketch after this list).

* Performance tuning using liquid clustering, partitioning, Z-ordering, and data skipping in Delta tables.

* Knowledge of data governance (Unity Catalog), data security (RBAC, fine-grained access control), and data sharing (Delta Sharing); see the Unity Catalog sketch after this list.

* Proficiency with Azure Data Lake Storage Gen2 (ADLS Gen2) and Azure Data Factory (ADF), and with Terraform for provisioning and managing Azure resources.

* Knowledge of Spark Structured Streaming and Auto Loader in Databricks (see the streaming sketch after this list).

* Strong experience in analyzing and understanding legacy Informatica ETL workflows, including mappings, transformations, and data flow logic, to support seamless migration to Databricks-based data pipelines.

* Hands-on experience in implementing CI/CD pipelines using Jenkins to automate deployment of Databricks notebooks, jobs, and data workflows.

* Experience integrating GitHub with Databricks Repos to enable seamless code synchronization, change tracking, and automated deployment workflows.

* Knowledge of Snowflake, Oracle, MySQL, and shell scripting for diverse data integration.

* Knowledge of Power BI and Azure Synapse Analytics for data analytics dashboards and reports.
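
For illustration, here is a minimal sketch of the kind of Delta Live Tables (DLT) pipeline the Medallion bullet above refers to, moving data through bronze, silver, and gold layers. The landing path, table names, and columns are hypothetical, and `spark` is the session the DLT runtime provides inside a pipeline notebook.

    # Minimal DLT Medallion sketch (hypothetical paths, tables, and columns).
    # `spark` is provided by the DLT runtime in a pipeline notebook.
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Bronze: raw JSON events ingested as-is via Auto Loader")
    def bronze_events():
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("abfss://landing@examplestore.dfs.core.windows.net/events/")
        )

    @dlt.table(comment="Silver: cleaned, typed, de-duplicated events")
    @dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
    def silver_events():
        return (
            dlt.read_stream("bronze_events")
            .withColumn("event_ts", F.to_timestamp("event_time"))
            .dropDuplicates(["event_id"])
        )

    @dlt.table(comment="Gold: daily event counts for reporting")
    def gold_daily_event_counts():
        return (
            dlt.read("silver_events")
            .groupBy(F.to_date("event_ts").alias("event_date"))
            .count()
        )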
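
A similar Delta Lake sketch covers the table features and layout tuning listed above (time travel, shallow clones, Z-ordering, liquid clustering). The table names sales and sales_dev are hypothetical, and Z-ordering and liquid clustering are shown on one table only for brevity; in practice they are alternative layout strategies.

    # Delta Lake feature sketch (hypothetical table names).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Time travel: query an earlier version of a Delta table.
    v0 = spark.sql("SELECT * FROM sales VERSION AS OF 0")

    # Shallow clone: zero-copy snapshot, handy for dev/test experiments.
    spark.sql("CREATE TABLE IF NOT EXISTS sales_dev SHALLOW CLONE sales")

    # Z-ordering: co-locate rows on a frequently filtered column to improve data skipping.
    spark.sql("OPTIMIZE sales ZORDER BY (customer_id)")

    # Liquid clustering: on newer runtimes, an alternative to static partitioning/Z-ordering.
    spark.sql("ALTER TABLE sales CLUSTER BY (customer_id)")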
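
For the governance bullet, a Unity Catalog sketch of fine-grained access control expressed as SQL grants; the catalog, schema, table, and group names are hypothetical.

    # Unity Catalog access-control sketch (hypothetical catalog/schema/group names).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data_engineers`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA analytics.sales TO `data_engineers`")
    spark.sql("GRANT MODIFY ON TABLE analytics.sales.orders TO `data_engineers`")

    # Read-only consumers also need USE CATALOG / USE SCHEMA on the parent objects.
    spark.sql("GRANT SELECT ON SCHEMA analytics.sales TO `bi_readers`")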
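
And a streaming sketch showing Auto Loader with Spark Structured Streaming outside DLT, using an availableNow trigger for incremental, batch-style runs; the storage paths and target table are hypothetical.

    # Auto Loader + Structured Streaming sketch (hypothetical paths and target table).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    orders_stream = (
        spark.readStream.format("cloudFiles")              # Auto Loader source
        .option("cloudFiles.format", "csv")
        .option("cloudFiles.schemaLocation",
                "abfss://landing@examplestore.dfs.core.windows.net/_schemas/orders")
        .load("abfss://landing@examplestore.dfs.core.windows.net/orders/")
    )

    (
        orders_stream.writeStream
        .option("checkpointLocation",
                "abfss://landing@examplestore.dfs.core.windows.net/_checkpoints/orders")
        .trigger(availableNow=True)                        # process new files, then stop
        .toTable("bronze.orders")
    )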


Skill summary:

Cloud Platform:         Azure (ADLS Gen2, ADF); Terraform
Data Platform:          Databricks (Workflows, DLT, Notebooks)
Languages/Frameworks:   Python, SQL, PySpark; PL/SQL, Shell Scripting
Architecture:           Delta Lake, Medallion Architecture, Lakehouse; ETL/ELT
DevOps/CI/CD:           GitHub, Jenkins, Databricks Repos
Data Governance:        Unity Catalog, Delta Sharing
Databases:              Oracle, ADLS Gen2; Snowflake, MySQL
Analytics:              Azure Synapse Analytics, Power BI
