7 - 12 years
7 - 12 Lacs
Posted: 4 weeks ago
Work from Office
Full Time
We are seeking a highly skilled Azure Databricks Engineering Lead to design, develop, and optimize data pipelines using Azure Databricks. The ideal candidate will have deep expertise in data engineering, cloud-based data processing, and ETL workflows to support business intelligence and analytics initiatives.

Primary Responsibilities
- Design, develop, and implement scalable data pipelines using Azure Databricks
- Develop PySpark-based data transformations and integrate structured and unstructured data from various sources
- Optimize Databricks clusters for performance, scalability, and cost-efficiency within the Azure ecosystem
- Monitor, troubleshoot, and resolve performance bottlenecks in Databricks workloads
- Manage orchestration and scheduling of end-to-end data pipelines using tools such as Apache Airflow, ADF scheduling, and Logic Apps
- Collaborate effectively with the architecture team on solution design and with product owners on validating implementations
- Implement best practices for data quality, monitoring, logging, alerting on failure scenarios, and exception handling
- Document step-by-step procedures for troubleshooting potential issues and deliver cost-optimized cloud solutions
- Provide technical leadership, mentorship, and best practices for junior data engineers
- Stay up to date with Azure and Databricks advancements to continuously improve data engineering capabilities
Required Qualifications
- Overall 7+ years of experience in the IT industry and 6+ years in data engineering, with at least 3 years of hands-on experience in Azure Databricks
- Experience with CI/CD pipelines for data engineering solutions (Azure DevOps, Git)
- Hands-on experience with Delta Lake, Lakehouse architecture, and data versioning
- Solid expertise in the Azure ecosystem, including Azure Synapse, Azure SQL, ADLS, and Azure Functions
- Proficiency in PySpark, Python, and SQL for data processing in Databricks
- Deep understanding of data warehousing, data modeling (Kimball/Inmon), and big data processing
- Solid knowledge of performance tuning, partitioning, caching, and cost optimization in Databricks
- Excellent written and verbal communication skills
- Excellent problem-solving skills and the ability to work independently
- Ability to balance multiple competing priorities and execute accordingly
- Highly self-motivated, with excellent interpersonal and collaborative skills
- Ability to anticipate risks and obstacles and develop mitigation plans
- Excellent documentation experience and skills

Preferred Qualifications
- Azure certifications (e.g., DP-203, AZ-304)
- Experience with infrastructure as code, scheduling as code, and automating operational activities using Terraform scripts
Optum