Azure + Databricks Engineer

Experience: 5 years

Salary: 4 - 6 Lacs

Posted: 1 day ago | Platform: GlassDoor

Work Mode: On-site

Job Type: Part Time

Job Description

Experience: 5 to 8 years
Location: Bengaluru, Gurgaon, Pune
Job code: 101299
Posted on: Aug 04, 2025

About Us:
AceNet Consulting is a fast-growing global business and technology consulting firm providing business strategy, digital transformation, technology consulting, product development, start-up advisory, and fund-raising services to global clients across banking & financial services, healthcare, supply chain & logistics, consumer retail, manufacturing, eGovernance, and other industry sectors. We are looking for hungry, highly skilled, and motivated individuals to join our dynamic team. If you’re passionate about technology and thrive in a fast-paced environment, we want to hear from you.

Job Summary:
We are seeking a skilled and experienced Azure + Databricks Engineer to design, develop, and maintain scalable data solutions in cloud-based environments. This role requires a deep understanding of Azure services, the Databricks platform, data engineering best practices, and cloud infrastructure. The ideal candidate will have experience building data pipelines, implementing data governance, and optimizing data workflows in enterprise-scale environments.

Key Responsibilities:
  • Build and maintain data pipelines using Azure Data Factory and Databricks.
  • Develop scalable ETL/ELT workflows with Spark (PySpark/Scala); an illustrative sketch follows this list.
  • Manage and optimize Databricks clusters and job performance.
  • Ensure data security, governance, and compliance (RBAC, Unity Catalog).
  • Collaborate with cross-functional teams and document data solutions.
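
For context, the pipeline work described above typically looks something like the following PySpark sketch. It is a minimal, illustrative example only: the storage path, table name, and column names are hypothetical, and it assumes a Databricks environment where Delta Lake and Unity Catalog are already configured.

    # Minimal illustrative ETL sketch (hypothetical path, table, and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # returns the active session on Databricks

    raw_path = "abfss://raw@<storage-account>.dfs.core.windows.net/orders/"  # hypothetical ADLS Gen2 path
    curated_table = "main.sales.orders_curated"                              # hypothetical Unity Catalog table

    orders = spark.read.format("json").load(raw_path)

    curated = (
        orders
        .withColumn("order_date", F.to_date("order_ts"))  # derive a partition-friendly date column
        .dropDuplicates(["order_id"])                      # basic deduplication
        .filter(F.col("amount") > 0)                       # simple data-quality rule
    )

    (
        curated.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable(curated_table)  # lands as a managed Delta table
    )
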
Role Requirements and Qualifications:
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Technology, or a related field.
  • 4–8 years of hands-on experience with Azure and Databricks.
  • Proven experience in Azure cloud services with a focus on data engineering and analytics.
  • Strong hands-on experience with Databricks, Apache Spark, and Delta Lake.
  • Proficiency in Python, SQL, and optionally Scala or PySpark.
  • Solid understanding of CI/CD pipelines and Terraform or ARM templates for infrastructure automation.
  • Strong knowledge of data warehousing concepts and data lake architectures.
  • Experience with Databricks REST APIs, Azure DevOps, and monitoring tools (Log Analytics, Azure Monitor); a minimal REST-call sketch follows this list.
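
As a rough illustration of the last point, an existing Databricks job can be triggered through the Jobs 2.1 REST API with a few lines of Python. The workspace URL, job ID, and token environment variable below are placeholders, not values from this posting.

    # Minimal sketch: trigger an existing Databricks job via the Jobs 2.1 REST API.
    # Workspace URL, job ID, and the DATABRICKS_TOKEN environment variable are placeholders.
    import os
    import requests

    workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
    token = os.environ["DATABRICKS_TOKEN"]                                # personal access token (or AAD token)

    response = requests.post(
        f"{workspace_url}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        json={"job_id": 123},  # hypothetical job ID
        timeout=30,
    )
    response.raise_for_status()
    print("Triggered run:", response.json()["run_id"])  # the API returns the id of the new run
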
Why Join Us:
  • Opportunities to work on transformative projects, cutting-edge technology and innovative solutions with leading global firms across industry sectors.
  • Continuous investment in employee growth and professional development, with a strong focus on upskilling and reskilling.
  • Competitive compensation & benefits, ESOPs and international assignments.
  • Supportive environment with healthy work-life balance and a focus on employee well-being.
  • Open culture that values diverse perspectives, encourages transparent communication and rewards contributions.

How to Apply:
If you are interested in joining our team and meet the qualifications listed above, please apply and submit your resume highlighting why you are the ideal candidate for this position.
