Posted: 2 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Contact: kumar@ZScaleLLC.com


Role: Data Engineer

Experience: 5–7 years

Notice Period: 2 Weeks

Location: Hyderabad, Telangana, India

Mandatory Qualifications


This role is ideal for someone passionate about building scalable data solutions, enabling analytics, and driving innovation in a modern data platform.

1. 5–7 years of experience as a Data Engineer, with at least 3 years working in Databricks

2. Strong proficiency in Python, PySpark, and Spark SQL

3. Hands-on experience with DBT in a cloud data platform

4. Experience with Databricks Asset Bundles (DABS) for workflow packaging and deployment

5. Proven expertise in Azure DevOps, Git, and CI/CD pipeline development

6. Solid understanding of data modeling, ETL/ELT, and performance optimization

7. Experience implementing monitoring and observability for data pipelines

8. Excellent communication and collaboration skills

9. The work environment is generally favourable: lighting and temperature are adequate, and there are no hazardous or unpleasant conditions caused by noise, dust, etc.

10. Specific vision abilities required by this job include close vision, distance vision, color vision, peripheral vision, depth perception, and the ability to adjust focus

11. Frequently required to sit and/or stand
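To illustrate the asset-bundle requirement above, a minimal `databricks.yml` might look like the following sketch. All bundle, job, and path names here are hypothetical, and the workspace host is left as a placeholder; the full schema is defined in the Databricks Asset Bundles documentation.

```yaml
# Minimal Databricks Asset Bundle sketch (hypothetical names).
bundle:
  name: sales_etl_bundle

resources:
  jobs:
    nightly_etl:
      name: nightly-sales-etl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform_sales.py

targets:
  dev:
    mode: development
    workspace:
      host: https://<your-workspace-host>
  prod:
    mode: production
```

A bundle like this is typically validated and deployed per target with the Databricks CLI (for example, `databricks bundle deploy -t dev`), which is what ties DABS into the CI/CD workflow the posting describes.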


Responsibilities


1. Design and develop scalable data pipelines using PySpark, Spark SQL, and Python within the Databricks environment


2. Build modular, version-controlled transformation workflows using DBT (Data Build Tool)


3. Package and deploy Databricks workflows and notebooks using Databricks Asset Bundles (DABS) for CI/CD and environment management


4. Integrate Databricks workflows with Azure DevOps for automated testing, deployment, and version control


5. Develop and maintain robust CI/CD pipelines for data engineering workflows using Git, Azure DevOps, and DABS


6. Implement and optimize dimensional data models, ELT/ETL processes, and performance tuning in cloud data platforms


7. Collaborate with data scientists, analysts, and business stakeholders to deliver high-impact data solutions


8. Implement logging, alerting, and monitoring for data pipelines using tools like Databricks Jobs, MLflow, or Azure Monitor


9. Reasonable accommodations will be evaluated and may be implemented to enable individuals with disabilities to perform essential functions of this position


10. This job operates in a professional office environment and routinely uses standard office equipment such as computers, phones, photocopiers, and filing cabinets.
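The logging and monitoring responsibility above (item 8) can be sketched generically in plain Python. This is a stdlib-only illustration of the pattern, not a Databricks API: `run_with_monitoring` and the step functions are hypothetical, and in a real deployment this would sit alongside Databricks Jobs alerts or Azure Monitor rather than replace them.

```python
import logging
import time

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")


def run_with_monitoring(step_name, fn, *args, **kwargs):
    """Run one pipeline step, logging its duration and any failure."""
    start = time.monotonic()
    try:
        result = fn(*args, **kwargs)
    except Exception:
        log.exception("step %s failed after %.2fs",
                      step_name, time.monotonic() - start)
        raise
    log.info("step %s succeeded in %.2fs",
             step_name, time.monotonic() - start)
    return result


def extract():
    # Hypothetical extract step: pretend we pulled three rows.
    return [{"id": 1}, {"id": 2}, {"id": 3}]


def transform(rows):
    # Hypothetical transform step: tag each row as processed.
    return [dict(r, processed=True) for r in rows]


rows = run_with_monitoring("extract", extract)
rows = run_with_monitoring("transform", transform, rows)
```

Wrapping each step this way gives per-step timing and failure logs with no framework dependencies; the same shape works whether the step body is a PySpark job, a DBT invocation, or a plain function.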
