Posted: 1 month ago | Platform: Glassdoor


Work Mode

On-site

Job Type

Full Time

Job Description

We are seeking a skilled Data Engineer with strong experience in Azure Data Services, Databricks, SQL, and PySpark to join our data engineering team. The ideal candidate will be responsible for building robust, scalable data pipelines and solutions that support advanced analytics and business intelligence initiatives.

Key Responsibilities

- Design and implement scalable, secure data pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
- Develop and maintain efficient ETL/ELT workflows into and within Databricks.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Develop and optimize data transformation scripts using PySpark (a brief example follows this description).
- Ensure data quality, data governance, and performance optimization across all pipelines.
- Collaborate with data architects, analysts, and business stakeholders to deliver reliable data solutions.
- Perform data modelling and design for both structured and semi-structured data.
- Monitor data pipelines and troubleshoot issues to ensure data integrity and timely delivery.
- Contribute to best practices in cloud data architecture and engineering.

Required Skills

- 4–8 years of experience in data engineering or related fields.
- Strong experience with Azure Data Services (ADF, Synapse, Databricks, Azure Storage).
- Proficiency with the Snowflake data warehouse, including data ingestion, Snowpipe, and streams & tasks.
- Advanced SQL skills, including performance tuning and complex query building.
- Hands-on experience with PySpark for large-scale data processing and transformation.
- Experience with ETL/ELT frameworks, orchestration, and scheduling.
- Familiarity with data modelling concepts (dimensional/star schema).
- Good understanding of data security, role-based access, and auditing in Snowflake and Azure.

Preferred/Good to Have

- Experience with CI/CD pipelines and DevOps for data workflows.
- Exposure to Power BI or similar BI tools.
- Familiarity with Git, Terraform, or other infrastructure-as-code (IaC) tooling in cloud environments.
- Experience with Agile/Scrum methodologies.

Job Type: Full-time
Work Location: In person
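A minimal PySpark sketch of the kind of transformation work named in the responsibilities, assuming a Databricks environment with Delta storage; all paths, table names, and column names are hypothetical placeholders, not details from the posting.

# Sketch: cleanse raw orders and build a daily aggregate for a star-schema fact table.
# Paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_orders_aggregate").getOrCreate()

# Read the raw orders data from a hypothetical Delta location.
orders = spark.read.format("delta").load("/mnt/raw/orders")

# Keep completed orders, derive an order date, and aggregate per customer per day.
daily = (
    orders
    .filter(F.col("order_status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "customer_id")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Write the curated result to a hypothetical Delta target for downstream BI use.
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_orders")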
