Data Engineer with Fabric - 5+ Years - Hybrid - Kochi, Chennai, Coimbatore, Pune

Experience: 5 - 10 years

Salary: 5 - 15 Lacs

Posted: 3 weeks ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Notice Period: Immediate joiner or a maximum of 10 days (please do not share profiles with longer notice periods).

Payroll: Anlage Infotech (permanent)

Client: Orion Innovation

Location: Kochi, Chennai, Coimbatore, Pune - Hybrid (2-3 days a week)

Role & responsibilities

Work Location: Kochi, Chennai, Coimbatore, Pune (Hybrid)

Mandatory skills: PySpark + Azure Databricks + Azure Data Factory + MS Fabric

Experience: 5+ years

We are seeking a skilled and experienced Senior Data Engineer with a strong background in Azure Data Services, particularly Azure Databricks, Azure Data Factory, PySpark, and MS Fabric. The ideal candidate will be responsible for designing, developing, and optimizing scalable data pipelines and solutions to support advanced analytics, reporting, and business intelligence initiatives.

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines using Azure Databricks, ADF, and PySpark (a brief illustrative sketch follows this list).
  • Implement ETL/ELT workflows to process structured and unstructured data from diverse sources.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver clean, reliable datasets.

  • Optimize performance of data solutions, ensuring scalability and reliability in a cloud environment.
  • Implement data quality checks, data validation, and transformation logic.
  • Develop and manage CI/CD pipelines for data workflows using Azure DevOps or similar tools.
  • Monitor and troubleshoot data pipelines to ensure timely data availability.
  • Work with data governance, security, and compliance teams to enforce data policies and standards.
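
For context on the pipeline, data quality, and Delta Lake responsibilities above, here is a minimal PySpark sketch of the kind of work involved: it reads raw data from ADLS Gen2, applies simple validation and transformation logic, and writes a curated Delta table. All paths, column names, and table names below are hypothetical placeholders, not part of any actual project codebase.

    # Minimal illustrative PySpark sketch of the pipeline work described above:
    # read raw data from ADLS Gen2, apply validation/transformation logic, and
    # write a curated Delta table. Paths, columns, and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-curation-sketch").getOrCreate()

    # Hypothetical raw-zone location in ADLS Gen2 (the abfss URI is a placeholder).
    raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/orders/"
    orders = spark.read.format("json").load(raw_path)

    # Simple data quality checks: drop rows missing key fields, flag bad amounts.
    curated = (
        orders
        .dropna(subset=["order_id", "order_ts"])
        .withColumn("order_date", F.to_date("order_ts"))
        .withColumn("is_valid_amount", F.col("amount") >= 0)
    )

    # Persist as a Delta table partitioned by date for downstream analytics/BI.
    (
        curated.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("curated.orders")  # hypothetical schema.table name
    )

On Azure Databricks the Delta format and workspace catalog are available out of the box; in a plain Spark environment the delta-spark package and catalog configuration would be needed before a sketch like this runs.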

Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
  • 5+ years of hands-on experience in data engineering roles.
  • Strong expertise in:
      • Azure Databricks
      • Azure Data Factory
      • PySpark
  • Experience working with Azure Data Lake Storage (ADLS) and Delta Lake.
  • Strong SQL skills for data extraction and transformation.
  • Experience with version control tools like Git and CI/CD practices.
  • Familiarity with orchestration, monitoring, and alerting tools in Azure.
  • Understanding of data modelling and warehousing concepts.

Preferred Qualifications:

  • Experience with other Azure services such as Azure Synapse Analytics, Azure Functions, or Event Hubs.
  • Knowledge of Scala or Spark SQL.
  • Experience with Airflow or similar workflow orchestration tools.
  • Background in Big Data ecosystems and distributed computing.
  • Strong communication and stakeholder management skills.
