Data Engineer

5 - 10 years

7 - 11 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

We are seeking a data engineer to design, implement, and optimize cloud-based data pipelines using Microsoft Azure services, including Azure Data Factory (ADF), Azure Synapse Analytics, and Azure Data Lake Storage (ADLS).

Job Role Responsibilities

  • Develop and maintain ETL/ELT pipelines using Azure Data Factory to ingest, transform, and load data from diverse sources (databases, APIs, flat files).
  • Design and manage data storage solutions using Azure Blob Storage and ADLS Gen2, ensuring proper partitioning, compression, and lifecycle policies for performance and cost efficiency.
  • Build and optimize data models and analytical queries in Azure Synapse Analytics, collaborating with data architects to support reporting and BI needs.
  • Ensure data quality, consistency, and reliability through validation, reconciliation, auditing, and monitoring frameworks.
  • Collaborate with data architects, BI developers, and business teams to define architecture, integration patterns, and performance tuning strategies.
  • Implement data security best practices, including encryption, access management, and role-based access control (RBAC).
  • Create and maintain documentation of data workflows, pipelines, and architecture to support knowledge transfer, compliance, and audits.
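The storage responsibilities above call for well-partitioned lake folders. As an illustration only (the names `container`, `dataset`, and the `raw/sales` example are hypothetical, not part of this role), a minimal Python sketch of the Hive-style `year=/month=/day=` layout commonly used for date-partitioned data in ADLS Gen2:

```python
from datetime import date

def partition_path(container: str, dataset: str, d: date) -> str:
    """Build a Hive-style partition path (year=/month=/day=) of the kind
    commonly used for date-partitioned folders in a data lake."""
    return (
        f"{container}/{dataset}/"
        f"year={d.year:04d}/month={d.month:02d}/day={d.day:02d}"
    )

# Example: where a daily extract for 2024-03-07 would land.
print(partition_path("raw", "sales", date(2024, 3, 7)))
# raw/sales/year=2024/month=03/day=07
```

Zero-padding the month and day keeps partition folders in lexical order, which helps both pruning and lifecycle policies.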

Skills Required

  • 5+ years of hands-on experience in data engineering with a strong focus on Azure Data Factory, Azure Synapse Analytics, and ADLS Gen2.
  • Strong expertise in SQL, performance tuning, and query optimization for large-scale datasets.
  • Experience designing and managing data pipelines for structured and semi-structured data (CSV, JSON, Parquet, etc.).
  • Proficiency in data modeling (star schema, snowflake, normalized models) for analytics and BI use cases.
  • Practical knowledge of data validation, reconciliation frameworks, and monitoring pipelines to ensure data reliability.
  • Solid understanding of data security best practices (encryption, RBAC, compliance standards like GDPR).
  • Strong collaboration skills, with the ability to work closely with architects, BI teams, and business stakeholders.
  • Excellent skills in documentation and process standardization.
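To illustrate the star-schema modeling and SQL aggregation skills listed above, a self-contained sketch using Python's built-in `sqlite3` as a stand-in for a warehouse such as Synapse (table and column names are invented for the example):

```python
import sqlite3

# In-memory database standing in for an analytical warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One fact table plus a dimension table: the core of a star schema.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY,
                          product_id INTEGER REFERENCES dim_product(product_id),
                          amount REAL);
INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
INSERT INTO fact_sales  VALUES (10, 1, 25.0), (11, 1, 15.0), (12, 2, 40.0);
""")

# Typical BI query: aggregate the fact table grouped by a dimension attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Books', 40.0), ('Games', 40.0)]
conn.close()
```

Keeping facts narrow (keys and measures) and pushing descriptive attributes into dimensions is what makes such aggregations cheap at warehouse scale.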

Good-to-Have Skills

  • Experience with Python/Scala scripting for automation of ETL and data quality checks.
  • Exposure to Power BI or other BI tools (Tableau, Qlik) for understanding downstream analytics requirements.
  • Familiarity with CI/CD pipelines for data projects using Azure DevOps or Git-based workflows.
  • Knowledge of big data frameworks (Databricks, Spark) for large-scale transformations.
  • Hands-on experience with metadata management, data lineage tools, or governance frameworks.
  • Exposure to cloud cost optimization practices in Azure environments.
  • Understanding of API-based ingestion and event-driven architectures (Kafka, Azure Event Hubs).
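The Python scripting and data-quality items above can be sketched as a minimal post-load reconciliation check (the function name, counts, and tolerance are illustrative assumptions, not a prescribed framework):

```python
def reconcile_counts(source_count: int, target_count: int,
                     tolerance: float = 0.0) -> bool:
    """Return True when the loaded row count matches the source row count
    within an allowed relative tolerance -- a minimal audit check of the
    kind often automated in Python around ETL pipelines."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance

# Exact match required by default; a small relative drift can be allowed.
print(reconcile_counts(1_000_000, 1_000_000))        # True
print(reconcile_counts(1_000_000, 999_000))          # False
print(reconcile_counts(1_000_000, 999_000, 0.002))   # True
```

In practice such checks run as a final pipeline activity, failing the run (or raising an alert) when counts drift beyond the agreed tolerance.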

Trantor

Data Engineering and Analytics

Las Vegas
