Data Engineer

Experience: 3 - 7 years

Salary: 15 Lacs

Location: Chandigarh, Delhi / NCR

Posted: 9 hours ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description


Responsibilities

Develop and maintain ETL/ELT pipelines using Azure Data Factory to ingest, transform, and load data from diverse sources (databases, APIs, flat files).

Design and manage data storage solutions using Azure Blob Storage and ADLS Gen2, ensuring proper partitioning, compression, and lifecycle policies for performance and cost efficiency.

Build and optimize data models and analytical queries in Azure Synapse Analytics, collaborating with data architects to support reporting and BI needs.

Ensure data quality, consistency, and reliability through validation, reconciliation, auditing, and monitoring frameworks (a minimal sketch of such a check follows this list).

Collaborate with data architects, BI developers, and business teams to define architecture, integration patterns, and performance tuning strategies.

Implement data security best practices, including encryption and role-based access control (RBAC).

Create and maintain documentation of data workflows, pipelines, and architecture to support knowledge transfer, compliance, and audits.
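To make the data-quality responsibility above concrete, here is a minimal, hedged sketch in Python using pandas. The file names, column names, and tolerance are hypothetical; in an Azure setup these checks would typically run from a Data Factory activity or a downstream notebook, which is assumed rather than shown.

```python
# Minimal reconciliation sketch (assumptions: CSV extracts, hypothetical
# file and column names). Compares a source extract with the loaded target
# and reports count, key, and total drift.
import sys
import pandas as pd

def reconcile(source_path: str, target_path: str, key: str, amount_col: str) -> list[str]:
    source = pd.read_csv(source_path)
    target = pd.read_csv(target_path)
    issues = []

    # Row-count check: every source row should land in the target.
    if len(source) != len(target):
        issues.append(f"row count mismatch: source={len(source)} target={len(target)}")

    # Key completeness: no nulls or duplicates in the business key.
    if target[key].isna().any():
        issues.append(f"null keys found in target column '{key}'")
    if target[key].duplicated().any():
        issues.append(f"duplicate keys found in target column '{key}'")

    # Reconciliation: totals should match within a small tolerance.
    src_total, tgt_total = source[amount_col].sum(), target[amount_col].sum()
    if abs(src_total - tgt_total) > 0.01:
        issues.append(f"amount total mismatch: source={src_total} target={tgt_total}")

    return issues

if __name__ == "__main__":
    problems = reconcile("source_extract.csv", "target_load.csv",
                         key="order_id", amount_col="order_amount")
    for p in problems:
        print("CHECK FAILED:", p)
    # Non-zero exit lets a scheduler or monitoring framework flag the failure.
    sys.exit(1 if problems else 0)
```

The non-zero exit code is the hook a monitoring framework would alert on; the same pattern extends to column-level null thresholds or checksum comparisons.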

Essential Skills

5+ years of hands-on experience in data engineering with a strong focus on Azure Data Factory, Azure Synapse Analytics, and ADLS Gen2.

Strong expertise in SQL, performance tuning, and query optimization for large-scale datasets.

Experience designing and managing data pipelines for structured and semi-structured data (CSV, JSON, Parquet, etc.).

Proficiency in data modeling (star schema, snowflake schema, normalized models) for analytics and BI use cases.

Practical knowledge of data validation, reconciliation frameworks, and monitoring pipelines to ensure data reliability.

Solid understanding of data security best practices (encryption, RBAC, and compliance standards like GDPR).

Strong collaboration skills, with the ability to work closely with architects, BI teams, and business stakeholders.

Excellent skills in documentation and process standardization.

Good-to-Have Skills

Experience with Python/Scala scripting for automation of ETL and data quality checks.

Exposure to Power BI or other BI tools (Tableau, Qlik) for understanding downstream analytics requirements.

Familiarity with CI/CD pipelines for data projects using Azure DevOps or Git-based workflows.

Knowledge of big data frameworks (Databricks, Spark) for large-scale transformations.

Hands-on experience with metadata management, data lineage tools, or governance frameworks.

Exposure to cloud cost optimization practices in Azure environments.

Understanding of API-based ingestion and event-driven architectures (Kafka, Event Hub).
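To illustrate the API-based ingestion and partitioning points above, here is a minimal, hedged sketch in Python. The endpoint URL, field names, and local staging path are assumptions; landing the files in ADLS Gen2 (for example via the Azure Storage SDK) and orchestrating the run from Azure Data Factory are left out.

```python
# Minimal API-ingestion sketch (assumptions: hypothetical endpoint and fields).
# Pulls JSON records and writes them as date-partitioned Parquet, the layout
# a downstream ADLS Gen2 / Synapse setup would typically read.
from datetime import date, datetime, timezone
from pathlib import Path

import pandas as pd  # writing Parquet also requires pyarrow (or fastparquet)
import requests

API_URL = "https://example.com/api/orders"   # hypothetical endpoint
STAGING_ROOT = Path("staging/orders")        # stand-in for an ADLS container path

def ingest(run_date: date) -> Path:
    # Fetch one day's worth of records; paging and retries omitted for brevity.
    response = requests.get(API_URL, params={"date": run_date.isoformat()}, timeout=30)
    response.raise_for_status()
    records = response.json()

    df = pd.DataFrame.from_records(records)
    df["ingested_at"] = datetime.now(timezone.utc)

    # Hive-style date partition keeps partition pruning cheap for Synapse/Spark readers.
    out_dir = STAGING_ROOT / f"ingest_date={run_date.isoformat()}"
    out_dir.mkdir(parents=True, exist_ok=True)
    out_file = out_dir / "part-0000.parquet"
    df.to_parquet(out_file, index=False)  # Parquet gives columnar storage and compression
    return out_file

if __name__ == "__main__":
    print("wrote", ingest(date.today()))
```

In a production pipeline the same partition layout would typically be mirrored in ADLS Gen2 and exposed to Synapse as an external table.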

E Solutions

Information Technology & Services

San Francisco
