Data Engineer

Experience: 4 - 9 years

Salary: 5 - 15 Lacs

Posted: 2 days ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Summary

Data Engineer

Key Responsibilities

  • Design, develop, and maintain ETL pipelines for structured and unstructured data (an illustrative sketch follows this list).
  • Work extensively with Databricks, PySpark, Python, and SQL for data integration, transformation, and processing.
  • Build and optimize data models for reporting, analytics, and advanced use cases.
  • Develop and manage data warehouse solutions leveraging PostgreSQL and AWS services (e.g., S3, Redshift, Glue, Lambda).
  • Perform data exploration, profiling, and quality checks to identify gaps and anomalies.
  • Collaborate with business and technical teams to gather, analyze, and document data requirements.
  • Implement unit testing (UT cases) and validation frameworks to ensure data accuracy and reliability.
  • Maintain technical documentation, including data flow diagrams, design specifications, and metadata.
  • Support data governance and ensure compliance with organizational and security standards.
  • Track and manage tasks, issues, and progress using JIRA or similar tools.
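
Much of the work above centres on PySpark-based ETL on Databricks with AWS storage. As a rough, non-authoritative sketch (the bucket, column, and table names below are invented for illustration, not taken from this posting), a simple extract-transform-load step might look like this:

    # Illustrative sketch only: paths, columns, and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw JSON events from an S3 landing zone (hypothetical bucket).
    raw = spark.read.json("s3://example-landing-zone/orders/*.json")

    # Transform: deduplicate, enforce types, and apply a basic quality filter.
    orders = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
           .filter(F.col("order_id").isNotNull())
    )

    # Load: write a partitioned Delta table (Delta is built into Databricks runtimes).
    (orders.write
           .format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .saveAsTable("analytics.fact_orders"))

Comparable pipelines can also land data in Redshift or PostgreSQL via JDBC writes or AWS Glue jobs, depending on the target warehouse.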

Required Skills & Qualifications

  • Proven experience in ETL development and data engineering projects.
  • Strong programming skills in Python and PySpark.
  • Proficiency in SQL (including performance tuning and complex queries).
  • Hands-on experience with Databricks and the AWS Cloud ecosystem.
  • Solid understanding of data warehousing concepts, data modelling techniques, and database design.
  • Experience with PostgreSQL or similar relational databases.
  • Strong knowledge of data exploration, profiling, and debugging large datasets.
  • Familiarity with unit testing and test automation for data pipelines (see the sketch after this list).
  • Excellent skills in preparing technical documentation.
  • Experience with Agile methodologies and tools like JIRA.
  • Strong communication and collaboration skills for cross-functional teamwork.
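
For the unit-testing and test-automation point, one common approach (assumed here, not prescribed by the posting) is pytest with a local SparkSession; the transformation, fixture, and sample data below are hypothetical:

    # Hypothetical example: the function and data are invented for illustration.
    import pytest
    from pyspark.sql import SparkSession, functions as F

    def clean_orders(df):
        """Transformation under test: drop duplicate and null order IDs."""
        return df.dropDuplicates(["order_id"]).filter(F.col("order_id").isNotNull())

    @pytest.fixture(scope="module")
    def spark():
        # Small local session so the test runs without a cluster.
        return SparkSession.builder.master("local[1]").appName("ut").getOrCreate()

    def test_clean_orders_removes_duplicates_and_nulls(spark):
        df = spark.createDataFrame(
            [("A1", 10.0), ("A1", 10.0), (None, 5.0)], ["order_id", "amount"]
        )
        result = clean_orders(df)
        assert result.count() == 1
        assert result.first()["order_id"] == "A1"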

Preferred Qualifications

  • Experience with big data ecosystems (Spark, Hive, Kafka).
  • Exposure to CI/CD pipelines for data workflows.
  • Knowledge of data governance, security, and compliance practices.
  • Familiarity with BI/Visualization tools (e.g., Power BI, Tableau).

Net Connect | Software Development | Schinnen, Amsterdam
