Data Engineer

8 - 12 years

0 - 2 Lacs

Posted: 4 days ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: Data Engineer

Experience: 8 - 12 years

Key Responsibilities:

  • Design, develop, and maintain robust data pipelines, architectures, and datasets to support analytics and business needs.
  • Perform debugging, performance tuning, and optimization of data processes and databases.
  • Support production databases, ensuring data reliability, scalability, and performance.
  • Conduct root cause analysis on internal and external data to identify opportunities for process improvement.
  • Collaborate with cross-functional teams to deliver high-quality data solutions and ensure adherence to coding standards and PR processes.
  • Provide technical guidance and mentorship to junior engineers.
  • Implement data quality, validation, and governance frameworks, ensuring data integrity, lineage, and compliance.
  • Enable AI/ML use cases by preparing datasets, supporting feature engineering, and maintaining model-serving data pipelines.
  • Design and maintain modern data warehouses, ensuring efficiency in storage, transformation, and query optimization.

Required Skills and Experience:

  • 8+ years of experience in SQL, ODS, Airflow, and Data Integration Services, including debugging and performance tuning.
  • Hands-on experience with AWS RDS Aurora (PostgreSQL).
  • 6+ years of experience with Big Data tools such as Hadoop, Hive, Spark, Kafka, HBase, Flume, Pig, Kudu, etc.
  • 6+ years of experience in programming languages such as Python, Java, and Scala, with fluency in writing shell scripts (Bash, Korn).
  • Strong expertise in building complex analytical queries on relational databases.
  • Experience with relational SQL and NoSQL databases.
  • Basic understanding of AWS cloud services such as EC2, EMR, and RDS.
  • Experience with stream-processing systems such as Kafka, Kinesis, Storm, Spark Streaming, or Pub/Sub for real-time data ingestion.
  • Experience in API integration: building and consuming APIs for data exchange between systems.
  • Strong understanding of data quality, validation, lineage, and compliance frameworks.
  • Experience preparing datasets for AI/ML pipelines, including feature engineering and model-serving requirements.
  • Familiarity with columnar storage formats and query optimization techniques for efficient data warehousing.
  • Proficiency in data manipulation, processing, and extracting metrics from large semi-structured and unstructured datasets.
  • Strong analytical, problem-solving, and organizational skills.
  • Hands-on experience with Snowflake is mandatory.

Preferred Qualifications:

  • Exposure to data integration and migration projects.
  • Familiarity with CI/CD pipelines for data workflows.
  • Strong communication, documentation, and collaboration skills.


Altimetrik

Software Development

Southfield MI
