Data Engineer

Experience: 4 - 9 years

Salary: 5 - 8 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

We are seeking a highly skilled and motivated Data Engineer with 4+ years of experience in designing and implementing scalable data solutions across the Banking and Insurance domains. The ideal candidate will have hands-on expertise in building ETL pipelines, data lakes, and data warehouses using modern cloud platforms and orchestration tools. This role demands strong problem-solving skills, a passion for data architecture, and the ability to collaborate across teams to deliver high-impact data solutions.

 

Key Responsibilities:

  • Design and develop scalable, metadata-driven ETL pipelines using tools like Azure Data Factory, Apache Airflow, and Databricks.
  • Architect and implement data lakes, data marts, and feature stores across AWS, Azure, and GCP environments.
  • Optimize data pipelines for performance, cost-efficiency, and security using PySpark, SQL, and Python.
  • Implement Change Data Capture (CDC) and Slowly Changing Dimensions (SCD Types 1 and 2) for batch and near real-time ingestion.
  • Collaborate with cross-functional teams to gather requirements, resolve dependencies, and deliver end-to-end data solutions.
  • Automate reporting workflows and regulatory reporting layers to improve data accessibility and reduce manual effort.
  • Lead migration projects from legacy systems to cloud platforms, ensuring data integrity and lineage.
  • Monitor and troubleshoot production pipelines, ensuring high availability and reliability of data systems.
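For candidates unfamiliar with the term, the SCD Type 2 pattern mentioned above keeps a full history of dimension changes by expiring the old row and appending a new current version. A simplified sketch in plain Python is shown below (illustrative only; in production this would typically be a PySpark job or a Databricks `MERGE INTO` statement, and all names here are hypothetical):

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key, tracked, today):
    """Apply an SCD Type 2 merge: expire changed current rows and
    append a new current version for each changed or new record."""
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for rec in incoming:
        cur = current.get(rec[key])
        # A row is written only if the key is new or a tracked column changed.
        if cur is None or any(cur[c] != rec[c] for c in tracked):
            if cur is not None:
                cur["valid_to"] = today      # close out the old version
                cur["is_current"] = False
            new_row = dict(rec)
            new_row.update(valid_from=today, valid_to=None, is_current=True)
            out.append(new_row)
    return out

# Hypothetical demo: one existing customer whose city changes, plus a new customer.
dim = [{"cust_id": 1, "city": "Mumbai",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
feed = [{"cust_id": 1, "city": "Pune"},
        {"cust_id": 2, "city": "Delhi"}]
result = scd2_merge(dim, feed, key="cust_id", tracked=["city"],
                    today=date(2024, 1, 1))
# The old Mumbai row is expired; Pune and Delhi rows are now current.
```

SCD Type 1, by contrast, would simply overwrite the city in place with no history kept.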

Technical Skills Required:

  • Languages: Python, SQL, PySpark, Spark
  • Cloud Platforms:
    • AWS: S3, EMR, Redshift, Glue, DMS, Lambda
    • Azure: ADLS, ADF, Databricks
    • GCP: BigQuery, Cloud Functions
  • Data Warehouses: Amazon Redshift, Google BigQuery, Snowflake
  • Orchestration Tools: Apache Airflow, Azure Data Factory
  • Development Tools: Git (GitHub, Bitbucket), Jenkins, GitHub Actions, Azure DevOps
  • Data Formats: Parquet, Avro, JSON, CSV, Hudi

Preferred Qualifications:

  • MCA or equivalent degree in Computer Science or related field.
  • AWS Certified Developer Associate or equivalent cloud certification.
  • Experience in regulatory reporting, banking data marts, and insurance domain analytics.
  • Strong communication skills with the ability to present insights to both technical and non-technical stakeholders.

Exponentia Team

Business Consulting and Services

Mumbai Maharashtra
