Posted: 2 days ago | Platform: LinkedIn

Work Mode: On-site
Job Type: Full Time

Job Description

About The Role

We are looking for an experienced ETL Data Engineer with strong expertise in Informatica, Snowflake, Databricks, and Python/Shell scripting. The ideal candidate must have hands-on experience in the Banking/BFSI domain, with a deep understanding of financial data, regulatory reporting, and core banking systems. This role involves building scalable ETL pipelines, managing data flows, and ensuring high data quality across enterprise platforms.

Key Responsibilities

  • Design, develop, and maintain ETL pipelines using Informatica PowerCenter / IICS.
  • Work with large datasets to build end-to-end data workflows for ingestion, transformation, and loading.
  • Implement data integration solutions between Snowflake, Databricks, and on-prem/cloud environments.
  • Optimize ETL performance, troubleshoot issues, and ensure data quality and governance compliance.
  • Write efficient Python or Shell scripts for automation, scheduling, and data processing tasks (an illustrative sketch follows this list).
  • Collaborate with data architects, business analysts, and functional SMEs to understand banking/BFSI data requirements.
  • Ensure compliance with regulatory and audit requirements related to financial data handling.
  • Support production data pipelines, including incident management and root-cause analysis.
  • Participate in code reviews, version control, and CI/CD processes.
  • Document ETL workflows, mappings, transformations, and design specifications.
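
As a quick illustration of the Python scripting and Snowflake loading work described above, here is a minimal sketch. It assumes the snowflake-connector-python package, and the account details, staging table, and column names are placeholders rather than anything specified in this posting.

```python
# Illustrative only: a minimal Python load step of the kind described in the
# responsibilities above. Assumes the snowflake-connector-python package;
# the account details, staging table, and column names are placeholders.
import csv

import snowflake.connector


def load_transactions(csv_path: str) -> None:
    # Connect to Snowflake (all connection values below are hypothetical).
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="BANKING_DB",
        schema="STAGING",
    )
    cur = conn.cursor()
    try:
        # Read the source file and bind rows into a parameterized bulk insert
        # against a hypothetical staging table.
        with open(csv_path, newline="") as f:
            rows = [
                (r["txn_id"], r["amount"], r["txn_date"])
                for r in csv.DictReader(f)
            ]
        cur.executemany(
            "INSERT INTO STG_TRANSACTIONS (TXN_ID, AMOUNT, TXN_DATE) "
            "VALUES (%s, %s, %s)",
            rows,
        )
    finally:
        cur.close()
        conn.close()
```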

Mandatory Skills

  • Strong hands-on experience in ETL development
  • Expertise in Informatica (PowerCenter / IICS)
  • Experience with Snowflake Data Warehouse
  • Experience with Databricks (PySpark/Notebooks/Workflows); an illustrative PySpark sketch follows this list
  • Programming skills in Python or Shell scripting
  • Banking / BFSI domain experience is mandatory, ideally in areas such as:
    • Payments
    • Risk & Compliance
    • Regulatory reporting
    • Customer onboarding
    • Core banking & financial operations
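
As a small illustration of the Databricks/PySpark skills listed above, the sketch below shows a typical notebook-style transformation. The paths, tables, and column names are hypothetical, and the Delta output format is assumed to be available in the target workspace.

```python
# Illustrative only: a notebook-style PySpark transformation of the kind a
# Databricks workflow in this role might run. Paths, tables, and column names
# are hypothetical; the Delta output format is assumed to be available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bfsi-etl-example").getOrCreate()

# Read raw payment records, standardize types, and aggregate per customer/day.
payments = (
    spark.read.option("header", True).csv("/mnt/raw/payments/")
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
)

daily_totals = (
    payments.groupBy("customer_id", "txn_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

# Write the curated output for downstream reporting.
daily_totals.write.format("delta").mode("overwrite").save(
    "/mnt/curated/daily_payment_totals/"
)
```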

Good-to-Have Skills

  • Experience with SQL, performance tuning, and stored procedures
  • Knowledge of cloud platforms (AWS / Azure / GCP)
  • Experience with CI/CD tools (Git, Jenkins, Azure DevOps)
  • Understanding of data governance, metadata management, and lineage
  • Exposure to Big Data technologies (Spark, Hadoop)

Educational Qualification

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.

Skills: Python, BFSI, Compliance, ETL, Snowflake, Azure, Data
