
Data Engineer - Talend & Snowflake

Experience: 6 years

Salary: 0 Lacs

Posted: 4 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Experience: 6+ years

Location:

Description:

We are seeking a Data Engineer with expertise in Talend pipeline development for Snowflake Type 2 modeling and proficiency in IBM Data Replicator and Qlik Replicate. The role involves analyzing current on-premises data sources across containerized DB2, DB2, Oracle, and Hadoop, and designing scalable data pipelines aligned with the company's data management framework. The candidate will handle data replication processes, including Change Data Capture (CDC), for both historical and incremental updates.
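
For a concrete sense of what Type 2 Slowly Changing Dimension maintenance in Snowflake involves, below is a minimal sketch of an expire-then-insert pattern driven from Python with the Snowflake connector. The table names (dim_customer, stg_customer), columns, and connection details are illustrative assumptions; in this role the equivalent logic would normally be generated by Talend jobs rather than hand-written.

```python
# Minimal sketch of a Type 2 SCD update in Snowflake, driven from Python.
# Table, column, and connection names are illustrative assumptions; in this
# role the equivalent logic would normally be produced by Talend jobs.
import snowflake.connector

# Step 1: close out current dimension rows whose tracked attributes changed.
EXPIRE_CHANGED_ROWS = """
UPDATE dim_customer
SET is_current = FALSE,
    valid_to   = CURRENT_TIMESTAMP()
FROM stg_customer s
WHERE dim_customer.customer_id = s.customer_id
  AND dim_customer.is_current = TRUE
  AND (dim_customer.name <> s.name OR dim_customer.segment <> s.segment)
"""

# Step 2: insert a new current version for changed keys and brand-new keys.
INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, name, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL
"""

def apply_scd2(conn) -> None:
    """Run the expire-then-insert steps inside a single transaction."""
    cur = conn.cursor()
    try:
        cur.execute("BEGIN")
        cur.execute(EXPIRE_CHANGED_ROWS)
        cur.execute(INSERT_NEW_VERSIONS)
        cur.execute("COMMIT")
    finally:
        cur.close()

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account",   # placeholder connection details
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="DWH",
    )
    try:
        apply_scd2(conn)
    finally:
        conn.close()
```

In Talend the same compare-expire-insert flow is typically assembled from components such as tMap feeding a Snowflake output, but the warehouse ultimately executes equivalent SQL.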

Key Responsibilities

  • Design and build Talend pipelines to implement Type 2 Slowly Changing Dimensions (SCD) models in Snowflake.
  • Analyze and assess existing on-premises data sources (DB2, Oracle, Hadoop) for migration and integration.
  • Develop and optimize data replication strategies using IBM Data Replicator and Qlik Replicate.
  • Implement one-time data migration processes for history and archives and configure pipelines for CDC-based updates.
  • Collaborate with data architects and business teams to define and enforce data modeling standards in Snowflake.
  • Perform data profiling, validation, and reconciliation to ensure data integrity and consistency during migrations.
  • Monitor and troubleshoot data pipelines, ensuring scalability, reliability, and performance.
  • Document pipeline designs, workflows, and data mappings for compliance and audit.

Required Skills

  • Proficiency in Talend ETL development and integration with Snowflake.
  • Hands-on experience with IBM Data Replicator and Qlik Replicate.
  • Strong knowledge of Snowflake database architecture and Type 2 SCD modeling.
  • Expertise in containerized DB2, DB2, Oracle, and Hadoop data sources.
  • Understanding of Change Data Capture (CDC) processes and real-time data replication patterns (a simplified sketch follows this list).
  • Experience with SQL, Python, or Shell scripting for data transformations and automation.
  • Familiarity with cloud platforms (AWS, Azure) and DevOps practices for pipeline automation.
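
As a rough illustration of the CDC-based update pattern referenced above, the sketch below applies changes that a replication tool (such as Qlik Replicate or IBM Data Replicator) has already landed in a Snowflake change table, merging only rows newer than a stored watermark. The landing-table layout, operation codes ('I'/'U'/'D'), and column names are assumptions for illustration, not any specific tool's output format.

```python
# Hedged sketch: applying CDC changes landed in Snowflake by a replication
# tool into a target table. Table layout, operation codes, and the watermark
# column are illustrative assumptions; the connection is obtained as in the
# earlier Snowflake connector sketch.
APPLY_CDC_CHANGES = """
MERGE INTO orders t
USING (
    -- keep only the latest change per key since the last applied watermark
    SELECT *
    FROM cdc_orders
    WHERE change_ts > %(last_applied_ts)s
    QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY change_ts DESC) = 1
) c
ON t.order_id = c.order_id
WHEN MATCHED AND c.operation = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET t.status = c.status, t.amount = c.amount, t.updated_at = c.change_ts
WHEN NOT MATCHED AND c.operation <> 'D' THEN
    INSERT (order_id, status, amount, updated_at)
    VALUES (c.order_id, c.status, c.amount, c.change_ts)
"""

def apply_incremental(conn, last_applied_ts: str) -> int:
    """Merge pending CDC rows newer than the stored watermark; return rows affected."""
    cur = conn.cursor()
    try:
        cur.execute(APPLY_CDC_CHANGES, {"last_applied_ts": last_applied_ts})
        return cur.rowcount
    finally:
        cur.close()
```

Deletes are handled as explicit change rows because CDC streams carry them that way rather than by a key simply disappearing; the watermark would normally be persisted in a small control table between runs.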

Preferred Skills

  • Experience in data governance frameworks and metadata management.
  • Working knowledge of version control tools (e.g., Git) and CI/CD pipelines.
  • Exposure to Kafka or other streaming platforms for data ingestion.
  • Strong troubleshooting and performance optimization capabilities.
(ref:hirist.tech)
