Data Engineer [ETL] [Informatica] [Singapore]

Experience: 6 years

Salary: 0 Lacs

Posted: 1 week ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

POSITION OVERVIEW: Industry Consulting Consultant

Position General Duties And Tasks

Skills: AWS, Informatica, ETL, Databricks, Oracle SQL, Python, Tableau

Role And Responsibilities

  • Manage the end-to-end migration process from Informatica PowerCenter (CDI-PC) to Informatica IDMC, ensuring minimal disruption to business operations.
  • Apply hands-on experience with Informatica IDMC to create mappings and workflows and to set up Secure Agents.
  • Integrate data from various sources, both internal and external, into AWS and Databricks environments, ensuring data consistency and quality, while leveraging Informatica IDMC for data integration, transformation, and governance.
  • Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and enrich data, making it suitable for analytical purposes using Databricks' Spark capabilities and Informatica IDMC for data transformation and quality (a brief Spark sketch follows this list).
  • Monitor and optimize data processing and query performance in both AWS and Databricks environments, making necessary adjustments to meet performance and scalability requirements. Utilize Informatica IDMC for optimizing data workflows.
  • Implement security best practices and data encryption methods to protect sensitive data in both AWS and Databricks, while ensuring compliance with data privacy regulations. Employ Informatica IDMC for data governance and compliance.
  • Maintain clear and comprehensive documentation of data infrastructure, pipelines, and configurations in both AWS and Databricks environments, with metadata management facilitated by Informatica IDMC.
  • Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver appropriate solutions across AWS, Databricks, and Informatica IDMC.
  • Identify and resolve data-related issues and provide support to ensure data availability and integrity across AWS, Databricks, and Informatica IDMC environments.
  • Optimize AWS, Databricks, and Informatica resource usage to control costs while meeting performance and scalability requirements.
  • Stay up to date with AWS, Databricks, and Informatica IDMC services and with data engineering best practices to recommend and implement new technologies and techniques.
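
For illustration, a minimal PySpark sketch of the kind of Databricks ETL step described above; the table and column names (raw.orders, order_amount, curated.orders) are hypothetical placeholders:

    # Minimal PySpark ETL sketch: extract a raw table, cleanse and enrich it,
    # then load a curated table. All table/column names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

    raw = spark.read.table("raw.orders")                          # extract
    cleansed = (
        raw.dropDuplicates(["order_id"])                          # remove duplicate records
           .filter(F.col("order_amount").isNotNull())             # drop rows missing an amount
    )
    enriched = cleansed.withColumn(
        "order_year", F.year(F.col("order_date"))                 # derive a reporting column
    )
    enriched.write.mode("overwrite").saveAsTable("curated.orders")  # load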

Requirements

  • Degree in Computer Science, Information Technology, Computer Engineering or equivalent.
  • Minimum 6 years of experience in data engineering, with expertise in AWS or Azure services, Databricks, and/or Informatica IDMC.
  • Understanding of and hands-on experience in all phases of the project lifecycle, with implementation involvement in at least 4 project cycles.
  • Experience in AWS Services focusing on ETL and data processing using ETL software: Informatica PowerCenter, IDMC, Informatica Cloud Data Integration for PowerCenter (CDI-PC), Informatica Data Engineering Integration (DEI).
  • Strong understanding of data integration concepts, ETL processes, and data quality management.
  • Accomplished in designing and implementing ETL data mappings, transformations, and workflows.
  • Ability to evaluate potential technical solutions and make recommendations to resolve data issues, especially performance assessment for complex data transformations and long-running data processes.
  • Strong in Business Intelligence data model design and proficient in BI software such as Oracle Analytics Server (OAS) and Tableau.
  • Strong knowledge of Oracle SQL and NoSQL databases, with key SQL skill sets on databases such as Oracle, Teradata, and MS SQL Server.
  • Working knowledge of BI standard languages such as Python, C#, Java, and VBA.
  • AWS Associate, AWS Professional, or AWS Specialty certification is preferred.
  • Databricks, IDMC, and Tableau certifications are a plus.

Preferred Skills

  • Knowledge of data governance and data cataloguing tools, especially Informatica IDMC.
  • Understanding of DevOps principles for managing and deploying data pipelines.
  • Experience with version control systems (e.g., Git) and CI/CD pipelines.
