Experience: 5 years
Posted: 5 hours ago
Work mode: On-site
Employment type: Full Time
Position Overview
We are looking for an experienced ETL Migration Specialist with strong expertise in Informatica PowerCenter and Databricks (PySpark). The ideal candidate will be responsible for analyzing and reverse engineering existing Informatica mappings and workflows, understanding the business logic they implement, and refactoring those pipelines into scalable, efficient data processing workflows on Databricks.
This role requires strong analytical skills, hands-on experience in ETL development, and the ability to collaborate with both business and technical teams to ensure a seamless migration.
________________________________________
Key Responsibilities
• Reverse Engineering: Analyze existing Informatica mappings, workflows, sessions, and variables to understand data flow, transformations, joins, filters, lookups, and aggregations.
• ETL Refactoring: Translate Informatica logic into PySpark/Databricks workflows, ensuring accuracy and performance.
• Data Modeling & Transformation: Design and implement transformation logic in Databricks aligned with business rules.
• Performance Optimization: Refactor ETL logic for scalability and performance tuning in Databricks.
• Documentation: Create detailed documentation for the converted pipelines, capturing both technical logic and business rules.
• Testing & Validation: Collaborate with QA/Business teams to validate data correctness between Informatica outputs and Databricks outputs.
• Best Practices: Ensure adherence to Databricks and cloud data engineering best practices, including modular coding, error handling, and reusability.
• Collaboration: Work with stakeholders, data architects, and business SMEs to clarify requirements and resolve discrepancies.
________________________________________
Must-Have Informatica Expertise
• Strong knowledge of Informatica objects:
o Mappings, Mapplets, Workflows, Worklets, Sessions
• Hands-on experience with Informatica transformations:
o Lookup, Joiner, Router, Aggregator, Expression, Sequence Generator, Filter, Stored Procedure, Update Strategy, etc.
• Proficiency in handling Informatica parameters and variables:
o Mapping variables and parameters
o Workflow variables
o Session parameters
o Parameter files and variable scoping
• Ability to debug and trace variable values across sessions and workflows.
• Experience in designing reusable transformations and optimization techniques in Informatica.
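As a sketch of how parameter-file knowledge carries over, the snippet below parses an Informatica-style parameter file into a plain dictionary that could then feed Databricks job parameters. The section and variable names are invented for illustration, and real parameter files have additional conventions this sketch ignores.

```python
# Minimal, illustrative parser for an Informatica-style parameter file:
# [section] headers scope $$-prefixed variables; ';' starts a comment.
def parse_param_file(text):
    """Return {section: {name: value}} from Informatica-style parameter text."""
    params, section = {}, None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith(";"):          # skip blanks and comments
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]                      # new scope
            params.setdefault(section, {})
        elif "=" in line and section is not None:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

# Invented sample: a global variable plus a session-scoped one.
sample = """
[Global]
$$RUN_DATE=2024-01-31
[FolderX.WF:wf_orders.ST:s_load_orders]
$$SRC_SCHEMA=staging
"""
print(parse_param_file(sample))
```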
________________________________________
Required Skills & Experience
• 5+ years of hands-on experience in ETL development with Informatica PowerCenter (or IICS).
• 2+ years of experience in Databricks (PySpark, SQL, Delta Lake) for ETL/ELT implementation.
• Strong understanding of data warehousing concepts (SCDs, fact/dimension modeling, partitioning, indexing).
• Hands-on experience with cloud platforms (Azure / AWS / GCP) and Databricks workspace setup.
• Proficiency in SQL and performance tuning of large datasets.
• Experience in reverse engineering ETL logic and re-implementing it on modern platforms.
• Excellent problem-solving, debugging, and communication skills.
________________________________________
Good to Have
• Experience with DevOps/CI-CD pipelines for Databricks (e.g., GitHub Actions, Azure DevOps, Jenkins).
• Exposure to Informatica to Spark migration tools or accelerators.
• Knowledge of Medallion architecture or modern data lakehouse patterns.
• Experience in Agile/Scrum delivery.
Celebal Technologies
Salary: 0.5 - 3.0 Lacs P.A.
Location: Navi Mumbai, Maharashtra, India