2 Bdt Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

Experience: 6.0 - 8.0 years

Salary: 10 - 15 Lacs

Location: Hyderabad

Work mode: Hybrid

Source: Naukri

Mega Walk-in Drive for Senior Software Engineer - Informatica Developer

Your future duties and responsibilities:

Job Summary: CGI is seeking a skilled and detail-oriented Informatica Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and implementing ETL (Extract, Transform, Load) workflows using Informatica PowerCenter (or Informatica Cloud), as well as optimizing data pipelines and ensuring data quality and integrity across systems.

Key Responsibilities:
Develop, test, and deploy ETL processes using Informatica PowerCenter or Informatica Cloud.
Work with business analysts and data architects to understand data requirements and translate them into technical solutions.
Integrate data from various sources, including relational databases, flat files, APIs, and cloud-based platforms.
Create and maintain technical documentation for ETL processes and data flows.
Optimize existing ETL workflows for performance and scalability.
Troubleshoot and resolve ETL and data-related issues in a timely manner.
Implement data validation, transformation, and cleansing techniques.
Collaborate with QA teams to support data testing and verification.
Ensure compliance with data governance and security policies.

Required qualifications to be successful in this role:
Minimum 6 years of experience with Informatica PowerCenter or Informatica Cloud.
Proficiency in SQL and experience with databases such as Oracle, SQL Server, Snowflake, or Teradata.
Strong understanding of ETL best practices and data integration concepts.
Experience with job scheduling tools such as Autosys, Control-M, or equivalent.
Knowledge of data warehousing concepts and dimensional modeling.
Strong problem-solving skills and attention to detail.
Excellent communication and teamwork abilities.
Python or other programming knowledge is good to have.
Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
Experience with cloud platforms such as AWS, Azure, or GCP.
Familiarity with Big Data/Hadoop tools (e.g., Spark, Hive) and modern data architectures.
Informatica certification is a plus.
Experience with Agile methodologies and DevOps practices.

Skills: Hadoop, Hive, Informatica, Oracle, Teradata, Unix

Notice Period: 0-45 days

Prerequisites: Copy of Aadhaar card, copy of PAN card, UAN

Disclaimer: Selected candidates will initially be required to work from the office for 8 weeks before transitioning to a hybrid model with 2 days of work from the office each week.

Posted 2 weeks ago

Experience: 6.0 - 8.0 years

Salary: 7 - 17 Lacs

Location: Hyderabad

Work mode: Work from Office

Source: Naukri

Your future duties and responsibilities:

Job Summary: CGI is seeking a skilled and detail-oriented Informatica Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and implementing ETL (Extract, Transform, Load) workflows using Informatica PowerCenter (or Informatica Cloud), as well as optimizing data pipelines and ensuring data quality and integrity across systems.

Key Responsibilities:
Develop, test, and deploy ETL processes using Informatica PowerCenter or Informatica Cloud.
Work with business analysts and data architects to understand data requirements and translate them into technical solutions.
Integrate data from various sources, including relational databases, flat files, APIs, and cloud-based platforms.
Create and maintain technical documentation for ETL processes and data flows.
Optimize existing ETL workflows for performance and scalability.
Troubleshoot and resolve ETL and data-related issues in a timely manner.
Implement data validation, transformation, and cleansing techniques.
Collaborate with QA teams to support data testing and verification.
Ensure compliance with data governance and security policies.

Required qualifications to be successful in this role:
Minimum 6 years of experience with Informatica PowerCenter or Informatica Cloud.
Proficiency in SQL and experience with databases such as Oracle, SQL Server, Snowflake, or Teradata.
Strong understanding of ETL best practices and data integration concepts.
Experience with job scheduling tools such as Autosys, Control-M, or equivalent.
Knowledge of data warehousing concepts and dimensional modeling.
Strong problem-solving skills and attention to detail.
Excellent communication and teamwork abilities.
Python or other programming knowledge is good to have.
Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
Experience with cloud platforms such as AWS, Azure, or GCP.
Familiarity with Big Data/Hadoop tools (e.g., Spark, Hive) and modern data architectures.
Informatica certification is a plus.
Experience with Agile methodologies and DevOps practices.

Skills: Hadoop, Hive, Informatica, Oracle, Teradata, Unix

Note:
1. This role will require 8 weeks of in-office work after joining, after which it will transition to a hybrid working model with 2 days per week in the office.
2. Mode of interview: Face-to-face (F2F).
3. Time: Registration window from 9:00 am to 12:30 pm. Candidates who are shortlisted will be required to stay throughout the day for subsequent rounds of interviews.

Notice Period: 0-45 days

Posted 2 weeks ago

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
