
41 PowerCenter Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7 - 11 years

0 - 0 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Hi, this is Vinita from Silverlink Technologies. We have an excellent job opportunity with TCS for the post of "Informatica PowerCenter" at the Chennai location. If interested, kindly forward your updated resume in Word format ASAP to vinita@silverlinktechnologies.com, and fill in the details below:

Full Name:
Contact No:
Email ID:
DOB:
Experience:
Relevant Exp:
Current Company:
Notice Period:
Current CTC:
Expected CTC:
Offer in hand:
If yes, offered CTC:
Date of joining:
Company name:
Grades -- 10th: 12th: Graduation:
Full time/Part time?
University Name:
Current Location:
Preferred Location:
Gap in education:
Gap in employment:
**Mandatory** PAN Card Number:
Have you ever worked with TCS?
Do you have an active PF account?

Role: Informatica PowerCenter
Exp: 7-10 yrs
Mode: Permanent
Notice Period: up to 1-2 months only
Interview Mode: Virtual

For any queries, you can reach me on the details below.

Thanks & Regards,
Vinita Shetty
Silverlink Group
Tel: 022 42000665
India | United Kingdom | USA | Australia | Singapore
Email: Vinita@silverlinktechnologies.com
Website: www.silverlinktechnologies.com

Posted 3 months ago

5 - 7 years

7 - 16 Lacs

Pune

Remote

Vidushi Infotech: Join a Dynamic Team and Shape the Future of Technology

About us:
Vidushi Infotech Software Solutions Provider Private Limited, founded in 2003 and headquartered in Pune, India, is a global digital transformation company with a direct presence in India, the USA, Panama, and New Zealand. This strategic presence allows us to provide focused support and tailored solutions to our clients in these key regions. Coupled with our indirect reach through strategic partnerships in over 90 other countries, we offer a truly global service network. With over 21 years of experience, we offer a comprehensive suite of IT solutions, encompassing high-quality web and mobile application development as well as robust IT security services. Our expertise extends to cloud services, business applications, ERP, CRM, and disruptive technologies such as Blockchain, Artificial Intelligence, Machine Learning, IoT, Bots, and Data Science. We also provide comprehensive cybersecurity solutions. Serving a diverse global clientele across various industries, government agencies, and business segments, we cater to the evolving needs of businesses worldwide. Our reach spans North and South America, the UK, Europe, Africa, the Middle East, Asia, New Zealand, and Australia. Through strategic partnerships and tailored advice, we empower our customers to drive innovation, achieve sustainable growth, and build successful businesses. Our highly qualified and experienced management team leads a talented group of software engineers dedicated to delivering world-class products and services. At Vidushi Infotech, we are committed to excellence in all that we do, and we strive to provide exceptional value to our clients.

Key Benefits:
- Career Growth: Advance your skills and take on challenging projects.
- Collaborative Culture: Work with a passionate and supportive team.
- Global Impact: Contribute to innovative solutions that serve clients worldwide.
- Cutting-Edge Technology: Stay at the forefront of technological advancements.
- Competitive Compensation: Enjoy competitive salaries and benefits.

Ready to make a difference? Join Vidushi Infotech and embark on a rewarding career journey.

Maximo Developer - Job Description / Skill sets required:
- Informatica Data Quality (IDQ) and PowerCenter
- Azure background knowledge
- Python knowledge
- Able to troubleshoot issues and raise tickets to other teams
- Experience with ADO
- Stakeholder management, communication, and collaboration skills
- DevOps model
- Must be able to resolve issues with IDQ / PowerCenter
- Agile ways of working

Posted 3 months ago

8 - 13 years

10 - 16 Lacs

Pune

Remote

Job Title: Informatica IDMC Project Developer
Work from home (EST time zone) | 8+ years of experience | Long-term contract

Job Description:
We are seeking a skilled Informatica IDMC Project Developer to design, develop, and optimize data integration solutions using Informatica Intelligent Data Management Cloud (IDMC). In this role, you will be responsible for building scalable ETL/ELT pipelines, migrating legacy processes to IDMC, and ensuring seamless data movement across cloud and on-premises environments. You will collaborate with cross-functional teams to understand business requirements, enhance data workflows, and optimize performance.

Key Responsibilities:
- Develop, implement, and maintain IDMC-based data integration solutions.
- Design and optimize ETL/ELT pipelines for data movement between on-premises and cloud platforms (e.g., Snowflake, Oracle, Azure, GCP).
- Migrate existing Informatica PowerCenter workflows to IDMC, ensuring minimal disruption and improved efficiency.
- Work closely with data architects, analysts, and business stakeholders to understand data requirements and translate them into scalable solutions.
- Monitor, troubleshoot, and enhance data pipelines for performance and reliability.
- Implement best practices in data integration, security, and governance.
- Document processes, workflows, and technical designs to support ongoing maintenance and future enhancements.

Qualifications & Skills:
- 3+ years of experience in Informatica IDMC, IICS, or PowerCenter development.
- Strong knowledge of SQL, data warehousing, and cloud platforms (e.g., Snowflake, Azure, GCP).
- Experience with Control-M, scheduling tools, and workflow automation.
- Familiarity with API-based integrations and real-time data processing.
- Strong troubleshooting, performance tuning, and problem-solving skills.
- Ability to work in a fast-paced, collaborative environment and manage multiple tasks efficiently.
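The listing above centers on building ETL/ELT pipelines that move data between systems. As a generic, tool-agnostic illustration of that extract-transform-load pattern (not IDMC-specific code; sqlite3 stands in for a warehouse such as Snowflake or Oracle, and all table and field names are made up):

```python
import sqlite3

def extract():
    # Source records (in a real pipeline: a database, flat file, or API).
    return [
        {"id": 1, "amount": "120.50", "region": "south"},
        {"id": 2, "amount": "80.00", "region": "north"},
    ]

def transform(rows):
    # Normalize types and values before loading.
    return [(r["id"], float(r["amount"]), r["region"].upper()) for r in rows]

def load(conn, rows):
    # Load the cleaned rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())  # (2, 200.5)
```

Tools like PowerCenter or IDMC express the same extract/transform/load stages graphically as mappings and workflows rather than hand-written code.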

Posted 3 months ago

8 - 13 years

10 - 20 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Hi, we are looking for a senior Informatica IDMC consultant with IDMC development experience working with databases, flat files, APIs, and IBM DB2 (connecting to it as a source or target for PowerCenter jobs).

Posted 3 months ago

4 - 9 years

15 - 30 Lacs

Pune, Bengaluru, Gurgaon

Work from Office

Experience:
- 3+ years of experience building and maintaining data pipelines for diverse data sources and formats using tools such as Informatica PowerCenter and IICS.
- Conducted data quality checks and implemented data validation processes to ensure accuracy and consistency across cloud and on-premises environments.
- Collaborated with data scientists and analysts to provide reliable and timely data for analysis and reporting.
- Participated in setting up data engineering practices and standards for various projects and clients, including cloud migration strategies.
- Designed and optimized complex SQL queries for efficient data retrieval, transformation, and integration with cloud data warehouses like AWS Redshift and Snowflake.
- Implemented data monitoring and alerting systems to proactively address data pipeline issues and ensure continuous data availability.
- Assisted in building cloud-native data solutions on AWS, with a focus on services like S3, Lambda, Redshift, and Athena, ensuring scalability, performance, and cost-effectiveness.
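The data-quality checks and validation processes mentioned above usually boil down to testing each record against simple rules before it enters a pipeline. A minimal, library-free sketch (field names and rules are purely illustrative):

```python
# Validation rules: each field maps to a predicate it must satisfy.
RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record):
    """Return the list of failed field names (an empty list means the record is clean)."""
    return [field for field, rule in RULES.items()
            if field not in record or not rule(record[field])]

good = {"id": 1, "email": "a@b.com", "amount": 10.0}
bad = {"id": -5, "email": "not-an-email", "amount": 10.0}
print(validate(good))  # []
print(validate(bad))   # ['id', 'email']
```

In practice a tool such as Informatica Data Quality applies the same idea through configurable rules rather than hand-written predicates.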

Posted 3 months ago

5 - 10 years

15 - 25 Lacs

Bengaluru, Noida, Mumbai (All Areas)

Hybrid

We are hiring for an Informatica ETL Developer.
Location: Bangalore, Hyderabad, Pune, Noida, Mumbai, Goa
Experience: 5+ years
Notice Period: Immediate joiners / 15 days
- Development experience using PowerCenter
- Hands-on experience with the Snowflake database and SnowSQL commands
- Proficiency in shell scripting
- Strong SQL skills

Posted 3 months ago

3 - 5 years

8 - 12 Lacs

Gurgaon

Hybrid

Role & responsibilities:
- Understand the project scope; identify activities/tasks, task-level estimates, schedule, dependencies, and risks, and provide inputs to the Program Lead for review.
- Lead the analysis, design, and development phases to build robust ETL solutions that best meet the business requirements for investments applications.
- Prepare/modify design documents (high-level and detailed design documents) based on business requirements. Suggest design changes on technical grounds.
- Coordinate delivery of assigned tasks with onshore partners and/or Business Analysts.
- Ensure timely notification and escalation of possible issues/problems, with options and recommendations for prompt resolution.
- Follow development standards to ensure that code is clear, efficient, logical, and easily maintainable.
- Create and maintain application documentation such as network diagrams, the technical project handbook, technical data mapping documents, unit test documents, and the implementation plan.
- Ensure SLF Information Security Policies and General Computing Controls are complied with in all situations.
- Take complete ownership of work assignments and ensure the successful completion of assigned tasks.
- Ensure all written and verbal communication is clear, understandable, and audience-appropriate.

Preferred candidate profile:
- Minimum 3 to 5 years of overall IT experience with Informatica PowerCenter.
- Minimum 1 year of experience with AWS data services (Glue) and PySpark.
- Strong knowledge of ETL and relational databases (Microsoft SQL Server, PostgreSQL).
- Good knowledge of data warehousing concepts.
- Good analytical and problem-solving skills.
- Working knowledge of job scheduling tools like Control-M, Autosys, etc.
- Experience with Git / Bitbucket versioning tools.
- Understanding of DevOps processes.
- Good hands-on experience with AWS Step Functions, Lambda, SQS, SNS, Redshift, and other data services is preferred.

Posted 3 months ago

8 - 12 years

22 - 35 Lacs

Pune, Bengaluru

Work from Office

Informatica IDMC Technical Lead
Experience: 8-12 yrs
Location: Bhubaneswar, Pune, Chennai, Bangalore
- Strong proficiency in ETL concepts, with Python knowledge.
- Familiarity with data warehousing concepts.
- IDMC migration strategy and implementation.

Posted 3 months ago

7 - 10 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: Graduation

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Informatica MDM. Your typical day will involve working with the Informatica MDM tool, collaborating with cross-functional teams, and ensuring the quality and integrity of data.

Roles & Responsibilities:
- Design, develop, and maintain Informatica MDM solutions to meet business requirements.
- Collaborate with cross-functional teams to ensure the quality and integrity of data.
- Develop and maintain data integration processes using Informatica PowerCenter.
- Create and maintain technical documentation for all development activities.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have: Strong experience in Informatica MDM.
- Must-have: Experience in Informatica PowerCenter.
- Good-to-have: Experience in data modeling and data architecture.
- Good-to-have: Experience in SQL and Oracle databases.
- Strong understanding of data integration and data quality concepts.

Additional Information:
The candidate should have a minimum of 7.5 years of experience in Informatica MDM and PowerCenter. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.

Posted 3 months ago

4 - 8 years

8 - 18 Lacs

Bengaluru

Hybrid

Job Title: IICS Developer
Company: LumenData
Location: Bangalore, India
Job Type: Full-Time

Job Description:
LumenData is looking for an experienced IICS Developer to join our team in Bangalore, India. The ideal candidate should have 4+ years of experience in Informatica Intelligent Cloud Services (IICS), with expertise in Cloud Data Integration (CDI) and Cloud Data Quality (CDQ). This role will involve designing, developing, and optimizing data integration workflows to support enterprise-wide data initiatives.

Key Responsibilities:
- Develop, test, and deploy IICS CDI and CDQ solutions for data integration and transformation.
- Collaborate with business and technical teams to understand data requirements and implement scalable solutions.
- Design and optimize ETL processes for performance, scalability, and reliability.
- Implement data validation, cleansing, and transformation rules to improve data quality.
- Work with SQL, PL/SQL, and relational databases (Oracle, SQL Server, PostgreSQL, etc.).
- Develop APIs and integrations with external applications using REST/SOAP web services.
- Troubleshoot and resolve data integration issues in a timely manner.
- Maintain technical documentation, including data flow diagrams and process documentation.
- Stay updated with Informatica Cloud advancements and recommend best practices.

Required Skills & Qualifications:
- 4+ years of hands-on experience in Informatica IICS CDI/CDQ.
- Strong knowledge of ETL, data integration, and data quality principles.
- Experience with SQL, PL/SQL, and database optimization techniques.
- Familiarity with API-based integrations and working knowledge of REST/SOAP web services.
- Strong problem-solving and debugging skills.
- Ability to work in an agile development environment.
- Excellent communication and team collaboration skills.
- Informatica Cloud Certification is a plus.
- Good to have: experience in CAI (Cloud Application Integration).

Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, GCP).
- Knowledge of Python or shell scripting for automation.
- Understanding of Big Data technologies and data governance frameworks.

Posted 3 months ago

7 - 12 years

9 - 14 Lacs

Mumbai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing; create data pipelines; ensure data quality; and implement ETL processes to migrate and deploy data across systems. You will be involved in the end-to-end data management process.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data architecture design and implementation.
- Optimize and maintain existing data pipelines.
- Conduct data modeling and database design.
- Implement data security and privacy measures.

Professional & Technical Skills:
- Must-have: Proficiency in Informatica PowerCenter.
- Strong understanding of data modeling and database design.
- Experience in ETL development and implementation.
- Knowledge of data security and privacy measures.
- Experience with cloud data platforms like AWS or Azure.

Additional Information:
The candidate should have a minimum of 7.5 years of experience in Informatica PowerCenter. This position is based at our Mumbai office. 15 years of full-time education is required.

Posted 3 months ago

5 - 10 years

10 - 15 Lacs

Hyderabad

Work from Office

Overview:
We are seeking an Associate Manager, Data IntegrationOps, to support and assist in managing data integration and operations (IntegrationOps) programs within our growing data organization. In this role, you will help maintain and optimize data integration workflows, ensure data reliability, and support operational excellence. This position requires a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support.

- Support the management of Data IntegrationOps programs by assisting in aligning with business objectives, data governance standards, and enterprise data strategies.
- Monitor and enhance data integration platforms by implementing real-time monitoring, automated alerting, and self-healing capabilities to help improve uptime and system performance under the guidance of senior team members.
- Assist in developing and enforcing data integration governance models, operational frameworks, and execution roadmaps to ensure smooth data delivery across the organization.
- Support the standardization and automation of data integration workflows, including report generation and dashboard refreshes.
- Collaborate with cross-functional teams to help optimize data movement across cloud and on-premises platforms, ensuring data availability, accuracy, and security.
- Provide assistance in Data & Analytics technology transformations by supporting full sustainment capabilities, including data platform management and proactive issue identification with automated solutions.
- Contribute to promoting a data-first culture by aligning with PepsiCo's Data & Analytics program and supporting global data engineering efforts across sectors.
- Support continuous improvement initiatives to help enhance the reliability, scalability, and efficiency of data integration processes.
- Engage with business and IT teams to help identify operational challenges and provide solutions that align with the organization's data strategy.
- Develop technical expertise in ETL/ELT processes, cloud-based data platforms, and API-driven data integration, working closely with senior team members.
- Assist with monitoring, incident management, and troubleshooting in a data operations environment to ensure smooth daily operations.
- Support the implementation of sustainable solutions for operational challenges by helping analyze root causes and recommending improvements.
- Foster strong communication and collaboration skills, contributing to effective engagement with cross-functional teams and stakeholders.
- Demonstrate a passion for continuous learning and adapting to emerging technologies in data integration and operations.

Responsibilities:
- Support and maintain data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members.
- Assist in developing API-driven data integration solutions using REST APIs and Kafka to ensure seamless data movement across platforms.
- Contribute to the deployment and management of cloud-based data platforms like Azure Data Services, AWS Redshift, and Snowflake, working closely with the team.
- Help automate data pipelines and participate in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins.
- Monitor system reliability using observability tools such as Splunk, Grafana, Prometheus, and other custom monitoring solutions, reporting issues as needed.
- Assist in end-to-end data integration operations by testing and monitoring processes to maintain service quality and support global products and projects.
- Support the day-to-day operations of data products, ensuring SLAs are met and assisting in collaboration with SMEs to fulfill business demands.
- Support incident management processes, helping to resolve service outages and ensuring the timely resolution of critical issues.
- Assist in developing and maintaining operational processes to enhance system efficiency and resilience through automation.
- Collaborate with cross-functional teams like Data Engineering, Analytics, AI/ML, CloudOps, and DataOps to improve data reliability and contribute to data-driven decision-making.
- Work closely with teams to troubleshoot and resolve issues related to cloud infrastructure and data services, escalating to senior team members as necessary.
- Support building and maintaining relationships with internal stakeholders to align data integration operations with business objectives.
- Engage directly with customers, actively listening to their concerns, addressing challenges, and helping set clear expectations.
- Promote a customer-centric approach by contributing to efforts that enhance the customer experience and empower the team to advocate for customer needs.
- Assist in incorporating customer feedback and business priorities into operational processes to ensure continuous improvement.
- Contribute to the work intake and Agile processes for data platform teams, ensuring operational excellence through collaboration and continuous feedback.
- Support the execution of Agile frameworks, helping drive a culture of adaptability, efficiency, and learning within the team.
- Help align the team with a shared vision, ensuring a collaborative approach while contributing to a culture of accountability.
- Mentor junior technical team members, supporting their growth and ensuring adherence to best practices in data integration.
- Contribute to resource planning by helping assess team capacity and ensuring alignment with business objectives.
- Remove productivity barriers in an agile environment, assisting the team to shift priorities as needed without compromising quality.
- Support continuous improvement in data integration processes by helping evaluate and suggest optimizations to enhance system performance.
- Leverage technical expertise in cloud and computing technologies to support business goals and drive operational success.
- Stay informed on emerging trends and technologies, helping bring innovative ideas to the team and supporting ongoing improvements in data operations.

Qualifications:
- 5+ years of technology work experience in a large-scale, global organization; CPG (Consumer Packaged Goods) industry preferred.
- 4+ years of experience in Data Integration, Data Operations, and Analytics, supporting and maintaining enterprise data platforms.
- 4+ years of experience working in cross-functional IT organizations, collaborating with teams such as Data Engineering, CloudOps, DevOps, and Analytics.
- 3+ years of hands-on experience in MQ & WebLogic administration.
- 1+ years of leadership/management experience supporting technical teams and contributing to operational efficiency initiatives.
- Knowledge of ETL/ELT tools such as Informatica IICS, PowerCenter, SAP BW, Teradata, and Azure Data Factory.
- Hands-on knowledge of cloud-based data integration platforms such as Azure Data Services, AWS Redshift, Snowflake, and Google BigQuery.
- Familiarity with API-driven data integration (e.g., REST APIs, Kafka) and supporting cloud-based data pipelines.
- Basic proficiency in Infrastructure-as-Code (IaC) tools such as Terraform, GitOps, Kubernetes, and Jenkins for automating infrastructure management.
- Understanding of Site Reliability Engineering (SRE) principles, with a focus on proactive monitoring and process improvements.
- Strong communication skills, with the ability to explain technical concepts clearly to both technical and non-technical stakeholders.
- Ability to effectively advocate for customer needs and collaborate with teams to ensure alignment between business and technical solutions.
- Interpersonal skills to help build relationships with stakeholders across both business and IT teams.
- Customer obsession: enthusiastic about ensuring high-quality customer experiences and continuously addressing customer needs.
- Ownership mindset: willingness to take responsibility for issues and drive timely resolutions while maintaining service quality.
- Ability to support and improve operational efficiency in large-scale, mission-critical systems.
- Some experience leading or supporting technical teams in a cloud-based environment, ideally within Microsoft Azure.
- Able to deliver operational services in fast-paced, transformation-driven environments.
- Proven capability in balancing business and IT priorities, executing solutions that drive mutually beneficial outcomes.
- Basic experience with Agile methodologies and an ability to collaborate effectively across virtual teams and different functions.
- Understanding of master data management (MDM) and data standards, and familiarity with data governance and analytics concepts.
- Openness to learning new technologies, tools, and methodologies to stay current in the rapidly evolving data space.
- Passion for continuous improvement and keeping up with trends in data integration and cloud technologies.

Posted 1 month ago

7 - 12 years

15 - 30 Lacs

Hyderabad

Hybrid

Role & responsibilities:
Candidate should be an immediate joiner; lead-level experience is a must. Informatica PowerCenter, Snowflake, Oracle, and Unix are mandatory skills. Hyderabad (GAR, Kokapet) location only (hybrid mode).

Key Responsibilities:
• Design and implement scalable data storage solutions using Snowflake.
• Write SQL queries against Snowflake and develop scripts to extract, load, and transform data.
• Write, optimize, and troubleshoot complex SQL queries within Snowflake.
• Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, Cloning, the optimizer, Metadata Manager, data sharing, stored procedures, and UDFs.
• Develop and maintain ETL processes using Informatica PowerCenter.
• Integrate Snowflake with various data sources and third-party applications.
• Experience in data lineage analysis, data profiling, ETL design and development, unit testing, production batch support, and UAT support.
• SQL performance tuning; root-causing failures, bifurcating them into distinct technical issues, and resolving them.
• In-depth understanding of data warehouse and ETL concepts and data modelling.
• Experience in requirement gathering, analysis, design, development, and deployment.
• Good working knowledge of an ETL tool (preferably Informatica PowerCenter or DBT).
• Proficiency in SQL.
• Experience on client-facing projects.
• Experience with Snowflake best practices.
• Experience with Unix shell scripting.
• Good to have: working experience in Python.

Posted 1 month ago

9 - 14 years

13 - 20 Lacs

Kochi, Chennai, Bengaluru

Hybrid

Greetings from Aspire Systems!!
Currently hiring for: Informatica Developer
Exp: 8+ years
Location: Chennai / Bangalore / Kochi
Notice: Immediate to 15 days

This is a great opportunity for a technically strong and detail-focused Informatica Developer to contribute to enterprise-level data solutions in a collaborative and fast-paced environment. Share your CV with safoora.imthiyas@aspiresys.com - immediate joiners.

JD:
- Experience with Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Design and develop ETL processes using Informatica PowerCenter.
- Design and develop complex customizations, implement integrations between disparate systems, and integrate with Salesforce.
- Develop mappings, workflows, and reusable and non-reusable session tasks.
- Use transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer, with error handling.
- Experience in performance tuning, data modeling, and data profiling to support functional/business requirements.
- Strong database knowledge: Oracle PL/SQL, Oracle, DB2, SQL Server. Ability to analyze database schemas, triggers, and stored procedures.
- Able to write complex relational SOQL queries to help with data retrieval and validation from Salesforce.
- Nice to have: knowledge of web services and APIs.
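The PowerCenter transformations this JD names have straightforward relational analogues (Filter ~ WHERE, Expression ~ a computed column, Joiner ~ JOIN). A rough sketch of those analogues in plain SQL, using sqlite3 as a self-contained stand-in and invented tables purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, cust_id INTEGER, qty INTEGER, price REAL);
    CREATE TABLE customers (id INTEGER, name TEXT);
    INSERT INTO orders VALUES (1, 10, 2, 5.0), (2, 11, 0, 9.0);
    INSERT INTO customers VALUES (10, 'Asha'), (11, 'Ravi');
""")
rows = conn.execute("""
    SELECT c.name, o.qty * o.price AS total   -- Expression: derived/computed column
    FROM orders o
    JOIN customers c ON c.id = o.cust_id      -- Joiner: combine two sources
    WHERE o.qty > 0                           -- Filter: drop non-qualifying rows
""").fetchall()
print(rows)  # [('Asha', 10.0)]
```

In PowerCenter these steps are configured as linked transformation objects inside a mapping rather than written as SQL, but the data flow is the same.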

Posted 1 month ago

4 - 9 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm Greetings from SP Staffing!!
Role: Informatica Developer
Experience Required: 4 to 12 yrs
Work Location: PAN India
Required Skills: Informatica PowerCenter
Interested candidates can send resumes to nandhini.spstaffing@gmail.com

Posted 1 month ago

7 - 12 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Warm Greetings from SP Staffing Services Pvt Ltd!!!!
Experience: 4-13 yrs
Work Location: Bangalore / Hyderabad / Chennai / Pune / Kolkata

- Participates in ETL design of new or changing mappings and workflows with the team and prepares technical specifications.
- Creates ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepares corresponding documentation.
- Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Performs source system analysis as required.
- Works with DBAs and data architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implements versioning of the ETL repository and supporting code as necessary.
- Develops stored procedures, database triggers, and SQL queries where needed.
- Implements best practices and tunes SQL code for optimization.
- Loads data from Salesforce via PowerExchange to a relational database using Informatica.
- Works with XMLs, the XML parser, Java, and HTTP transformations within Informatica.
- Works with Informatica Data Quality (Analyst and Developer).
- Primary skill is Informatica PowerCenter.

Interested candidates, kindly share your updated resume with ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.

Posted 1 month ago
