Home
Jobs

2394 Informatica Jobs - Page 28

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

10.0 years

0 Lacs

India

Remote


Title: MDM Architect
Location: Remote
Type: Contract
Key Responsibilities:
- Lead the architecture, design, and implementation of Informatica MDM solutions, focusing on customer and product master data domains.
- Define and drive MDM strategy, standards, and best practices across the organization.
- Collaborate with business and IT stakeholders to identify MDM needs and translate them into scalable, enterprise-grade solutions.
- Provide technical leadership and guidance for MDM configurations, customizations, match/merge rules, data stewardship workflows, and integration patterns.
- Oversee MDM tool management, including performance tuning, version upgrades, and patching.
- Design and maintain MDM data models and metadata standards.
- Ensure data governance, quality, lineage, and security policies are implemented effectively.
- Mentor and support MDM developers, analysts, and data stewards.
- Stay abreast of industry trends and tools to continually enhance the MDM platform.
Qualifications:
- 10+ years of experience in Data Management or related fields, with at least 5 years in Informatica MDM.
- Proven experience with Customer and/or Product MDM implementations.
- Deep understanding of MDM architecture, match/merge logic, hierarchy management, data stewardship, and governance frameworks.
- Hands-on experience managing the Informatica MDM Hub and associated components.
- Strong knowledge of data integration techniques and experience with ETL tools.
- Solid understanding of data modeling, relational databases, and metadata management.
- Experience with cloud environments (e.g., AWS, Azure, GCP) and hybrid MDM deployments is a plus.
- Excellent communication skills and ability to lead cross-functional teams.
Preferred Certifications:
- Informatica MDM Certification
- TOGAF, DAMA, or other data architecture/governance certifications
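The match/merge and survivorship responsibilities named above can be made concrete with a small, tool-agnostic sketch. This is illustrative Python only: every field name and the source ranking below are hypothetical assumptions, not Informatica MDM's actual API or rule syntax.

```python
from dataclasses import dataclass

@dataclass
class CustomerRecord:
    source: str
    email: str
    name: str
    phone: str

def is_match(a: CustomerRecord, b: CustomerRecord) -> bool:
    # Deterministic rule: exact email match, or same normalized name + phone.
    if a.email and a.email.lower() == b.email.lower():
        return True
    return (a.name.strip().lower() == b.name.strip().lower()
            and a.phone == b.phone)

# Survivorship: prefer the source ranked most trustworthy (made-up ranking).
SOURCE_RANK = {"CRM": 0, "ERP": 1, "WEB": 2}

def merge(records: list[CustomerRecord]) -> CustomerRecord:
    # The golden record keeps each field from the highest-ranked source that has it.
    ordered = sorted(records, key=lambda r: SOURCE_RANK.get(r.source, 99))
    golden = ordered[0]
    for r in ordered[1:]:
        golden.email = golden.email or r.email
        golden.phone = golden.phone or r.phone
    return golden

recs = [CustomerRecord("WEB", "", "Jane Doe", "555-0100"),
        CustomerRecord("CRM", "jane@example.com", "Jane Doe", "555-0100")]
if is_match(recs[0], recs[1]):
    print(merge(recs))
```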

Posted 1 week ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Hyderabad

Work from Office


ABOUT THE ROLE
Role Description: We are seeking an MDM Associate Analyst with 2-5 years of development experience to support and enhance our enterprise Master Data Management (MDM) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong MDM experience in configuration (L3 configuration, asset creation, data modeling, etc.), ETL and data mappings (CAI, CDI), data mastering (match/merge and survivorship rules), and source and target integrations (REST API, batch integration, integration with Databricks tables, etc.).
Roles & Responsibilities:
- Analyze and manage customer master data using Reltio or Informatica MDM solutions.
- Perform advanced SQL queries and data analysis to validate and ensure master data integrity.
- Leverage Python, PySpark, and Databricks for scalable data processing and automation.
- Collaborate with business and data engineering teams for continuous improvement in MDM solutions.
- Implement data stewardship processes and workflows, including approval and DCR mechanisms.
- Utilize AWS cloud services for data storage and compute processes related to MDM.
- Contribute to metadata and data modeling activities.
- Track and manage data issues using tools such as JIRA, and document processes in Confluence.
- Apply Life Sciences/Pharma industry context to ensure data standards and compliance.
Basic Qualifications and Experience:
- Master's degree with 1-3 years of experience in Business, Engineering, IT, or a related field, OR
- Bachelor's degree with 2-5 years of experience in Business, Engineering, IT, or a related field, OR
- Diploma with 6-8 years of experience in Business, Engineering, IT, or a related field.
Must-Have Skills:
- Strong experience with Informatica or Reltio MDM platforms in building configurations from scratch (L3 configuration, data modeling, asset creation, setting up API integrations, orchestration).
- Strong experience in building data mappings, data profiling, and creating and implementing business rules for data quality and data transformation.
- Strong experience in implementing match and merge rules and survivorship of golden records (see the sketch below).
- Expertise in integrating master data records with downstream systems.
- Very good understanding of DWH basics and good knowledge of data modeling.
- Experience with IDQ, data modeling, and approval workflow/DCR.
- Advanced SQL expertise and data wrangling.
- Exposure to Python and PySpark for data transformation workflows.
- Knowledge of MDM, data governance, stewardship, and profiling practices.
Good-to-Have Skills:
- Familiarity with Databricks and AWS architecture.
- Background in Life Sciences/Pharma industries.
- Familiarity with project tools like JIRA and Confluence.
- Basics of data engineering concepts.
Professional Certifications:
- Any ETL certification (e.g., Informatica)
- Any data analysis certification (SQL, Python, Databricks)
- Any cloud certification (AWS or Azure)
Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
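Because this posting pairs survivorship rules with Databricks/PySpark, here is a minimal sketch of one common pattern: a window function that keeps, per matched group, the record from the most recently updated, most trusted source. The table and column names (match_group_id, last_updated, source_rank) are hypothetical assumptions, not the platform's actual schema.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("survivorship-sketch").getOrCreate()

# Assumed input: one row per source record, already assigned a match_group_id
# by an upstream match step (e.g., Informatica MDM or Reltio match rules).
records = spark.read.table("mdm.matched_customer_records")

# Survivorship rule: within each match group, the most recently updated
# record wins; ties are broken by a source trust ranking.
w = Window.partitionBy("match_group_id").orderBy(
    F.col("last_updated").desc(), F.col("source_rank").asc()
)

golden = (records
          .withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn"))

golden.write.mode("overwrite").saveAsTable("mdm.golden_customer_records")
```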

Posted 1 week ago

Apply

8.0 years

0 Lacs

Tamil Nadu, India

On-site


Job Title: Data Engineer
About VXI: VXI Global Solutions is a BPO leader in customer service, customer experience, and digital solutions. Founded in 1998, the company has 40,000 employees in more than 40 locations in North America, Asia, Europe, and the Caribbean. We deliver omnichannel and multilingual support, software development, quality assurance, CX advisory, and automation & process excellence to the world’s most respected brands. VXI is one of the fastest growing, privately held business services organizations in the United States and the Philippines, and one of the few US-based customer care organizations in China. VXI is also backed by private equity investor Bain Capital. Our initial partnership ran from 2012 to 2016 and was the beginning of prosperous times for the company. During this period, not only did VXI expand our footprint in the US and Philippines, but we also gained ground in the Chinese and Central American markets. We also acquired Symbio, expanding our global technology services offering and enhancing our competitive position. In 2022, Bain Capital re-invested in the organization after completing a buy-out from Carlyle. This is a rare occurrence in the private equity space and shows the level of performance VXI delivers for our clients, employees, and shareholders. With this recent investment, VXI has started on a transformation to radically improve the CX experience through an industry-leading generative AI product portfolio that spans hiring, training, customer contact, and feedback.
Job Description: We are seeking talented and motivated Data Engineers to join our dynamic team and contribute to our mission of harnessing the power of data to drive growth and success. As a Data Engineer at VXI Global Solutions, you will play a critical role in designing, implementing, and maintaining our data infrastructure to support our customer experience and management initiatives. You will collaborate with cross-functional teams to understand business requirements, architect scalable data solutions, and ensure data quality and integrity. This is an exciting opportunity to work with cutting-edge technologies and shape the future of data-driven decision-making at VXI Global Solutions.
Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to ingest, transform, and store data from various sources.
- Collaborate with business stakeholders to understand data requirements and translate them into technical solutions.
- Implement data models and schemas to support analytics, reporting, and machine learning initiatives.
- Optimize data processing and storage solutions for performance, scalability, and cost-effectiveness.
- Ensure data quality and integrity by implementing data validation, monitoring, and error handling mechanisms.
- Collaborate with data analysts and data scientists to provide them with clean, reliable, and accessible data for analysis and modeling.
- Stay current with emerging technologies and best practices in data engineering and recommend innovative solutions to enhance our data capabilities.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven 8+ years' experience as a data engineer or in a similar role.
- Proficiency in SQL, Python, and/or other programming languages for data processing and manipulation.
- Experience with relational and NoSQL databases (e.g., SQL Server, MySQL, Postgres, Cassandra, DynamoDB, MongoDB, Oracle), data warehousing (e.g., Vertica, Teradata, Oracle Exadata, SAP HANA), and data modeling concepts.
- Strong understanding of distributed computing frameworks (e.g., Apache Spark, Apache Flink, Apache Storm) and cloud-based data platforms (e.g., AWS Redshift, Azure, Google BigQuery, Snowflake).
- Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker, Apache Superset) and data pipeline tools (e.g., Airflow, Kafka, Data Flow, Cloud Data Fusion, Airbyte, Informatica, Talend) is a plus.
- Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques.
- Solid understanding of ETL/ELT processes, data validation, and data security best practices.
- Experience with version control systems (Git) and CI/CD pipelines.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
Join VXI Global Solutions and be part of a dynamic team dedicated to driving innovation and delivering exceptional customer experiences. Apply now to embark on a rewarding career in data engineering with us!
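As a rough illustration of the "ingest, transform, validate, load" cycle this role describes, here is a minimal pandas-based batch pipeline. The connection strings, table names, and column names are placeholders, and a production pipeline would add monitoring, retries, and incremental logic.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection strings; replace with real credentials/DSNs.
source = create_engine("postgresql://user:pass@source-host/ops")
target = create_engine("postgresql://user:pass@warehouse-host/dwh")

def run_batch(batch_date: str) -> None:
    # Extract: pull one day of orders from the operational store.
    df = pd.read_sql(
        "SELECT * FROM orders WHERE order_date = :d",
        source, params={"d": batch_date},
    )

    # Transform: standardize currency and drop obviously bad rows.
    df["amount_usd"] = df["amount"] * df["fx_rate"]
    df = df[df["amount_usd"] >= 0]

    # Validate: fail fast instead of loading partial or duplicate data.
    if df["order_id"].duplicated().any():
        raise ValueError(f"duplicate order_id in batch {batch_date}")

    # Load: append into a warehouse staging table.
    df.to_sql("stg_orders", target, if_exists="append", index=False)

run_batch("2025-06-01")
```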

Posted 1 week ago

Apply

6.0 - 8.0 years

6 - 12 Lacs

Hyderabad

Work from Office


Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake.
- Develop and optimize complex SQL queries, views, and stored procedures.
- Migrate data from legacy systems to Snowflake using ETL tools like Informatica, Talend, dbt, or Matillion.
- Implement data modeling techniques (Star, Snowflake schemas) and maintain the data dictionary.
- Ensure performance tuning, data quality, and security across all Snowflake objects.
- Integrate Snowflake with BI tools like Tableau, Power BI, or Looker.
- Collaborate with data analysts, data scientists, and business teams to understand requirements and deliver solutions.
- Monitor and manage Snowflake environments using tools like Snowsight, SnowSQL, or CloudWatch.
- Participate in code reviews and enforce best practices for data governance and security.
- Develop automation scripts using Python, Shell, or Airflow for data workflows.
Required Skills:
- 6+ years of experience in data engineering / data warehousing.
- 3+ years of hands-on experience with the Snowflake Cloud Data Platform.
- Strong expertise in SQL, performance tuning, data modeling, and query optimization.
- Experience with ETL tools like Informatica, Talend, Apache NiFi, or dbt.
- Proficient in cloud platforms: AWS / Azure / GCP (preferably AWS).
- Good understanding of DevOps/CI-CD principles for Snowflake deployments.
- Hands-on experience with scripting languages: Python, Bash, etc.
- Knowledge of RBAC, masking policies, and row access policies in Snowflake.
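Since the listing calls out masking policies and RBAC in Snowflake, here is a hedged sketch of what those objects look like, issued through the snowflake-connector-python package. The account, role, table, and column names are invented for illustration; the policy shown is the standard Snowflake masking-policy DDL.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="DWH", schema="CORE",
)

MASKING_POLICY = """
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '***MASKED***'
  END
"""

statements = [
    MASKING_POLICY,
    # Attach the policy to a column (table/column names are illustrative).
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Minimal RBAC: grant read access to an analyst role.
    "GRANT SELECT ON TABLE customers TO ROLE ANALYST",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```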

Posted 1 week ago

Apply

1.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Hungry, Humble, Honest, with Heart.
The Opportunity: As a Business Intelligence Engineer at Nutanix, you will be part of a dynamic team responsible for developing and maintaining our analytics platform. You will play a key role in analyzing business data and providing insights that will influence decision-making across the organization. This is an exciting opportunity for a new college graduate to gain valuable experience and grow their career in the field of business intelligence.
About The Team: At Nutanix, you will be joining the SaaS Data team, comprising 50 talented individuals located across various regions. Our team culture emphasizes open communication, collaboration, and mutual support, fostering an environment where diverse perspectives are valued and contribute to our collective success. You will report to the Senior Manager, Business Intelligence, who leads with a focus on fostering growth, providing guidance, and ensuring that each team member has the resources needed to excel. At Nutanix, we offer a hybrid work setup, requiring employees to work in the office three days a week, promoting a balance between in-person collaboration and remote flexibility. There are no travel requirements associated with this role, allowing you to focus on your responsibilities without the need for frequent travel.
Your Role:
- Develop data pipelines using Informatica or open-source technologies to bring data from various sources to our data warehouse.
- Design and develop aggregated data layers to support business intelligence reporting and analytics, with an emphasis on code modularity and scalability.
- Develop, maintain, and optimize dashboards and reports in Tableau, providing actionable insights to business stakeholders.
- Leverage advanced analytics techniques such as predictive modeling, statistical analysis, and machine learning to uncover trends and patterns in the data.
- Work closely with various teams to understand business needs and translate them into technical requirements for BI projects.
- Maintain the integrity and reliability of business data, ensuring high-quality and consistent data flows into our data warehouse.
- Collaborate with IT and data governance teams to implement data security, compliance, and privacy best practices.
- Assist in the integration of BI solutions with other business systems, including NetSuite and SFDC, to streamline business processes.
- Develop and automate key performance indicators (KPIs) to monitor and evaluate the performance of business operations.
- Keep abreast of the latest trends and technologies in business intelligence, advanced analytics, and data warehousing to drive continuous improvement and innovation within the team.
What You Will Bring:
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
- 1-2 years of experience in business intelligence, data analytics, or a related field.
- Proficiency in BI tools, particularly Tableau; familiarity with advanced Tableau features such as calculated fields, parameters, and LOD expressions is highly desired.
- Experience with advanced analytics tools such as Python, R, or similar is advantageous.
- Strong understanding of data warehousing concepts, including data modeling, ETL processes, and SQL.
- Experience with NetSuite and SFDC is advantageous but not required.
- Excellent problem-solving, analytical, and technical skills, with the ability to apply statistical techniques to business data.
- Strong communication skills, with the ability to convey complex concepts to non-technical stakeholders.
Work Arrangement: Hybrid. This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 3 days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.
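To make the "aggregated data layers for BI reporting" responsibility concrete, a minimal sketch: rolling raw events up into a daily KPI table a Tableau workbook could sit on. The paths, table grain, and column names below are hypothetical, not Nutanix's actual schema.

```python
import pandas as pd

# Assumed raw fact extract: one row per support case (hypothetical path).
cases = pd.read_parquet("s3://bi-landing/support_cases/")

# Aggregate layer: daily KPIs per product line, the grain a dashboard needs.
daily_kpis = (
    cases.assign(day=cases["created_at"].dt.date)
         .groupby(["day", "product_line"])
         .agg(cases_opened=("case_id", "count"),
              avg_resolution_hrs=("resolution_hours", "mean"),
              escalation_rate=("escalated", "mean"))
         .reset_index()
)

daily_kpis.to_parquet("s3://bi-serving/daily_support_kpis.parquet")
```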

Posted 1 week ago

Apply

6.0 years

0 Lacs

Sanganer, Rajasthan, India

On-site


Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.
Who are you? You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don’t have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.
What will you be doing at Atrium? In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what’s possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Snowflake Data Engineering Lead, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
In This Role, You Will:
- Lead the design and architecture of end-to-end data warehousing and data lake solutions, focusing on the Snowflake platform, incorporating best practices for scalability, performance, security, and cost optimization
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Lead and mentor both onshore and offshore development teams, creating a collaborative environment
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools
- Develop ELT processes to ensure timely delivery of required data for customers
- Implement data quality measures to ensure accuracy, consistency, and integrity of data
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing
In This Role, You Will Have:
- Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
- 6+ years of experience delivering consulting services to medium and large enterprises.
- Implementations must have included a combination of the following experiences: Data Warehousing or Big Data consulting for mid-to-large-sized organizations
- 3+ years of experience specifically with Snowflake, demonstrating deep expertise in its core features and advanced capabilities
- Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
- SnowPro Core certification is highly desired
- Hands-on experience with Python (Pandas, DataFrames, functions)
- Strong proficiency in SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Strong experience with Apache Airflow and API integrations
- Solid experience in any one of the ETL/ELT tools (DBT, Coalesce, Wherescape, Mulesoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
- Nice to have: experience with Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or big data technologies
- Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
- Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
- Strong presentation and communication skills
Next Steps: Our recruitment process is highly personalized. Some candidates complete the hiring process in one week, others may take longer, as it’s important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective — after all, deciding to join a company is a big decision! At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
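Given the stack this role names (Snowflake, DBT, Airflow), a minimal Airflow DAG sketch of how such an ELT run might be orchestrated. The dbt project path, script path, and task breakdown are assumptions for illustration, not Atrium's actual pipeline.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_elt_sketch",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # one run per day
    catchup=False,
) as dag:
    # Load raw files staged in cloud storage into Snowflake (assumed script).
    load = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_raw_to_snowflake.py",
    )

    # Transform with DBT; the project path is a placeholder.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Data quality gate: dbt's built-in tests fail the DAG on bad data.
    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    load >> transform >> test
```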

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site


Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key, and whether you are an expert in a legacy software system or are fluent in a variety of coding languages, you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients.
Employment type: Full-time. Level: Entry, Mid, Senior. Travel: Yes (occasional), Minimal (if any).
Requisition ID: R-10358182. Date posted: 06/11/2025. End date: 06/26/2025. City: Chennai. State/Region: Tamil Nadu. Country: India. Location type: Onsite.
Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
Job Title: Professional, Software Development Engineering
What does a successful Professional, Data Conversions do at Fiserv? A Conversion Professional is responsible for the timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping in data to support project initiatives for new and existing banks. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines.
What will you do: A Conversion Professional is responsible for the timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping in data to support project initiatives for new and existing banks. The person stepping in as the backup would need to review the specification history, then review and understand the code being developed to resolve the issue or change. This would also have to occur on the switch back to the original developer. Today, the associate handling the project would log back in to support the effort and address the issue or change.
What you will need to have:
- Bachelor's degree in programming or a related field
- Minimum 3 years' relevant experience in data processing (ETL) conversions or the financial services industry
- 3-5 years' experience and strong knowledge of MS SQL/PSQL, MS SSIS, and data warehousing concepts
- Strong communication skills and the ability to provide technical information to non-technical colleagues
- Team player with the ability to work independently
- Experience in the full software development life cycle using agile methodologies
- Good understanding of Agile methodologies and the ability to handle agile ceremonies
- Efficient in reviewing, coding, testing, and debugging of application/bank programs
- Able to work under pressure while resolving critical issues in the production environment
- Good communication skills and experience working with clients
- Good understanding of the banking domain
What would be great to have:
- Experience with Informatica, Power BI, MS Visual Basic, Microsoft Access, and Microsoft Excel
- Experience with card management systems; debit card processing is a plus
- Ability to manage and prioritize a work queue across multiple workstreams
- Highest attention to detail and accuracy
Thank you for considering employment with Fiserv. Please: apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
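A routine step in data conversion work like this is reconciling row counts and control totals between the source extract and the converted target. Here is a hedged sketch using pyodbc; the DSNs, table names, and key/balance columns are placeholders, not Fiserv's actual systems.

```python
import pyodbc

# Placeholder DSNs for the source bank extract and the conversion staging DB.
src = pyodbc.connect("DSN=source_bank;UID=etl;PWD=***")
tgt = pyodbc.connect("DSN=conversion_stage;UID=etl;PWD=***")

CHECKS = {
    # table: (key column, balance column or None) — names are illustrative.
    "accounts": ("account_id", "current_balance"),
    "customers": ("customer_id", None),
}

for table, (key_col, amt_col) in CHECKS.items():
    cols = f"COUNT({key_col})" + (f", SUM({amt_col})" if amt_col else "")
    src_row = src.cursor().execute(f"SELECT {cols} FROM {table}").fetchone()
    tgt_row = tgt.cursor().execute(f"SELECT {cols} FROM {table}").fetchone()
    status = "OK" if tuple(src_row) == tuple(tgt_row) else "MISMATCH"
    print(f"{table}: source={tuple(src_row)} target={tuple(tgt_row)} -> {status}")
```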

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site


You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.
Requisition ID: R-10358184. Date posted: 06/11/2025. End date: 06/26/2025. City: Pune. State/Region: Maharashtra. Country: India. Location type: Onsite.
Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
Job Title: Specialist, Data Architecture
What does a successful Professional, Data Conversions do at Fiserv? A Conversion Professional is responsible for the timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping in data to support project initiatives for new and existing banks. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines.
What will you do: A Conversion Professional is responsible for the timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. The person stepping in as the backup would need to review the specification history, then review and understand the code being developed to resolve the issue or change. This would also have to occur on the switch back to the original developer. Today, the associate handling the project would log back in to support the effort and address the issue or change.
What you will need to have:
- Bachelor's degree in programming or a related field
- Minimum 3 years' relevant experience in data processing (ETL) conversions or the financial services industry
- 3-5 years' experience and strong knowledge of MS SQL/PSQL, MS SSIS, and data warehousing concepts
- Strong communication skills and the ability to provide technical information to non-technical colleagues
- Team player with the ability to work independently
- Experience in the full software development life cycle using agile methodologies
- Good understanding of Agile methodologies and the ability to handle agile ceremonies
- Efficient in reviewing, coding, testing, and debugging of application/bank programs
- Able to work under pressure while resolving critical issues in the production environment
- Good communication skills and experience working with clients
- Good understanding of the banking domain
What would be great to have:
- Experience with Informatica, Power BI, MS Visual Basic, Microsoft Access, and Microsoft Excel
- Experience with card management systems; debit card processing is a plus
- Ability to manage and prioritize a work queue across multiple workstreams
- Highest attention to detail and accuracy
Thank you for considering employment with Fiserv. Please: apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site


You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.
Requisition ID: R-10358162. Date posted: 06/11/2025. End date: 06/30/2025. City: Pune. State/Region: Maharashtra. Country: India. Location type: Onsite.
Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
Job Title: Specialist, Data Architecture
- Good hands-on experience with ETL and BI tools like SSIS, SSRS, Power BI, etc.
- Readiness to play an individual contributor role on the technical front
- Excellent communication skills
- Readiness to travel onsite for short terms, as required
- 3-5 years of solid ETL development experience, with hands-on experience in a migration or data warehousing project
- Strong database fundamentals and experience in writing unit test cases and test scenarios
- Expert knowledge in writing SQL commands, queries, and stored procedures
- Good knowledge of ETL tools like SSIS, Informatica, etc., and data warehousing concepts
- Good knowledge of writing macros
- Good client-handling skills, with onsite experience preferred
Thank you for considering employment with Fiserv. Please: apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site


You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.
Requisition ID: R-10358179. Date posted: 06/11/2025. End date: 07/15/2025. City: Noida. State/Region: Uttar Pradesh. Country: India. Location type: Onsite.
Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
Job Title: Specialist, Data Architecture
What does a successful Lead, Data Conversions do? A Conversion Lead is responsible for the timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Lead plays a critical role in mapping in data to support project initiatives for new and existing banks/clients. Leads provide a specialized service to the Project Manager teams—developing custom reporting, providing technical assistance, and ensuring project timelines are met. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines. The person stepping in as the backup would need to review the specification history, then review and understand the code being developed to resolve the issue or change. This would also have to occur on the switch back to the original developer. Today, the associate handling the project would log back in to support the effort and address the issue or change.
What you will need to have:
- Bachelor's degree in programming or a related field
- Working hours (IST): 12:00 p.m. – 9:00 p.m., Monday through Friday
- Minimum 8 years' relevant experience in data processing (ETL) conversions or the financial services industry
- 8-12 years' experience and strong knowledge of MS SQL/PSQL, MS SSIS, and data warehousing concepts
- Strong database fundamentals and expert knowledge in writing SQL commands, queries, and stored procedures
- Experience in performance tuning of complex SQL queries
- Experience in the full software development life cycle using agile methodologies
- Good understanding of Agile methodologies and the ability to handle agile ceremonies
- Efficient in reviewing, analyzing, coding, testing, and debugging of application programs
- Able to work under pressure while resolving critical issues in the production environment
- Strong communication skills and the ability to provide technical information to non-technical colleagues
- Ability to mentor junior team members
- Ability to manage and prioritize a work queue across multiple workstreams
- Team player with the ability to work independently
- Good communication skills and experience working with clients
- Good understanding of the banking domain
- Highest attention to detail and accuracy
What would be great to have:
- Experience with data modelling, Informatica, Power BI, MS Visual Basic, Microsoft Access, and Microsoft Excel
- Experience with card management systems; debit card processing is a plus
- Understanding of applications and related database features that can be leveraged to improve performance
- Experience creating testing artifacts (test cases, test plans) and knowledge of various testing types
Thank you for considering employment with Fiserv. Please: apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site


We are seeking a highly skilled and experienced Lead Data Engineer (7+ years) to join our dynamic team. As a Lead Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure. You will be responsible for ensuring the efficient and reliable collection, storage, and transformation of large-scale data to support business intelligence, analytics, and data-driven decision-making.
Key Responsibilities:
Data Architecture & Design: Lead the design and implementation of robust data architectures that support data warehousing (DWH), data integration, and analytics platforms. Develop and maintain ETL (Extract, Transform, Load) pipelines to ensure the efficient processing of large datasets.
ETL Development: Design, develop, and optimize ETL processes using tools like Informatica PowerCenter, Intelligent Data Management Cloud (IDMC), or custom Python scripts. Implement data transformation and cleansing processes to ensure data quality and consistency across the enterprise.
Data Warehouse Development: Build and maintain scalable data warehouse solutions using Snowflake, Databricks, Redshift, or similar technologies. Ensure efficient storage, retrieval, and processing of structured and semi-structured data.
Big Data & Cloud Technologies: Utilize AWS Glue and PySpark for large-scale data processing and transformation (see the sketch below). Implement and manage data pipelines using Apache Airflow for orchestration and scheduling. Leverage cloud platforms (AWS, Azure, GCP) for data storage, processing, and analytics.
Data Management & Governance: Establish and enforce data governance and security best practices. Ensure data integrity, accuracy, and availability across all data platforms. Implement monitoring and alerting systems to ensure data pipeline reliability.
Collaboration & Leadership: Work closely with data stewards, analysts, and business stakeholders to understand data requirements and deliver solutions that meet business needs. Mentor and guide junior data engineers, fostering a culture of continuous learning and development within the team. Lead data-related projects from inception to delivery, ensuring alignment with business objectives and timelines.
Database Management: Design and manage relational databases (RDBMS) to support transactional and analytical workloads. Optimize SQL queries for performance and scalability across various database platforms.
Required Skills & Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
Experience:
- Minimum of 7 years of experience in data engineering, ETL, and data warehouse development.
- Proven experience with ETL tools like Informatica PowerCenter or IDMC.
- Strong proficiency in Python and PySpark for data processing.
- Experience with cloud-based data platforms such as AWS Glue, Snowflake, Databricks, or Redshift.
- Hands-on experience with SQL and RDBMS platforms (e.g., Oracle, MySQL, PostgreSQL).
- Familiarity with data orchestration tools like Apache Airflow.
Technical Skills:
- Advanced knowledge of data warehousing concepts and best practices.
- Strong understanding of data modeling, schema design, and data governance.
- Proficiency in designing and implementing scalable ETL pipelines.
- Experience with cloud infrastructure (AWS, Azure, GCP) for data storage and processing.
Soft Skills:
- Excellent communication and collaboration skills.
- Ability to lead and mentor a team of engineers.
- Strong problem-solving and analytical thinking abilities.
- Ability to manage multiple projects and prioritize tasks effectively.
Preferred Qualifications:
- Experience with machine learning workflows and data science tools.
- Certification in AWS, Snowflake, Databricks, or relevant data engineering technologies.
- Experience with Agile methodologies and DevOps practices.
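The AWS Glue / PySpark responsibility above might look like the sketch below: a Glue job that reads from the Data Catalog, applies a simple cleansing transform, and writes partitioned Parquet. The database, table, and S3 path names are assumptions for illustration.

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (database/table names are placeholders).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="transactions"
)

# Cleanse in plain Spark: drop null keys, derive a partition date.
df = (dyf.toDF()
         .filter(F.col("txn_id").isNotNull())
         .withColumn("txn_date", F.to_date("txn_ts")))

# Write curated, partitioned Parquet for downstream warehouse loads.
(df.write.mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://curated-zone/transactions/"))

job.commit()
```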

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description: We are looking for a skilled and motivated Senior Data Engineer to join our data integration and analytics team. The ideal candidate will have hands-on experience with Informatica IICS, AWS Redshift, Python scripting, and Unix/Linux systems. You will be responsible for building and maintaining scalable ETL pipelines to support business intelligence and analytics needs. A strong passion for continuous learning, problem-solving, and enabling data-driven decision-making is highly valued.
Primary Skill: Informatica IICS.
Description: We are looking for a Senior Data Engineer to lead the design, development, and management of scalable data platforms and pipelines. This role demands a strong technical foundation in data architecture, big data technologies, and database systems (both SQL and NoSQL), along with the ability to work across functional teams to deliver robust, secure, and high-performing data solutions.
Role Responsibilities:
- Design, develop, and maintain end-to-end data pipelines and infrastructure.
- Translate business and functional requirements into scalable, well-documented technical solutions.
- Build and manage data flows across structured and unstructured data sources, including streaming and batch integrations.
- Ensure data integrity and quality through automated validations, unit testing, and robust documentation.
- Optimize data processing performance and manage large datasets efficiently.
- Collaborate closely with stakeholders and project teams to align data solutions with business objectives.
- Implement and maintain security and privacy protocols to ensure safe data handling.
- Lead development environment setup and configuration of tools and services.
- Mentor junior data engineers and contribute to continuous improvement and automation initiatives.
- Coordinate with QA and UAT teams during testing and release phases.
Role Requirements:
- Strong proficiency in SQL (including procedures, performance tuning, and analytical functions).
- Solid understanding of data warehousing concepts, including dimensional modeling and SCDs (see the sketch below).
- Hands-on experience with scripting languages (Shell/PowerShell).
- Familiarity with cloud and big data technologies.
- Experience working with relational and non-relational databases and data streaming systems.
- Proficiency in data profiling, validation, and testing practices.
- Excellent problem-solving, communication (written and verbal), and documentation skills.
- Exposure to Agile methodologies and CI/CD practices.
- Self-motivated, adaptable, and capable of working in a fast-paced environment.
Experience Requirements:
- 5+ years overall, with 3+ years of hands-on experience with Informatica IICS (Cloud Data Integration, Application Integration).
- Strong proficiency in AWS Redshift and writing complex SQL queries.
- Solid programming experience in Python for scripting, data wrangling, and automation.
- Experience with version control tools like Git and CI/CD workflows.
- Knowledge of data modeling and data warehousing concepts.
- Prior experience with data lakes and big data technologies is a plus.
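Since this role pairs SCDs with Redshift, here is a hedged sketch of the classic SCD Type 2 close-and-insert pattern, run from Python. All table and column names are invented, and redshift_connector is just one of several client options (psycopg2 also works against Redshift).

```python
import redshift_connector

conn = redshift_connector.connect(
    host="example-cluster.redshift.amazonaws.com",
    database="dwh", user="etl_user", password="***",
)
cur = conn.cursor()

# Step 1: close out current dimension rows whose tracked attribute changed.
cur.execute("""
    UPDATE dim_customer
    SET effective_to = CURRENT_DATE, is_current = FALSE
    FROM stg_customer s
    WHERE dim_customer.customer_id = s.customer_id
      AND dim_customer.is_current
      AND dim_customer.address <> s.address
""")

# Step 2: insert a new current row for changed and brand-new customers
# (changed rows were just closed, so they no longer have a current version).
cur.execute("""
    INSERT INTO dim_customer
        (customer_id, address, effective_from, effective_to, is_current)
    SELECT s.customer_id, s.address, CURRENT_DATE, '9999-12-31', TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current
    WHERE d.customer_id IS NULL
""")

conn.commit()
```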

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Data Analysis and Assessment:
- Conduct thorough data analysis of source systems to understand data structures, quality, and dependencies.
- Identify data quality issues and develop strategies to cleanse and standardize data before migration.
- Create data profiling reports to identify potential data migration challenges (see the sketch below).
Migration Design and Architecture:
- Design comprehensive data migration plans, including data mapping, transformation rules, and loading procedures.
- Develop data migration architecture considering source and target systems, data volumes, and performance requirements.
- Select appropriate methods and patterns based on project needs.
Data Mapping and Transformation:
- Create detailed data mapping documents to define how data will be transformed and translated between source and target systems.
- Develop data cleansing and transformation logic to ensure data quality in the target system.
- Design data validation rules to identify and address data inconsistencies.
Testing and Validation:
- Work with the testers to develop and execute comprehensive data migration test plans, including unit testing, integration testing, and user acceptance testing.
- Work with the testing and development teams to resolve defects.
Stakeholder Management:
- Collaborate with business stakeholders to understand data requirements and migration objectives.
- Communicate data migration plans and progress updates to relevant stakeholders.
- Address concerns and provide technical guidance throughout the migration process.
Required Skills and Qualifications:
- Knowledge of computer technology, network infrastructure, systems and applications, security, and storage.
- Intermediate knowledge of and experience with the Microsoft Office Suite, with proficiency in Excel.
- Intermediate knowledge of and experience with Informatica ILM, AWS, Ab Initio, and database (SQL/NoSQL) concepts.
- Ability to collaborate with, engage, and manage resources outside of the Data Centricity team.
- General conceptual understanding of programming and DB querying.
- Ability to work collaboratively with cross-functional teams.
- Prior knowledge of Agile project management tools, such as Jira.
- Ability to work effectively with internal and external IT support, senior leadership, project teams, and individuals.
- Ability to perform in a dynamic project management environment.
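The "data profiling reports" deliverable named above can start as simply as the sketch below: per-column null rates, cardinality, and duplicate-key counts on a source extract. The file name and the customer_id key column are placeholders.

```python
import pandas as pd

df = pd.read_csv("source_extract.csv")  # placeholder source extract

# Per-column profile: type, null percentage, cardinality, and a sample value.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
    "sample": df.apply(
        lambda col: col.dropna().iloc[0] if col.notna().any() else None
    ),
})
print(profile)

# Flag a typical migration blocker: duplicate business keys (assumed key column).
dupes = df[df.duplicated(subset=["customer_id"], keep=False)]
print(f"{len(dupes)} rows share a duplicated customer_id")
```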

Posted 1 week ago

Apply

5.0 years

0 Lacs

Jaipur, Rajasthan, India

Remote


Company Description: At Ekloud Inc., we are a forward-thinking technology company specializing in technology consulting, contract staffing, and contingent workforce solutions. Our team consists of industry experts committed to delivering excellence across a broad spectrum of services. Whether you're a startup aiming to scale or an enterprise in need of specialized expertise, Ekloud is your trusted partner for success.
Role Description: We are seeking a skilled Informatica MDM Developer for a remote contract position. In this role, you will be responsible for delivering high-quality Master Data Management solutions, ensuring data integrity, and supporting various data integration initiatives.
Key Responsibilities:
- Develop and support Informatica MDM solutions across multiple domains.
- Handle ETL processes, including Extract, Transform, and Load operations.
- Implement and maintain data quality, data integration, and data modeling frameworks.
- Work closely with business and technical stakeholders to understand data requirements and deliver scalable solutions.
- Troubleshoot and optimize MDM workflows and performance.
- Participate in Agile/Scrum ceremonies and contribute to continuous improvement initiatives.
Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of hands-on experience with Informatica MDM.
- Strong understanding of Master Data Management, data governance, and data modeling concepts.
- Proficiency in SQL/PLSQL and working knowledge of relational databases such as Oracle and SQL Server.
- Experience with data quality, ETL development, and data integration tools.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Prior experience in Agile/Scrum environments is a plus.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Sadar, Uttar Pradesh, India

On-site


About The Company

Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators, and is the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across all emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backends such as Java Spring Boot and Node.js, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa Tech Consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, Veersa also provides product engineering expertise to a few clients in Australia and Singapore.

About The Role

We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.

Responsibilities
- Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter and IICS, and MDM solutions using Informatica MDM.
- Develop and maintain batch and real-time data integration workflows.
- Collaborate with data architects, business analysts, and stakeholders to understand data requirements.
- Perform data profiling, data quality assessments, and master data matching/merging.
- Implement governance, stewardship, and metadata management practices.
- Optimize the performance of Informatica MDM Hub, IDD, and associated components.
- Write complex SQL queries and stored procedures as needed.

Senior Software Engineer Additional Responsibilities
- Lead design discussions and code reviews; mentor junior engineers.
- Architect scalable data integration solutions using Informatica and complementary tools.
- Drive adoption of best practices in data modeling, governance, and engineering.
- Work closely with cross-functional teams to shape the data strategy.

Required Qualifications

Software Engineer:
- Bachelor's degree in Computer Science, Information Systems, or related field.
- 2 to 4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules).
- Strong SQL and data modeling skills.
- Familiarity with ETL concepts, REST APIs, and data integration tools.
- Understanding of data governance and quality frameworks.

Senior Software Engineer:
- Bachelor's or Master's in Computer Science, Data Engineering, or related field.
- 4+ years of experience in Informatica MDM, with at least 2 years in a lead role.
- Proven track record of designing scalable MDM solutions in large-scale environments.
- Strong leadership, communication, and stakeholder management skills.
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice To Have)
- Experience with other Informatica products (IDQ, PowerCenter).
- Exposure to cloud MDM platforms or cloud data integration tools.
- Agile/Scrum development experience.
- Knowledge of industry-standard data security and compliance practices.
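The matching/merging and survivorship work this listing describes is normally configured inside Informatica MDM rather than hand-coded, but as a rough illustration of the concepts, here is a minimal Python sketch of deterministic matching plus a recency-based survivorship rule; the field names, match key, and rule are illustrative assumptions, not anything from the posting.

```python
# Illustrative sketch only: deterministic match on an exact key, then a
# "most recent update wins" survivorship rule to pick the golden record.
from collections import defaultdict

records = [  # hypothetical source records from two systems
    {"id": 1, "email": "a@x.com", "name": "A. Kumar", "updated": "2024-05-01", "source": "CRM"},
    {"id": 2, "email": "A@X.com", "name": "Arun Kumar", "updated": "2024-06-10", "source": "ERP"},
    {"id": 3, "email": "b@y.com", "name": "B. Singh", "updated": "2024-01-15", "source": "CRM"},
]

# Match: group records sharing the same normalized email address.
groups = defaultdict(list)
for rec in records:
    groups[rec["email"].strip().lower()].append(rec)

# Survivorship: within each match group, the most recently updated
# record survives as the golden record (ISO dates compare as strings).
golden = [max(grp, key=lambda r: r["updated"]) for grp in groups.values()]
for g in golden:
    print(g)  # -> Arun Kumar (ERP) and B. Singh (CRM)
```

A real implementation would layer fuzzy matching and per-attribute survivorship on top of this, which is what an MDM hub's match/merge configuration expresses declaratively.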

Posted 1 week ago

Apply

11.0 - 16.0 years

10 - 20 Lacs

Noida, Greater Noida, Delhi / NCR

Work from Office


Data Architect
Department: Data & Analytics

The Data Architect, with more than 14 years of experience, will play a pivotal role in designing, developing, and governing scalable data architectures to support enterprise-wide data integration, analytics, and reporting. This role will focus on creating unified data models, optimizing data pipelines, and ensuring compliance with regulatory standards (GDPR) using cloud-based platforms. The ideal candidate is a strategic thinker with deep expertise in data modeling, cloud data platforms, and governance.

Key Responsibilities:
- Design and implement logical and physical data models to support diverse data sources (e.g., relational databases).
- Develop scalable architectures integrating data lakes, data warehouses, and master data management (MDM) solutions to create unified views (e.g., customer 360).
- Leverage services to build ETL/ELT pipelines and ensure data consistency.
- Establish data governance frameworks using data governance and catalog tools to ensure metadata management, lineage tracking, and data discoverability.
- Design models and processes to comply with regulatory requirements (e.g., GDPR, HIPAA), including encryption, data masking, and access controls.
- Define and enforce data quality standards through profiling, cleansing, and validation.
- Architect solutions to handle high-volume, high-velocity data environments, leveraging cloud platforms with auto-scaling capabilities.
- Optimize data models and pipelines for query performance, using indexing, denormalization, and caching strategies.
- Partner with data engineers, analysts, and business stakeholders to translate requirements into technical designs.

Mandatory Skills:
- Data warehousing
- Data modelling, data migration projects
- ETL tools (SSIS, Informatica)
- SQL scripting

Connect: Aarushi.Shukla@coforge.com

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 9 to 11 years' experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills
- An inclination to mentor; an ability to lead and deliver medium sized components independently

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Expertise around data warehousing concepts, relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
- Data Governance: A strong grasp of principles and practice including data quality, security, privacy and compliance

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Experience of using a job scheduler, e.g., Autosys. Exposure to Business Intelligence tools, e.g., Tableau, Power BI

Certification on any one or more of the above topics would be an advantage.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Digital Software Engineering
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
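To give a flavor of the pipeline-building and data-quality skills the posting asks for, here is a minimal Python sketch of a validation step that routes bad rows to a reject list, a common pattern in batch ETL; the schema and rules are invented for illustration and are not part of the role.

```python
# Illustrative sketch: validate a batch before loading, collecting rule
# violations per record so rejects can be reported and reprocessed.
from datetime import datetime

def validate(record: dict) -> list:
    """Return the list of rule violations for one record (empty = clean)."""
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    try:
        datetime.strptime(record.get("txn_date", ""), "%Y-%m-%d")
    except ValueError:
        errors.append("bad txn_date")
    if record.get("amount", 0) < 0:
        errors.append("negative amount")
    return errors

batch = [
    {"customer_id": "C1", "txn_date": "2024-06-01", "amount": 120.0},
    {"customer_id": "", "txn_date": "2024/06/01", "amount": -5.0},
]
clean = [r for r in batch if not validate(r)]
rejects = [(r, validate(r)) for r in batch if validate(r)]
print(len(clean), "clean;", len(rejects), "rejected")  # 1 clean; 1 rejected
```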

Posted 1 week ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Role

The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data-driven decision making. A Mantas Scenario Developer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 5 to 8 years' experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Hands-on Mantas (Oracle FCCM) expertise throughout the full development life cycle, including requirements analysis, functional design, technical design, programming, testing, documentation, implementation, and ongoing technical support
- Ability to translate business needs (BRD) into effective technical solutions and documents (FRD/TSD)
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Mantas: Expert in Oracle Mantas/FCCM, Scenario Manager, and scenario development; thorough knowledge and hands-on experience in Mantas FSDM, DIS, and Batch Scenario Manager
- Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of data warehousing concepts, relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of a job scheduler like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Digital Software Engineering
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply


15.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India, and as such all normal working days must be carried out in India.

Job Description

Join us as a Data & Analytics Analyst
- Take on a new challenge in Data & Analytics and help us shape the future of our business
- You'll take accountability for the analysis of complex data to identify business issues and opportunities, and support the delivery of high quality business solutions
- We're committed to mapping a career path that works for you, with a focus on helping you build new skills and engage with the latest ideas and technologies in data analytics
- We're offering this role at vice president level

What you'll do

As a Data & Analytics Analyst, you'll be driving the production of high quality analytical input to support the development and implementation of innovative processes and problem resolution. You'll be capturing, validating and documenting business and data requirements, making sure they are in line with key strategic principles. We'll look to you to interrogate, interpret and visualise large volumes of data to identify, support and challenge business opportunities and identify solutions.

You'll also be:
- Performing data extraction, storage, manipulation, processing and analysis
- Conducting and supporting options analysis, identifying the most appropriate solution
- Accountable for the full traceability and linkage of business requirements to analytics outputs
- Seeking opportunities to challenge and improve current business processes, ensuring the best result for the customer
- Creating and executing quality assurance at various stages of the project in order to validate the analysis, ensure data quality, identify data inconsistencies, and resolve them as needed

We also value a strong sense of ownership with a focus on delivering high-quality outcomes, exceptional attention to detail, an emphasis on the measurable outcomes and impact of work, and expertise in data analytics and reporting.

The skills you'll need

You'll need a background in business analysis tools and techniques, along with the ability to influence through communications tailored to a specific audience. Additionally, you'll need the ability to use core technical skills.

You'll also demonstrate:
- Clear and effective communication
- Proficiency in SQL, and tools such as Excel and Power BI
- Experience in Informatica, Snowflake or others
- Responsibility for performance metrics and data solutions across the entire data architecture team
- Skill in data visualization, report generation and presentation to both technical and business audiences
- Over 15 years of professional experience

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Pune

Work from Office


Role Purpose

The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do

Provide adequate support in architecture planning, migration, and installation for new projects in own tower (platform/database/middleware/backup)
- Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution
- Conduct technology capacity planning by reviewing the current and future requirements
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
- Strategize and implement disaster recovery plans, and create and implement backup and recovery plans

Manage the day-to-day operations of the tower
- Manage day-to-day operations by troubleshooting any issues, conducting root cause analysis (RCA), and developing fixes to avoid similar issues
- Plan for and manage upgrades, migration, maintenance, backup, installation, and configuration functions for own tower
- Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges
- Develop a shift roster for the team to ensure no disruption in the tower
- Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities w.r.t. progress, updates, status, and next steps
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

Team Management

Resourcing
- Forecast talent requirements as per the current and future business needs
- Hire adequate and right resources for the team
- Train direct reportees to make right recruitment and selection decisions

Talent Management
- Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness
- Build an internal talent pool of HiPos and ensure their career progression within the organization
- Promote diversity in leadership positions

Performance Management
- Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports
- Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs, both at their level and the levels below

Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team
- Track team satisfaction scores and identify initiatives to build engagement within the team
- Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
- Exercise employee recognition and appreciation

Deliver

No. | Performance Parameter | Measure
1 | Operations of the tower | SLA adherence, knowledge management, CSAT/Customer

Posted 1 week ago

Apply

5.0 - 7.0 years

18 - 30 Lacs

Bengaluru

Hybrid


About the Company

Greetings from Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd.

About the Role

We are hiring an IDMC Manager.
Location: Bangalore
Work Model: Hybrid
Experience: 5-7 years
Notice Period: Immediate to 15 days

Job Description

JD for IDMC Manager: 8+ years of experience in MDM development, with at least 2 years on the Informatica IDMC platform.

Key Responsibilities:
- Lead the development and implementation of MDM solutions on the Informatica IDMC platform.
- Design and configure Business 360 (B360), Customer 360 (C360), Product 360 (P360), and Reference 360 (R360) solutions.
- Implement match/merge logic, survivorship rules, business validations, and workflow orchestration using Cloud Application Integration (CAI).
- Configure Data Quality (DQ) services and perform data profiling to support cleansing and validation processes.
- Integrate MDM with enterprise systems such as ERP, CRM, and data warehouses using IDMC's standard connectors, APIs, and real-time integration patterns.
- Collaborate with data architects, business stakeholders, and project teams to gather requirements and translate them into scalable MDM solutions.
- Utilize data modeling best practices to design and maintain golden records for domains like Customer and Product.
- Work with real-time REST APIs exposed by Informatica for operations such as search, create, update, and delete.
- Provide leadership in solution design reviews, troubleshooting, performance tuning, and production support.

JD for IDMC Developer: 4+ years of experience in MDM development, with at least 1 year on the Informatica IDMC platform.

Key Responsibilities:
- Develop and implement Master Data Management (MDM) solutions on the Informatica IDMC platform.
- Configure and maintain IDMC components such as Customer 360 (C360), Product 360 (P360), Business 360 (B360), and Reference 360 (R360).
- Design and implement match rules, business rules, survivorship rules, and data validations.
- Build and orchestrate workflows using Cloud Application Integration (CAI).
- Perform data profiling and implement data quality (DQ) rules and transformations.
- Develop data pipelines and workflows using the IDMC Data Integration service.
- Integrate IDMC MDM with external systems like ERP, CRM, and data lakes using connectors and APIs.
- Use real-time REST APIs provided by IDMC for operations such as search, create, update, and delete.
- Collaborate with data architects, analysts, and business users to gather requirements and deliver scalable solutions.

Additional Information:
Mandatory Skills: IDMC MDM
Nice-to-have skills: CDQ, IDQ
Interview Mode: Virtual

If you are interested in this position, please send your resume to netra.s@twsol.com.
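Both JDs mention driving create/search/update/delete operations through real-time REST APIs. As a rough sketch of what such a call looks like from client code, here is a hypothetical Python example; the base URL, path, payload shape, and authentication scheme are invented placeholders, not documented IDMC endpoints, so the vendor's API reference is the authority here.

```python
# Hypothetical sketch of a REST "search" call against an MDM service.
# BASE_URL and the /customers/search path are invented placeholders.
import requests

BASE_URL = "https://mdm.example.com/api/v1"  # placeholder, not a real IDMC URL

def search_customers(token: str, last_name: str) -> list:
    """POST a simple filter query and return the matching records."""
    resp = requests.post(
        f"{BASE_URL}/customers/search",
        headers={"Authorization": f"Bearer {token}"},
        json={"filter": {"lastName": last_name}, "limit": 10},
        timeout=30,
    )
    resp.raise_for_status()  # surface HTTP errors instead of failing silently
    return resp.json().get("records", [])

# Usage (requires a live service and a valid token):
# for rec in search_customers("my-token", "Kumar"):
#     print(rec)
```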

Posted 1 week ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Role Purpose

The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do

Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver

No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance

Mandatory Skills: Snowflake.

Posted 1 week ago

Apply

6.0 - 9.0 years

10 - 20 Lacs

Noida, Greater Noida, Delhi / NCR

Work from Office


Key Responsibilities

Experience: 6-8 years
- Design, develop, and maintain data warehouse solutions to support business reporting and analytics.
- Implement ETL processes using SSIS and Informatica to extract, transform, and load data efficiently.
- Optimize data pipelines for performance, scalability, and reliability.
- Collaborate with cross-functional teams to understand data requirements and ensure data integrity.
- Perform data modelling, schema design, and database optimization.
- Troubleshoot and resolve data-related issues, ensuring high availability and accuracy.
- Maintain documentation for data processes, workflows, and best practices.

Mandatory Skills: SSIS, SQL, ETL, Data warehousing

Posted 1 week ago

Apply

2.0 - 7.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Req ID: 326254

NTT DATA Services is currently seeking an Oracle FCCM/OFSAA Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).
- Oracle Mantas/FCCM (Financial Crime and Compliance) / OFSAA (Oracle Financial Services Analytical Applications) experience is good to have
- FCCM configuration
- 2+ years of Python development experience
- Unix, PL/SQL
- Mantas scenario development

Posted 1 week ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced) (see the sketch after this list)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
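For the slowly changing dimensions question flagged above, the logic is usually built in Informatica with Lookup, Expression, and Update Strategy transformations rather than hand-written code, but as a minimal illustration of what SCD Type 2 does (expire the current row, append a new versioned row), here is a Python sketch; the dimension layout and tracked attribute are invented for the example.

```python
# Illustrative SCD Type 2 sketch: when a tracked attribute changes,
# close out the current dimension row and append a new current row.
from datetime import date

dim = [  # hypothetical customer dimension with one current row
    {"key": 1, "cust_id": "C1", "city": "Pune",
     "start": date(2023, 1, 1), "end": None, "current": True},
]

def apply_scd2(dim_rows, incoming, today=date(2024, 6, 1)):
    # Simplification: assumes the customer already has a current row.
    cur = next(r for r in dim_rows
               if r["cust_id"] == incoming["cust_id"] and r["current"])
    if cur["city"] != incoming["city"]:            # tracked attribute changed
        cur["end"], cur["current"] = today, False  # expire the old version
        dim_rows.append({"key": max(r["key"] for r in dim_rows) + 1,
                         "cust_id": incoming["cust_id"],
                         "city": incoming["city"],
                         "start": today, "end": None, "current": True})
    return dim_rows

apply_scd2(dim, {"cust_id": "C1", "city": "Hyderabad"})
for row in dim:
    print(row)  # Pune row expired; Hyderabad row is now current
```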

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!
