
2331 Informatica Jobs - Page 21

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 4.0 years

4 - 9 Lacs

Hyderabad

Work from Office

Source: Naukri

Greetings from Infosys BPM Ltd. Exclusive Women's Walk-in Drive: we are hiring for MIM, Content and Technical Writer, and Informatica skills. Please walk in for an interview on 16th June 2025 at our Hyderabad location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Use the link below to apply and register your application, and mention your Candidate ID on top of your resume: https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215115

Interview details
Interview Date: 16th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: Hyderabad :: Infosys STP, Madhava Reddy Colony, Near Wipro Circle, Gachibowli, Hyderabad 500032

Work from Office. A minimum of 2 years of project experience is mandatory.

Job Description: MIM
- Strong knowledge of IT service management, including ITIL.
- Respond to reported incidents, identify the cause, and initiate the incident management process.
- Participate in root cause analysis meetings, gather lessons learned, and manage and implement continuous improvement processes.
- Ensure client SLAs/KPIs and customer satisfaction expectations are achieved.
- Restore a failed IT service as quickly as possible.

Job Description: Content and Technical Writer
- Develop high-quality technical documents, including user manuals, guides, and release notes.
- Collaborate with cross-functional teams to gather requirements and create accurate documentation.
- Conduct functional and manual testing to ensure compliance with FDA regulations.
- Ensure adherence to ISO standards and maintain a clean, organized document management system.
- Strong understanding of the Infra domain; able to convert complex technical concepts into easy-to-consume documents for the targeted audience. Will also mentor the team on technical writing.

Job Description: Informatica
- Strong experience in ETL development using Informatica and SQL.
- Design, develop, and maintain ETL workflows using Informatica PowerCenter.
- Perform data extraction, transformation, and loading (ETL) from various sources into data warehouses.
- Develop and optimize SQL queries, stored procedures, views, and indexing for performance tuning.
- Ensure data integrity, consistency, and accuracy in all ETL processes.
- Monitor, debug, and troubleshoot ETL failures and performance bottlenecks.

REGISTRATION PROCESS: The Candidate ID and SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. (Talents without registration and assessment will not be allowed into the interview.)

Candidate ID registration:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration: This assessment is proctored, and talent is evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, make a note of the AMCAT ID (access your AMCAT ID by clicking the three dots in the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
- Personal details: name, email address, mobile number, PAN number.
- Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
- Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
- Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
- Interview mode: walk-in.

Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more screen toggles, multiple faces detected, face not detected, or any other malpractice will be treated as a rejection. Once you've finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to carry:
- A note of your Candidate ID and AMCAT ID, along with the registered email ID.
- 2 sets of your updated resume/CV (hard copy).
- Original ID proof for security clearance.
- Individual headphones/Bluetooth headset for the interview.

Pointers to note:
- Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
- An original government ID card is a must for security clearance.

Regards, Infosys BPM Recruitment team.
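
The Informatica role above asks for SQL performance tuning through query optimization and indexing. As a hedged illustration (table and column names are hypothetical; PostgreSQL-style syntax), one common tuning step is rewriting a non-sargable filter so an index can be used:

```sql
-- Hypothetical example: tuning a slow date filter in an ETL job.
-- Non-sargable: the function call on order_date defeats any index.
SELECT order_id, customer_id, amount
FROM   orders
WHERE  DATE_TRUNC('day', order_date) = DATE '2025-06-01';

-- Sargable rewrite: a half-open range lets the optimizer use an index.
SELECT order_id, customer_id, amount
FROM   orders
WHERE  order_date >= DATE '2025-06-01'
AND    order_date <  DATE '2025-06-02';

-- Supporting composite index that also covers the common join key.
CREATE INDEX idx_orders_order_date
    ON orders (order_date, customer_id);
```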

Posted 1 week ago

Apply

5.0 - 8.0 years

11 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Key Responsibilities:
- Design, develop, and maintain ETL processes using tools such as Talend, Informatica, SSIS, or similar.
- Extract data from various sources, including databases, APIs, and flat files, transforming it to meet business requirements.
- Load transformed data into target systems while ensuring data integrity and accuracy.
- Collaborate with data analysts and business stakeholders to understand data needs and requirements.
- Optimize ETL processes for enhanced performance and efficiency.
- Debug and troubleshoot ETL jobs, providing effective solutions to data-related issues.
- Document ETL processes, data models, and workflows for future reference and team collaboration.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience in ETL development and data integration.
- Experience with Big Data technologies such as Hadoop or Spark.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud and their ETL services.
- Familiarity with data visualization tools such as Tableau or Power BI.
- Hands-on experience with Snowflake for data warehousing and analytics.
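
A core responsibility here is loading transformed data into target systems without compromising integrity. A minimal sketch of that load step, assuming hypothetical staging and warehouse tables and an ANSI-style MERGE (supported by Oracle, SQL Server, and Snowflake, among others), is an idempotent upsert so reruns don't duplicate rows:

```sql
-- Hypothetical staging-to-warehouse upsert; all names are illustrative.
MERGE INTO dw.dim_customer AS tgt
USING stg.customer_extract AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
    UPDATE SET tgt.full_name  = src.full_name,
               tgt.email      = src.email,
               tgt.updated_at = CURRENT_TIMESTAMP
WHEN NOT MATCHED THEN
    INSERT (customer_id, full_name, email, updated_at)
    VALUES (src.customer_id, src.full_name, src.email, CURRENT_TIMESTAMP);
```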

Posted 1 week ago

Apply

6.0 - 8.0 years

19 - 25 Lacs

Noida, Chennai, Bengaluru

Hybrid

Source: Naukri

We are seeking a highly skilled and experienced Senior ETL & Reporting QA Analyst to join our dynamic team. The ideal candidate will bring strong expertise in ETL and report testing, a solid command of SQL, and hands-on experience with Informatica and BI reporting tools. A strong understanding of the insurance domain is crucial to this role. This position will be instrumental in ensuring the accuracy, reliability, and performance of our data pipelines and reporting solutions.

Key Responsibilities:
- Design, develop, and execute detailed test plans and test cases for ETL processes, data migration, and data warehousing solutions.
- Perform data validation and data reconciliation using complex SQL queries across various source and target systems.
- Validate Informatica ETL workflows and mappings to ensure accurate data transformation and loading.
- Conduct end-to-end report testing and dashboard validations using Cognos (preferred) or comparable BI tools such as Tableau or Power BI.
- Collaborate with cross-functional teams, including business analysts, developers, and data engineers, to understand business requirements and transform them into comprehensive test strategies.
- Identify, log, and track defects to closure using test management tools, and actively participate in defect triage meetings.
- Maintain and enhance test automation scripts and frameworks where applicable.
- Ensure data integrity, consistency, and compliance across reporting environments, particularly in the insurance domain context.
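
Source-to-target reconciliation with SQL is the heart of this role. A hedged sketch of what such a check might look like (hypothetical insurance tables; ANSI SQL, noting Oracle spells EXCEPT as MINUS):

```sql
-- Hypothetical source-vs-target reconciliation for an ETL QA pass.
-- 1) Control totals: row counts and a summed measure should match.
SELECT 'source' AS side, COUNT(*) AS row_cnt, SUM(premium_amt) AS total_premium
FROM   src.policy
UNION ALL
SELECT 'target', COUNT(*), SUM(premium_amt)
FROM   dw.fact_policy;

-- 2) Row-level drill-down: rows present in source but missing
--    or changed in the target after the load.
SELECT policy_id, premium_amt FROM src.policy
EXCEPT
SELECT policy_id, premium_amt FROM dw.fact_policy;
```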

Posted 1 week ago

Apply

0.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Source: Indeed

About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.

Responsibilities:
- Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM.
- Develop and maintain batch and real-time data integration workflows.
- Collaborate with data architects, business analysts, and stakeholders to understand data requirements.
- Perform data profiling, data quality assessments, and master data matching/merging.
- Implement governance, stewardship, and metadata management practices.
- Optimize the performance of Informatica MDM Hub, IDD, and associated components.
- Write complex SQL queries and stored procedures as needed.

Senior Software Engineer - Additional Responsibilities:
- Lead design discussions and code reviews; mentor junior engineers.
- Architect scalable data integration solutions using Informatica and complementary tools.
- Drive adoption of best practices in data modeling, governance, and engineering.
- Work closely with cross-functional teams to shape the data strategy.

Required Qualifications:
Software Engineer:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2-4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules).
- Strong SQL and data modeling skills.
- Familiarity with ETL concepts, REST APIs, and data integration tools.
- Understanding of data governance and quality frameworks.
Senior Software Engineer:
- Bachelor's or Master's in Computer Science, Data Engineering, or a related field.
- 4+ years of experience in Informatica MDM, with at least 2 years in a lead role.
- Proven track record of designing scalable MDM solutions in large-scale environments.
- Strong leadership, communication, and stakeholder management skills.
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice to Have):
- Experience with other Informatica products (IDQ, PowerCenter).
- Exposure to cloud MDM platforms or cloud data integration tools.
- Agile/Scrum development experience.
- Knowledge of industry-standard data security and compliance practices.

Job Type: Full-time
Pay: ₹219,797.43 - ₹1,253,040.32 per year
Benefits: Health insurance
Schedule: Day shift
Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s): Are you willing to start immediately? (Preferred)
Experience: Data warehouse: 3 years (Required); Informatica MDM: 3 years (Required)
Work Location: In person
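
The responsibilities above include master data matching/merging. Informatica MDM handles match rules and survivorship internally, but a hedged SQL sketch (hypothetical tables, columns, and a precomputed match_group_id) shows the underlying idea of selecting one golden record per match group:

```sql
-- Hypothetical golden-record selection after match/merge grouping.
-- Assumes an upstream match process assigned match_group_id.
WITH ranked AS (
    SELECT customer_id,
           match_group_id,
           full_name,
           email,
           ROW_NUMBER() OVER (
               PARTITION BY match_group_id
               -- survivorship rule: prefer the most recently verified
               -- record, then the most trusted source
               ORDER BY last_verified_dt DESC, source_rank ASC
           ) AS rn
    FROM   mdm.customer_candidates
)
SELECT customer_id, match_group_id, full_name, email
FROM   ranked
WHERE  rn = 1;   -- one surviving "golden" record per match group
```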

Posted 1 week ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description

Are you passionate about data? Does the prospect of dealing with massive volumes of data excite you? Do you want to build data engineering solutions that process billions of records a day in a scalable fashion using AWS technologies? Do you want to create the next-generation tools for intuitive data access? If so, Amazon Finance Technology (FinTech) is for you!

FinTech is seeking a Data Engineer to join the team that is shaping the future of the finance data platform. The team is committed to building the next-generation big data platform that will be one of the world's largest finance data warehouses, supporting Amazon's rapidly growing and dynamic businesses, and to using it to deliver BI applications that have an immediate influence on day-to-day decision making. Amazon has a culture of data-driven decision-making and demands data that is timely, accurate, and actionable. Our platform serves Amazon's finance, tax, and accounting functions across the globe.

As a Data Engineer, you should be an expert in data warehousing technical components (e.g., data modeling, ETL, and reporting) and infrastructure (e.g., hardware and software) and their integration. You should have a deep understanding of the architecture for enterprise-level data warehouse solutions using multiple platforms (RDBMS, columnar, cloud). You should be an expert in the design, creation, management, and business use of large datasets. You should have excellent business and communication skills, enabling you to work with business owners to develop and define key business questions and to build data sets that answer those questions. The candidate is expected to build efficient, flexible, extensible, and scalable ETL and reporting solutions. You should be enthusiastic about learning new technologies and able to implement solutions using them to provide new functionality to users or to scale the existing platform. Excellent written and verbal communication skills are required, as the person will work very closely with diverse teams. Strong analytical skills are a plus. Above all, you should be passionate about working with huge data sets and love bringing datasets together to answer business questions and drive change.

Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional volumes and big data, enjoys the challenge of highly complex business contexts (that are typically being defined in real time), and, above all, is passionate about data and analytics. In this role you will be part of a team of engineers creating the world's largest financial data warehouses and BI tools for Amazon's expanding global footprint.

Key job responsibilities
- Design, implement, and support a platform providing secured access to large datasets.
- Interface with tax, finance, and accounting customers, gathering requirements and delivering complete BI solutions.
- Model data and metadata to support ad-hoc and pre-built reporting.
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Tune application and query performance using profiling tools and SQL.
- Analyze and solve problems at their root, stepping back to understand the broader context.
- Learn and understand a broad range of Amazon's data resources and know when, how, and which to use and which not to use.
- Keep up to date with advances in big data technologies and run pilots to design the data architecture to scale with the increased data volume using AWS.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
- Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.

Basic Qualifications
- Experience with SQL
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool like Informatica, ODI, SSIS, BODI, Datastage, etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2968106
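
This role leans on data modeling and ETL for a finance warehouse, where a standard pattern is the slowly changing dimension (type 2). A minimal hedged sketch with hypothetical tables, using PostgreSQL/Redshift-style UPDATE ... FROM syntax:

```sql
-- Hypothetical SCD type-2 maintenance for a customer dimension.
-- Step 1: close out the current version of any changed rows.
UPDATE dw.dim_customer AS d
SET    effective_to = CURRENT_DATE,
       is_current   = FALSE
FROM   stg.customer_extract AS s
WHERE  d.customer_id = s.customer_id
AND    d.is_current
AND    (d.segment <> s.segment OR d.country <> s.country);

-- Step 2: insert the new version of changed rows (plus brand-new
-- customers); anything with a surviving current row is unchanged.
INSERT INTO dw.dim_customer
       (customer_id, segment, country, effective_from, effective_to, is_current)
SELECT s.customer_id, s.segment, s.country,
       CURRENT_DATE, DATE '9999-12-31', TRUE
FROM   stg.customer_extract AS s
LEFT JOIN dw.dim_customer AS d
       ON d.customer_id = s.customer_id AND d.is_current
WHERE  d.customer_id IS NULL;
```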

Posted 1 week ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description

Do you have the technical skill to build BI solutions that process billions of rows a day using AWS technologies? Do you want to create next-generation tools for intuitive data access? Do you wake up in the middle of the night with new ideas that will benefit your customers? Are you persistent in bringing your ideas to fruition?

First things first, you know SQL and data modelling like the back of your hand. You also need to know Big Data and MPP systems. You have a history of coming up with innovative solutions to complex technical problems. You are a quick and willing learner of new technologies and have examples to prove your aptitude. You are not tool-centric; you determine what technology works best for the problem at hand and apply it accordingly. You can explain complex concepts to your non-technical customers in simple terms.

Key job responsibilities
- Work with SDE teams and business stakeholders to understand data requirements and design data ingress flows for the team.
- Lead the design, modeling, and implementation of large, evolving, structured, semi-structured, and unstructured datasets.
- Evaluate and implement efficient distributed storage and query techniques.
- Interact and integrate with internal and external teams and systems to extract, transform, and load data from a wide variety of sources.
- Implement robust and maintainable code with clear and maintained documentation.
- Implement test automation on code through unit testing and integration testing.
- Work in a tech stack that is a mix of NAWS services and legacy ETL tools within Amazon.

About The Team
The Data Insights, Metrics & Reporting team (DIMR) is the central data engineering team in the Amazon Warehousing & Distribution org, responsible mainly for four things:
- Building and maintaining data engineering and reporting infrastructure using NAWS to support internal/external data use-cases.
- Building data ingestion pipelines from any kind of upstream data source, including (but not limited to) real-time event streaming services, data lakes, and manual file uploads.
- Building mechanisms to vend data to internal team members or external sellers with the right data handling techniques in place.
- Building a robust data mart to support diverse use-cases powered by GenAI tooling.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool like Informatica, ODI, SSIS, BODI, Datastage, etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2970459
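
This team ingests data into a lake that is queried at scale. One common building block, sketched here in Hive/Spark SQL with hypothetical names, is a date-partitioned table so downstream queries prune to only the partitions they need:

```sql
-- Hypothetical date-partitioned lake table (Hive/Spark SQL dialect).
CREATE TABLE IF NOT EXISTS lake.order_events (
    event_id   STRING,
    seller_id  STRING,
    event_type STRING,
    payload    STRING
)
PARTITIONED BY (event_date DATE)
STORED AS PARQUET;

-- Loading a day of data lands it in exactly one partition...
INSERT OVERWRITE TABLE lake.order_events PARTITION (event_date = '2025-06-01')
SELECT event_id, seller_id, event_type, payload
FROM   staging.order_events_raw;

-- ...and queries that filter on event_date read only that partition.
SELECT event_type, COUNT(*)
FROM   lake.order_events
WHERE  event_date = DATE '2025-06-01'
GROUP  BY event_type;
```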

Posted 1 week ago

Apply

4.0 - 7.0 years

20 - 30 Lacs

Noida

Hybrid

Source: Naukri

Role & responsibilities
- Collaborate with customers' Business and IT teams to define and gather solution requirements for custom development, B2B/ETL/EAI, and cloud integration initiatives using the Adeptia Integration Platform.
- Analyze, interpret, and translate customer business needs into scalable and maintainable technical solution designs, aligned with best practices and the capabilities of the Adeptia platform.
- Storyboard and present solutions to customers and prospects, ensuring a clear understanding of proposed designs and technical workflows.
- Provide end-to-end project leadership, including planning, tracking deliverables, and coordinating efforts with offshore development teams as required.
- Review implementation designs and provide architectural guidance and best practices to the implementation team to ensure high-quality execution.
- Actively assist and mentor customers in configuring and implementing the Adeptia platform, ensuring alignment with technical and business objectives.
- Offer expert recommendations on design and configuration to ensure successful deployment and long-term maintainability of customer solutions.
- Define clear project requirements, create work breakdown structures, and establish realistic delivery timelines. Delegate tasks effectively, and manage progress against daily, weekly, and monthly targets, ensuring the team remains focused and productive.
- Serve as a liaison among customers, internal stakeholders, and offshore teams to maintain alignment, track progress, and ensure delivery meets both quality and timeline expectations.
- Monitor project baselines, identify and mitigate risks, and lead participation in all Agile ceremonies, including sprint grooming, planning, reviews, and retrospectives.
- Maintain a hands-on technical role, contributing to development activities and conducting detailed code reviews to ensure technical soundness and optimal performance.
- Take full ownership of assigned projects, driving them to successful, on-time delivery with high quality standards.

Preferred candidate profile
- Proven experience in designing and developing integration solutions involving Cloud/SaaS applications, APIs, SDKs, and legacy systems.
- Skilled in implementing SOA/EAI principles and integration patterns in B2B, ETL, EAI, and Cloud Integration using platforms such as Adeptia, Talend, MuleSoft, or similar tools.
- Good hands-on experience with Core Java (version 8+) and widely used Java frameworks, including Spring (version 6+) and Hibernate (version 6+).
- Proficient in SOA, RESTful and SOAP web services, and related technologies including JMS, SAAJ, JAXP, and XML technologies (XSD, XPath, XSLT, parsing).
- Strong command of SQL and RDBMS (e.g., Oracle, MySQL, PostgreSQL).
- Solid understanding of Enterprise Service Bus (ESB) concepts and messaging technologies such as Kafka and RabbitMQ.
- Familiar with transport protocols including HTTPS, Secure FTP, POP/IMAP/SMTP, and JDBC.
- Skilled in working with Windows and Linux operating systems, and experienced with application servers such as JBoss, Jetty, and Tomcat.
- Solid understanding of security best practices, including authentication, authorization, data encryption, and compliance frameworks relevant to enterprise integrations.
- Basic understanding of modern JavaScript frameworks such as React, with the ability to collaborate effectively on front-end and full-stack development scenarios.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
Job Title: Data Engineer
Location: Chennai
Candidate Specification: Any Graduate, minimum 6+ years of relevant experience.

The role involves spinning up and managing AWS data infrastructure and building data ingestion pipelines in StreamSets and EMR. Candidates must have working experience with a modern ETL tool (PySpark, EMR, Glue, or others).

Role: Data Engineer - Chennai
Employment Type: Full Time, Permanent
Key Skills: DBT, Snowflake, AWS, ETL, Informatica, EMR, PySpark, Glue
Job Code: GO/JC/046/2025

Posted 1 week ago

Apply

1.0 years

4 - 6 Lacs

Hyderābād

On-site

Source: Glassdoor

- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Business Data Technologies (BDT) makes it easier for teams across Amazon to produce, store, catalog, secure, move, and analyze data at massive scale. Our managed solutions combine standard AWS tooling, open-source products, and custom services to free teams from worrying about the complexities of operating at Amazon scale. This lets BDT customers move beyond the engineering and operational burden associated with managing and scaling platforms, and instead focus on scaling the value they can glean from their data, both for their customers and their teams.

We own one of the largest data lakes at Amazon, where thousands of Amazon teams can search, share, and store exabytes (EB) of data in a secure and seamless way; using our solutions, teams around the world can schedule and process millions of workloads on a daily basis. We provide enterprise solutions that focus on compliance, security, integrity, and cost efficiency of operating and managing EBs of Amazon data.

Key job responsibilities
CORE RESPONSIBILITIES:
- Be hands-on with ETL to build data pipelines to support automated reporting.
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources.
- Implement data structures using best practices in data modeling, ETL/ELT processes, SQL, and Redshift.
- Model data and metadata for ad-hoc and pre-built reporting.
- Interface with business customers, gathering requirements and delivering complete reporting solutions.
- Build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark.
- Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Participate in strategic and tactical planning discussions.

A day in the life
As a Data Engineer, you will be working with cross-functional partners from Science, Product, SDEs, Operations, and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions. Some of the key activities include:
- Crafting the data flow: design and build data pipelines, the backbone of our data ecosystem. Ensure the integrity of the data journey by implementing robust data quality checks and monitoring processes.
- Architecting for insights: translate complex business requirements into efficient data models that optimize data analysis and reporting. Automate data processing tasks to streamline workflows and improve efficiency.
- Becoming a data detective: ensuring data availability and performance.

Preferred qualifications:
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool like Informatica, ODI, SSIS, BODI, Datastage, etc.
- Knowledge of cloud services such as AWS or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
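
The core responsibilities above center on ETL pipelines that feed automated reporting on Redshift. A hedged sketch (hypothetical schema; Redshift-compatible SQL) of the kind of daily rollup such a pipeline might rebuild, using delete-then-insert so a rerun for the same day stays idempotent:

```sql
-- Hypothetical daily reporting rollup, rebuilt per run date.
DELETE FROM rpt.daily_sales_summary
WHERE  sales_date = DATE '2025-06-01';

INSERT INTO rpt.daily_sales_summary (sales_date, marketplace, orders, revenue)
SELECT sales_date,
       marketplace,
       COUNT(DISTINCT order_id) AS orders,
       SUM(item_amount)         AS revenue
FROM   dw.fact_order_items
WHERE  sales_date = DATE '2025-06-01'
GROUP  BY sales_date, marketplace;
```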

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world's leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal Customer and Employee experience.

About the role: In this opportunity as Application Support Analyst, you will:
- Provide Informatica support: the engineer will be responsible for supporting Informatica development, extractions, and loading, fixing data discrepancies, and taking care of performance monitoring.
- Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes.
- Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs.
- Bring a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management.
- Have experience in supporting applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure.

About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes:
- 3 to 8+ years of experience as an Informatica Developer in development and support, responsible for implementation of ETL methodology in data extraction, transformation, and loading.
- Knowledge of ETL design: designing new or changing mappings and workflows with the team and preparing technical specifications.
- Experience creating ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation.
- Designing and building integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Ability to perform source system analysis as required.
- Working with DBAs and data architects to plan and implement appropriate data partitioning strategies in the Enterprise Data Warehouse.
- Implementing versioning of the ETL repository and supporting code as necessary.
- Developing stored procedures, database triggers, and SQL queries where needed; implementing best practices and tuning SQL code for optimization.
- Loading data from SF Power Exchange to a relational database using Informatica.
- Working with XMLs, XML parser, Java, and HTTP transformations within Informatica.
- Experience integrating various data sources such as Oracle, SQL Server, DB2, and flat files in formats like fixed width, CSV, Salesforce, and Excel.
- In-depth knowledge of and experience in implementing best practices for the design and development of data warehouses using star schema and snowflake schema design concepts.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
- Support and development activities in a relational database environment: designing tables, procedures/functions, packages, triggers, and views in relational databases, and using SQL proficiently in database programming.

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting?
Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM.
- Strong understanding of data integration processes and methodologies.
- Experience with data quality management and data governance practices.
- Familiarity with database management systems and data modeling techniques.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 years

3 - 8 Lacs

Hyderābād

On-site

Source: Glassdoor

Applause is raising the bar for digital quality and employee experience. Recognized as a Top Workplace, Applause provides award-winning software testing and UX research solutions to top brands. Our fully managed services leverage a global team and the world's largest independent testing community. We improve digital experiences for global innovators like Google, Microsoft, PayPal, Starbucks, Vodafone, and BMW.

As a Business Intelligence Analyst you will be part of our global Data and Analytics team. This position will play a key role in maintaining and enhancing our enterprise business intelligence environment. This individual will form relationships with subject matter experts and business leaders across the company to help enhance business decisions and reporting capabilities through a strong data background. The right candidate will exhibit outstanding data understanding, a drive to learn new systems and business data, and the ability to thrive in a fast-paced, and sometimes ambiguous, work environment.

Key Responsibilities:
- Available to work until 10:30 PM IST to ensure effective collaboration with global teams.
- Collaborating with business users and stakeholders to understand their data analysis and reporting requirements.
- Identifying the key metrics, dimensions, and data sources needed for the Qlik applications.
- Designing and implementing the data model within the Qlik environment. This includes extracting, transforming, and loading (ETL) data from various sources, creating data connections, and defining relationships between data tables.
- Developing interactive dashboards, visualizations, and reports using Qlik's data visualization tools. Designing and implementing user-friendly interfaces that allow users to explore data, apply filters, and drill down into details.
- Writing and maintaining Qlik scripting to load and transform data from different sources. This involves data cleansing, aggregation, joining tables, and implementing complex calculations or business logic.
- Writing, modifying, testing, and verifying SQL queries based on business requirements.
- Optimizing Qlik applications for performance and efficiency. Identifying and resolving issues related to data model design, data loading, scripting, or visualizations to ensure optimal application responsiveness and speed.
- Conducting thorough testing of Qlik applications to ensure data accuracy, functionality, and performance.
- Documenting the design, development process, and application functionalities for future reference in Jira and internal training documentation. Creating user guides and providing training to end-users on how to use the Qlik applications effectively.
- Designing and building complex BI solutions that have a global perspective but can be flexible for region-specific requirements.
- Working with colleagues across the company to obtain requirements, business logic, and technical details for BI solutions.
- Determining and scheduling data jobs during optimum business hours.
- Maintaining high standards of data quality and integrity.
- Taking the lead on projects while collaborating with team members.

Job Requirements and Preferred Skills:
- 5+ years working with Qlik Sense, QlikView, or Qlik Cloud; other BI tool experience may be considered.
- 5+ years of Business Intelligence experience.
- 5+ years of SQL writing experience.
- Experience with Fivetran, Snowflake, Hightouch, Informatica, or other related tools is a plus.
- Strong analytical skills to troubleshoot databases and data issues and identify solutions.
- A clear sense of urgency and a desire to learn.
- Ability to manage communications effectively across various cultures and multiple time zones across the globe.
- Excellent organizational, analytical, problem-solving and communication skills.
- Team player with solid communication and presentation skills.

Why Applause?
We're proud to cultivate an inspiring, engaging employee culture that's consistently reflected in high employee retention rates and satisfaction. Our talented team, known as Applause Nation, is set up for success with the latest collaboration and learning tools, opportunities for career advancement, and more.
- We have a flexible work environment with top talent from across the globe.
- Collaborate with an international team of 450+ passionate, talented co-workers.
- Expand your portfolio with exciting, hands-on projects providing exposure to well-known, global brands.
- Learn and grow through structured onboarding, in-house knowledge sessions and access to thousands of virtual courses available on demand.
- Incorporate AI and other exciting technologies into your work, to help you prioritize and boost productivity.
- Experience a supportive culture that emphasizes teamwork, innovation and transparency.
- Share your voice! Contribute and integrate creative and innovative ideas across roles and departments.

Applause Core Values:
As a global employee community, we strive to uphold the following core values, which are critical to business success and how we measure individual and team performance. Do you share our core values?
- Be Accountable: You love to take ownership, and hold yourself and others accountable to increase empowerment and success.
- Celebrate Authenticity: You love bringing your true self to work and creating genuine and trustful relationships within a diverse environment.
- In It Together: You have a team-first mindset and love collaborating with your peers.
- Create Value for Our Customers: You love delivering meaningful business impact and being a release partner for all aspects of digital quality.
- Crush Your Goals: You always strive for excellence and constantly seek ways to be better, more effective and more efficient.

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Informatica Data Quality
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data profiling and data cleansing methodologies.
- Familiarity with database management systems and SQL.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
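
This role calls for data profiling and cleansing methodology, which Informatica Data Quality automates with profiling tasks. A hedged SQL sketch (hypothetical table and columns) of the kind of column profile such a tool computes, useful for spot checks alongside IDQ:

```sql
-- Hypothetical column-profiling query of the kind a data quality pass runs.
SELECT COUNT(*)                 AS total_rows,
       COUNT(email)             AS email_non_null,
       COUNT(*) - COUNT(email)  AS email_nulls,
       COUNT(DISTINCT email)    AS email_distinct,
       -- crude format check; NULLs fall through to 0 and are counted above
       SUM(CASE WHEN email NOT LIKE '%_@_%._%' THEN 1 ELSE 0 END)
                                AS email_bad_format
FROM   crm.contacts;
```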

Posted 1 week ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

We are looking for a passionate and skilled Azure Data Engineer to join our team and help design, build, and maintain scalable data solutions on the Azure cloud platform. If you're experienced in Azure Data Factory, Synapse, and Databricks and enjoy solving complex data problems, we'd love to connect!

Key Responsibilities
- Develop and maintain data pipelines using Azure Data Factory, Databricks, and Azure Synapse Analytics.
- Design and implement robust data lake and data warehouse architectures on Azure.
- Write complex SQL and Python scripts for data transformation and analysis.
- Enable CI/CD for data pipelines and monitor pipeline performance.
- Collaborate with data analysts and business stakeholders to build data models and reports.
- Leverage tools like Azure Monitor and Log Analytics for proactive monitoring and debugging.

Required Qualifications
- 3+ years of hands-on experience as a Data Engineer working in Azure cloud environments.
- Proficiency in Azure Data Factory, Synapse Analytics, Azure Data Lake (Gen2), Azure SQL, Databricks, and Microsoft Fabric.
- Strong programming skills in SQL, Python, and Spark.
- Experience in implementing CI/CD pipelines for data projects.
- Solid understanding of data modeling, warehousing, and data architecture principles.
- Familiarity with Power BI, Azure Monitor, and Log Analytics.
- Excellent communication and problem-solving skills.

Preferred Qualifications
- Microsoft Certified: Azure Data Engineer Associate (DP-203) or similar certification.
- Experience with real-time data processing tools like Azure Stream Analytics or Kafka.
- Exposure to big data platforms and large-scale analytics systems.
- Understanding of data governance and experience with tools such as Azure Purview, Informatica, or Data Catalog.

Why Join Us?
- Work with cutting-edge Azure technologies.
- Opportunity to be part of impactful data-driven projects.
- Collaborative and innovation-focused culture.
- Competitive salary and flexible work environment.

📩 Apply now or reach out to us directly to learn more about this exciting opportunity!

Posted 1 week ago

Apply

3.0 years

7 - 10 Lacs

Chennai

On-site

Source: Glassdoor

Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key and whether you are an expert in a legacy software system or are fluent in a variety of coding languages you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients.

Requisition ID: R-10358182
Date posted: 06/11/2025
End Date: 06/26/2025
City: Chennai
State/Region: Tamil Nadu
Country: India
Location Type: Onsite

Calling all innovators - find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day - quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Professional, Software Development Engineering

What does a successful Professional, Data Conversions do at Fiserv? A Conversion Professional is responsible for timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping in data to support project initiatives for new and existing banks. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines.

What will you do: In addition to the above, a developer stepping in as backup would need to review the specification history, then review and understand the code being developed to resolve the issue and/or change. The same would occur on the switch back to the original developer; today, the associate handling the project would log back in to support the effort and address the issue and/or change.

What you will need to have:
- Bachelor's degree in programming or a related field.
- Minimum 3 years' relevant experience in data processing (ETL) conversions or the financial services industry.
- 3-5 years' experience and strong knowledge of MS SQL/PSQL, MS SSIS, and data warehousing concepts.
- Strong communication skills and the ability to provide technical information to non-technical colleagues.
- Team player with the ability to work independently.
- Experience in the full software development life cycle using agile methodologies; good understanding of Agile ceremonies and the ability to handle them.
- Efficient in reviewing, coding, testing, and debugging of application/bank programs.
- Able to work under pressure while resolving critical issues in the production environment.
- Good communication skills and experience working with clients.
- Good understanding of the banking domain.

What would be great to have:
- Experience with Informatica, Power BI, MS Visual Basic, Microsoft Access, and Microsoft Excel.
- Experience with card management systems; debit card processing is a plus.
- Ability to manage and prioritize a work queue across multiple workstreams.
- Highest attention to detail and accuracy.

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

8.0 - 11.0 years

6 - 9 Lacs

Noida

On-site

Source: Glassdoor

Snowflake - Senior Technical Lead (Full-time)

Company Description
About Sopra Steria: Sopra Steria, a major Tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion. The world is how we shape it.

Job Description
Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/Bangalore
Education: B.E./B.Tech./MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good to have Skills: Snowpark, Data Build Tool, Finance Domain

- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development is good to have.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging.
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.

Qualifications: B.Tech/MCA

Additional Information: At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
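
Two of the responsibilities above, ELT with Streams/Tasks and masking policies, map directly to Snowflake SQL. A minimal sketch with hypothetical databases, tables, warehouse, and role names (the syntax is standard Snowflake; everything else is illustrative):

```sql
-- Hypothetical Snowflake CDC micro-pipeline: a stream feeds a scheduled task.
CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

CREATE OR REPLACE TASK analytics.load_orders
    WAREHOUSE = etl_wh
    SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
AS
INSERT INTO analytics.orders_clean (order_id, customer_email, amount, loaded_at)
SELECT order_id, customer_email, amount, CURRENT_TIMESTAMP()
FROM   raw.orders_stream
WHERE  METADATA$ACTION = 'INSERT';

ALTER TASK analytics.load_orders RESUME;  -- tasks are created suspended

-- Column-level security: mask PII for everyone outside an analyst role.
CREATE OR REPLACE MASKING POLICY analytics.mask_email AS (val STRING)
RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
         ELSE '***MASKED***' END;

ALTER TABLE analytics.orders_clean
    MODIFY COLUMN customer_email SET MASKING POLICY analytics.mask_email;
```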

Posted 1 week ago

Apply

0 years

3 - 7 Lacs

Noida

On-site


Req ID: 328454. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Specialist to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Mandatory Skills: Oracle SQL, PL/SQL, Informatica, Linux Scripting, Tidal
Desired Skills: Python scripting, Autosys

- Responsible for workload prioritization and management of resources, both on service requests and small projects.
- Maintain and provide status to management and onshore leads.
- Expert in architecting, designing, developing, and delivering ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux scripting.
- Hands-on code development, source code control, specification writing, and production implementation.
- Participate in requirement-gathering sessions; guide design and development by providing insights into data sources and through peer reviews.
- Participate in and guide data integration solutions needed to fill data gaps.
- Debug data quality issues by analyzing the upstream sources and guide the data integration team to resolutions (see the sketch after this listing).
- Work closely with DBAs to fix performance bottlenecks.
- Participate in technology governance groups that define policies and best practices and make design decisions in ongoing projects.
- Mentor junior developers on best practices and the technology stacks used to build the application.
- Work closely with Operations and Teradata administration teams on code migrations and production support.
- Provide resource and effort estimates for EDW ETL and extract projects.
- Experience working as part of a global development team.
- Should be able to bring innovation that adds value for the customer.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
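As a hedged illustration of the upstream data-quality triage described above, the following Oracle SQL profiles a suspect source feed before any mapping logic is blamed; the table and column names are assumptions.

    -- Profile the usual failure modes in one pass: null keys, duplicate keys,
    -- and impossible dates in the incoming feed.
    SELECT COUNT(*)                                              AS total_rows,
           COUNT(*) - COUNT(customer_id)                         AS null_keys,
           COUNT(customer_id) - COUNT(DISTINCT customer_id)      AS duplicate_keys,
           SUM(CASE WHEN trade_date > SYSDATE THEN 1 ELSE 0 END) AS future_dates
    FROM   stg.customer_feed;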

Posted 1 week ago

Apply

5.0 years

5 - 10 Lacs

Noida

On-site


Country/Region: IN
Requisition ID: 26208
Location: INDIA - NOIDA - BIRLASOFT OFFICE
Title: Technical Specialist - Data Engg

Description: Area(s) of responsibility
- Must have at least 5+ years of working experience with ETL Informatica tools.
- Responsible for designing, developing, and maintaining complex data integration solutions using Informatica PowerCenter 10.x/9.5: PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Strong experience in extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect on Oracle and SQL Server databases.
- Hands-on experience in IICS and IDQ.
- Extensive experience in developing stored procedures, views, and complex SQL queries using SQL Server and Oracle PL/SQL (a minimal incremental-load sketch follows the listing).
- Gather requirements from the business, create detailed technical design documents, and model data flows for complex ETL processes using Informatica PowerCenter.
- Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
- Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
- Proficient in integrating various data sources (Oracle 11g/10g/9i, MS SQL Server, XML, flat files) into the staging area, ODS, data warehouse, and data marts.
- Strong experience in the analysis, design, development, testing, and implementation of business intelligence solutions using data warehouse/data mart design, ETL, and OLAP.
- Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
- Ability and experience in managing and coordinating with onshore/offshore teams.

Skills with M/O flag are part of Specialization:
Programming/Software Development - PL3 (Functional); Systems Integration and Build - PL3 (Functional); Help the Tribe - PL2 (Behavioural); Think Holistically - PL2 (Behavioural); Database Design - PL1 (Functional); Win the Customer - PL2 (Behavioural); Data Visualisation - PL2 (Functional); One Birlasoft - PL2 (Behavioural); Data Management - PL2 (Functional); Results Matter - PL2 (Behavioural); Data Governance - PL1 (Functional); Get Future Ready - PL2 (Behavioural); Requirements Definition and Management - PL2 (Functional); Test Execution - PL2 (Functional); Data Engineering - PL3 (Functional); Data Modelling and Design - PL2 (Functional); MS SQL - PL3 (Mandatory); Python - PL3 (Mandatory); Informatica IICS - PL3 (Mandatory); Spark - PL2 (Optional); Oracle SQL - PL2 (Optional); Oracle PL/SQL - PL2 (Optional); Informatica PowerCenter - PL3 (Mandatory); Unix Shell Scripting - PL2 (Optional); Unix - PL2 (Optional)
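As a minimal sketch of the staging-to-warehouse loads this role maintains, the following Oracle MERGE expresses the upsert a PowerCenter or IICS mapping typically implements; schema and column names are illustrative only.

    -- Incremental load: update changed customers, insert new ones.
    MERGE INTO dw.dim_customer d
    USING (SELECT customer_id, name, city FROM stg.customer) s
    ON    (d.customer_id = s.customer_id)
    WHEN MATCHED THEN
      UPDATE SET d.name = s.name, d.city = s.city
    WHEN NOT MATCHED THEN
      INSERT (customer_id, name, city)
      VALUES (s.customer_id, s.name, s.city);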

Posted 1 week ago

Apply

0 years

0 Lacs

Noida

On-site


Req ID: 328451. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica PowerCenter Lead with IICS to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Mandatory Skills: Oracle SQL, PL/SQL, Informatica, Linux Scripting, Tidal
Desired Skills: Python scripting

- Responsible for workload prioritization and management of resources, both on service requests and small projects.
- Maintain and provide status to management and onshore leads.
- Expert in architecting, designing, developing, and delivering ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux scripting.
- Hands-on code development, source code control, specification writing, and production implementation.
- Participate in requirement-gathering sessions; guide design and development by providing insights into data sources and through peer reviews.
- Participate in and guide data integration solutions needed to fill data gaps.
- Debug data quality issues by analyzing the upstream sources and guide the data integration team to resolutions.
- Work closely with DBAs to fix performance bottlenecks (see the plan-inspection sketch after this listing).
- Participate in technology governance groups that define policies and best practices and make design decisions in ongoing projects.
- Mentor junior developers on best practices and the technology stacks used to build the application.
- Work closely with Operations and Teradata administration teams on code migrations and production support.
- Provide resource and effort estimates for EDW ETL and extract projects.
- Experience working as part of a global development team.
- Should be able to bring innovation and automation that add value for the customer.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
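For the performance-bottleneck work mentioned above, a first pass with the DBA usually starts from the optimizer plan. A hedged Oracle sketch, with assumed table names:

    -- Inspect the execution plan of a slow warehouse query.
    EXPLAIN PLAN FOR
      SELECT o.order_id, c.name
      FROM   dw.orders o
      JOIN   dw.dim_customer c ON c.customer_key = o.customer_key
      WHERE  o.order_date >= DATE '2024-01-01';

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

Full scans or unexpected join orders in the plan then drive the indexing or partitioning discussion.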

Posted 1 week ago

Apply

8.0 - 12.0 years

6 - 8 Lacs

Noida

On-site


You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.

Responsibilities
Requisition ID: R-10358179
Date posted: 06/11/2025
End Date: 07/15/2025
City: Noida
State/Region: Uttar Pradesh
Country: India
Location Type: Onsite

Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Specialist, Data Architecture

What does a successful Lead, Data Conversions do?
A Conversion Lead is responsible for the timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources, and provides data analysis for client projects as well as ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Lead plays a critical role in mapping in data to support project initiatives for new and existing banks/clients. Leads provide a specialized service to the Project Manager teams: developing custom reporting, providing technical assistance, and ensuring project timelines are met. Working with financial services data puts a high priority on accuracy and adherence to procedures and guidelines.

What you will do
When work changes hands, the backup stepping in must review the specification history, then review and understand the code developed to resolve the issue or change; the same review happens on the switch back to the original developer. Today, the associate handling the project logs back in to support the effort and address the issue or change.

What you will need to have
- Bachelor's degree in programming or a related field
- Minimum 8 years' relevant experience in data processing (ETL) conversions or the financial services industry
- 8-12 years' experience and strong knowledge of MS SQL/PSQL, MS SSIS, and data warehousing concepts
- Strong database fundamentals and expert knowledge of writing SQL commands, queries, and stored procedures
- Experience in performance tuning of complex SQL queries (see the sketch after this listing)
- Strong communication skills and the ability to convey technical information to non-technical colleagues
- Ability to mentor junior team members
- Ability to manage and prioritize a work queue across multiple workstreams
- Team player with the ability to work independently
- Highest attention to detail and accuracy
- Experience in the full software development life cycle using agile methodologies; good understanding of Agile ceremonies
- Efficient in reviewing, analyzing, coding, testing, and debugging application programs
- Able to work under pressure while resolving critical issues in the production environment
- Experience working with clients
- Good understanding of the banking domain

Working Hours (IST): 12:00 p.m. – 09:00 p.m., Monday through Friday

What would be great to have
- Experience with data modelling, Informatica, Power BI, MS Visual Basic, Microsoft Access, and Microsoft Excel
- Experience with card management systems and debit card processing
- Understanding of applications and related database features that can be leveraged to improve performance
- Experience creating testing artifacts (test cases, test plans) and knowledge of various testing types

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
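A minimal T-SQL sketch of the kind of performance tuning the listing asks about, assuming a hypothetical card-transactions table: a covering index lets the aggregate below seek on the date range without key lookups.

    -- Covering index: range seek on txn_date; card_id and amount are included
    -- so the query below is answered entirely from the index.
    CREATE NONCLUSTERED INDEX ix_txn_date
        ON dbo.card_transactions (txn_date)
        INCLUDE (card_id, amount);

    SELECT card_id, SUM(amount) AS total_spend
    FROM   dbo.card_transactions
    WHERE  txn_date >= '2024-01-01'
    GROUP  BY card_id;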

Posted 1 week ago

Apply

7.0 years

0 Lacs

Jaipur

On-site


ABOUT HAKKODA
Hakkoda, an IBM Company, is a modern data consultancy that empowers data-driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment, where everyone's input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, India and Europe. If you have the desire to be a part of an exciting, challenging, and rapidly-growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you!

We are looking for people experienced with data architecture and with the design and development of database mapping and migration processes. This person will have direct experience optimizing new and current databases and data pipelines and implementing advanced capabilities while ensuring data integrity and security. Ideal candidates will have strong communication skills and the ability to guide clients and project team members, acting as a key point of contact for direction and expertise.

Key Responsibilities
- Design, develop, and optimize database architectures and data pipelines.
- Ensure data integrity and security across all databases and data pipelines.
- Lead and guide clients and project team members, acting as a key point of contact for direction and expertise.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Manage and support large-scale technology programs, ensuring they meet business objectives and compliance requirements.
- Develop and implement migration, dev/ops, and ETL/ELT ingestion pipelines using tools such as DataStage, Informatica, and Matillion.
- Utilize project management skills to work effectively within Scrum and Agile Development methods.
- Create and leverage metrics to develop actionable and measurable insights, influencing business decisions.

Qualifications
- 7+ years of proven work experience in data warehousing, business intelligence (BI), and analytics.
- 3+ years of experience as a Data Architect.
- 3+ years of experience working on cloud platforms (AWS, Azure, GCP).
- Bachelor's Degree (BA/BS) in Computer Science, Information Systems, Mathematics, MIS, or a related field.
- Strong understanding of migration processes, dev/ops, and ETL/ELT ingestion pipelines.
- Proficient in tools such as DataStage, Informatica, and Matillion.
- Excellent project management skills and experience with Scrum and Agile Development methods.
- Ability to develop actionable and measurable insights and create metrics to influence business decisions.
- Previous consulting experience managing and supporting large-scale technology programs.

Nice to Have
- 6-12 months of experience working with Snowflake.
- Understanding of Snowflake design patterns and migration architectures.
- Knowledge of Snowflake roles, user security, and capabilities like Snowpipe (a minimal Snowpipe sketch follows).
- Proficiency in SQL scripting.
- Cloud experience on AWS (Azure and GCP are also beneficial).
- Python scripting skills.
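Since the nice-to-haves call out Snowpipe, here is a minimal sketch of continuous ingestion from cloud storage; the bucket, stage, and table names are hypothetical, and the storage integration and credentials a real setup needs are omitted.

    -- External stage over a landing bucket (integration/credentials omitted).
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://example-bucket/landing/'
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Snowpipe: auto-load new files as they arrive in the stage.
    CREATE OR REPLACE PIPE load_events AUTO_INGEST = TRUE AS
      COPY INTO raw.events
      FROM @raw_stage
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);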
Benefits:
- Health insurance
- Paid leave
- Technical training and certifications
- Robust learning and development opportunities
- Incentives
- Toastmasters
- Food program
- Fitness program
- Referral bonus program

Hakkoda is committed to fostering diversity, equity, and inclusion within our teams. A diverse workforce enhances our ability to serve clients and enriches our culture. We encourage candidates of all races, genders, sexual orientations, abilities, and experiences to apply, creating a workplace where everyone can succeed and thrive.

Ready to take your career to the next level? Apply today and join a team that's shaping the future!

Hakkoda is an IBM subsidiary, acquired by IBM and being integrated into the IBM organization; Hakkoda will be the hiring entity. By proceeding with this application, you understand that Hakkoda will share your personal information with other IBM subsidiaries involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here.

Posted 1 week ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site


P2 C1 STS
Primary: Informatica IDQ 9 or higher
Secondary: PL/SQL skills

JD:
- IDQ/Informatica Developer experience working on transformations, mapplets, mappings, and scorecards (a SQL sketch of a scorecard-style check follows)
- Informatica Axon tool experience
- Experience translating business rules into IDQ rules; IDQ application deployment; schedulers
- Knowledge of IDQ repository objects
- IDQ performance management and access management
- Understanding of data governance concepts
- Experience with IDQ and Axon integration
- Informatica PowerCenter experience (9 or higher)
- Experience with Unix scripting
- IBM Information Analyzer: good to have
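IDQ rules and scorecards are built in the Developer tool rather than hand-written, but a hedged SQL sketch shows the kind of metric a scorecard reports; table and column names are assumptions.

    -- Scorecard-style completeness/validity counts for a contacts feed.
    SELECT COUNT(*)                                        AS total_rows,
           SUM(CASE WHEN email IS NULL
                      OR email NOT LIKE '%_@_%._%'
                    THEN 1 ELSE 0 END)                     AS invalid_email,
           SUM(CASE WHEN phone IS NULL THEN 1 ELSE 0 END)  AS missing_phone
    FROM   stg.contacts;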

Posted 1 week ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site


Role: IICS Developer
Work Mode: Hybrid
Work Timings: 2 pm to 11 pm
Location: Chennai & Hyderabad
Primary Skills: IICS

Job Summary
We are looking for a highly experienced Senior Lead Data Engineer with strong expertise in Informatica IICS, Snowflake, Unix/Linux shell scripting, CI/CD tools, Agile, and cloud platforms. The ideal candidate will lead complex data engineering initiatives, optimize data architecture, and drive automation while ensuring high standards of data quality and governance within an agile environment.

Required Qualifications
- Minimum 5+ years of experience in data warehousing and data warehouse concepts.
- Extensive experience in Informatica IICS and Snowflake.
- Experience in designing, developing, and maintaining data integration solutions using IICS.
- Experience in designing, implementing, and optimizing data storage and processing solutions using Snowflake.
- Design and execute complex SQL queries for data extraction, transformation, and analysis (see the sketch after this listing).
- Strong proficiency in Unix/Linux shell scripting and SQL.
- Extensive expertise in CI/CD tools and ESP scheduling.
- Experience working in agile environments, with a focus on iterative improvements and collaboration.
- Knowledge of SAP Data Services is an added advantage.
- Expertise in cloud platforms (AWS, Azure).
- Proven track record in data warehousing, data integration, and data governance.
- Excellent data analysis and data profiling skills.
- Ability to collaborate with stakeholders to define data requirements and develop effective data strategies.
- Strong leadership and communication skills, with the ability to drive strategic data initiatives.
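As a small example of the "complex SQL for extraction, transformation, and analysis" named above, this Snowflake query keeps only the latest record per key using the QUALIFY clause; the table and columns are assumptions.

    -- Deduplicate a change feed: one current row per customer.
    SELECT customer_id, status, updated_at
    FROM   raw.customer_updates
    QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
                               ORDER BY updated_at DESC) = 1;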

Posted 1 week ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site


Requirement to work in night shifts. Should have 5 years of experience in Ab Initio, Informatica ETL, and AWS.
- Develop, test, and maintain ETL workflows using Informatica or Ab Initio under senior guidance.
- Monitor and manage batch jobs using Autosys or Control-M.
- Write SQL queries for data extraction and transformation (a small sketch follows).
- Collaborate with QA, BA, and senior team members for issue resolution.
- Document code, job schedules, and workflows.
- Assist in basic performance monitoring using Dynatrace.
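A minimal sketch of the extraction/transformation SQL this role writes, with hypothetical names: standardize codes and dates while pulling a batch.

    -- Normalize branch codes, parse dates, and decode status flags on extract.
    SELECT account_no,
           UPPER(TRIM(branch_code))     AS branch_code,
           TO_DATE(open_dt, 'YYYYMMDD') AS open_date,
           CASE status WHEN 'A' THEN 'ACTIVE'
                       WHEN 'C' THEN 'CLOSED'
                       ELSE 'UNKNOWN' END AS status_desc
    FROM   src.accounts
    WHERE  load_batch_id = :batch_id;   -- bind variable supplied by the scheduler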

Posted 1 week ago

Apply

4.0 - 7.0 years

10 - 19 Lacs

Bengaluru

Hybrid


Job Description
- Experience: 4 to 7 years.
- Experience in ETL tools (e.g., DataStage), with implementation experience in large data warehouses.
- Proficiency in programming languages such as Python.
- Experience with data warehousing solutions (e.g., Snowflake, Redshift) and big data technologies (e.g., Hadoop, Spark).
- Strong knowledge of SQL and database management systems.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and data pipeline orchestration tools (e.g., Airflow).
- Proven ability to lead and develop high-performing teams, with excellent communication and interpersonal skills.
- Strong analytical and problem-solving abilities, with a focus on delivering actionable insights.

Responsibilities
- Design, develop, and maintain advanced data pipelines and ETL processes using niche technologies.
- Collaborate with cross-functional teams to understand complex data requirements and deliver tailored solutions.
- Ensure data quality and integrity by implementing robust data validation and monitoring processes (a freshness-check sketch follows).
- Optimize data systems for performance, scalability, and reliability.
- Develop comprehensive documentation for data engineering processes and systems.
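As a hedged sketch of the data validation and monitoring responsibility above, this query flags pipelines whose latest load breaches a one-day freshness SLA; the audit table and the SLA itself are assumptions.

    -- Freshness check over a hypothetical load-audit table.
    SELECT pipeline_name,
           MAX(loaded_at) AS last_load,
           CASE WHEN MAX(loaded_at) < CURRENT_TIMESTAMP - INTERVAL '1' DAY
                THEN 'STALE' ELSE 'OK' END AS freshness
    FROM   ops.load_audit
    GROUP  BY pipeline_name;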

Posted 1 week ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:
  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:
  • Junior Developer
  • Informatica Developer
  • Senior Developer
  • Informatica Tech Lead
  • Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:
  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced; a SQL sketch of the Type 2 pattern follows this list)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
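For the slowly-changing-dimension question above: Informatica implements Type 2 as a mapping design (a lookup plus an update strategy transformation); the equivalent set-based logic, as a hedged SQL sketch with assumed table names, is:

    -- Type 2 SCD: expire the current row, then insert the new version.
    UPDATE dim_customer
    SET    is_current = 'N',
           end_date   = CURRENT_DATE
    WHERE  is_current = 'Y'
      AND  customer_id IN (SELECT customer_id FROM stg_customer_changes);

    INSERT INTO dim_customer
           (customer_id, name, city, start_date, end_date, is_current)
    SELECT customer_id, name, city, CURRENT_DATE, NULL, 'Y'
    FROM   stg_customer_changes;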

Closing Remark

As you prepare for Informatica job opportunities in India, enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!
