
2362 Informatica Jobs - Page 22

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. You will collaborate with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you will be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk to client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Your primary responsibilities include:
- Developing and maintaining data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools (see the sketch after this listing).
- Liaising with business teams and technical leads to gather requirements, identify data sources and data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT.
- Working with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter.
- Knowledge of cloud, Power BI, and data migration on cloud skills.
- Experience in Unix shell scripting and Python.
- Experience with relational SQL, big data, etc.

Preferred Technical and Professional Experience
- Knowledge of MS Azure cloud.
- Experience in Informatica PowerCenter.
- Experience in Unix shell scripting and Python.
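The posting leaves the implementation open (Informatica PowerCenter or cloud ETL/ELT tools), so nothing below is IBM's actual stack. As a minimal sketch of the batch pattern the role describes (extract, cleanse, load into a staging table), here is a standard-library Python example; the file, table, and column names are hypothetical:

```python
import csv
import sqlite3

def run_batch_load(src_csv: str, db_path: str) -> int:
    """Minimal batch ETL stage: extract rows from a CSV extract,
    apply a cleansing transform, and load them into a staging table."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders "
        "(order_id TEXT PRIMARY KEY, amount REAL, region TEXT)"
    )
    loaded = 0
    with open(src_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            # Transform: trim whitespace, normalise region, skip bad amounts.
            try:
                amount = float(row["amount"])
            except (KeyError, ValueError):
                continue  # a real pipeline would route this to a reject file
            conn.execute(
                "INSERT OR REPLACE INTO stg_orders VALUES (?, ?, ?)",
                (row["order_id"].strip(), amount, row.get("region", "").upper()),
            )
            loaded += 1
    conn.commit()
    conn.close()
    return loaded

if __name__ == "__main__":
    print(run_batch_load("orders_extract.csv", "warehouse.db"))
```

In a real pipeline the rejected rows would feed the unit-testing and UAT steps the posting mentions rather than being dropped.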

Posted 1 week ago

Apply

1.0 - 3.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

The opportunity
At Hitachi Energy, we are building a future-ready data ecosystem. As a Data Governance Specialist, you will be a key enabler in shaping and operationalizing our enterprise-wide data governance framework. You will focus on the implementation and evolution of our Data Catalog, Metadata Management, and Data Compliance initiatives, ensuring our data assets are trusted, discoverable, and aligned with business value. This role is ideal for early-career professionals with a can-do mindset and a passion for making things happen. You will work in a dynamic, cross-functional environment that values curiosity, ownership, and ethical leadership.

How you'll make an impact
Data Catalog & Compliance Implementation (learning by doing):
- Define and maintain the roadmap for the Enterprise Data Catalog and Data Supermarket.
- Configure and execute deployment of cataloging tools (e.g., metadata management, lineage, glossary).
- Ensure alignment with DAMA-DMBOK principles.
Governance Framework Execution:
- Collaborate with Data Owners, Stewards, and Custodians to define and enforce data policies, standards, and the RACI model.
- Support the Data Governance Council and contribute to the development of governance artifacts (e.g., roles, regulations, KPIs).
Data Quality & Stewardship:
- Partner with domain experts to drive data profiling, cleansing, and validation initiatives (see the sketch after this listing).
- Monitor data quality metrics and support remediation efforts across domains.
Stakeholder Engagement & Enablement:
- Provide training and support to business users on catalog usage and governance practices.
- Act as a liaison between business and IT to ensure data needs are met and governance is embedded in operations.
Innovation & Continuous Improvement:
- Stay current with industry trends and tool capabilities (e.g., Databricks, SAP MDG).
- Propose enhancements to governance processes and tooling based on user feedback and analytics.

Your background
- Bachelor's degree in Information Systems, Data Science, Business Informatics, or a related field.
- 1-3 years of experience in data governance, data management, or analytics roles.
- Familiarity with the DAMA DMBOK2 framework and data governance tools (e.g., SAP MDG, Datasphere, Business Warehouse, Data Intelligence, Informatica ETL).
- Strong communication and collaboration skills; ability to work across business and technical teams.
- Proactive, solution-oriented, and eager to learn, ready to make it happen.
- Autonomy and the ability to manage ambiguity are competitive advantages.
- A new-technology focus and a never-stop-learning, embrace-the-challenge attitude complete the preferred profile.
- CDMP certification is preferred.

More about us
When joining us you may expect:
- A purpose-driven role in a global energy leader committed to sustainability and digital transformation.
- Mentorship and development opportunities within a diverse and inclusive team, with exposure to new initiatives and cutting-edge technologies.
- A culture that values integrity, curiosity, and collaboration, aligned with Hitachi Energy's Leadership Pillars: Lead with Purpose, Create Customer Value, Drive Results, Build Collaboration, and Develop Self & Others.

Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.
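The posting asks for data profiling, cleansing, and validation work without naming an implementation. Purely as an illustration of the first profiling pass (completeness and cardinality per column), here is a small pandas sketch; the sample frame and metric choices are assumptions, not Hitachi Energy's tooling:

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Basic data-profiling metrics per column: completeness and
    cardinality, the usual first pass before writing cleansing rules."""
    return pd.DataFrame({
        "null_pct": df.isna().mean().round(3),  # share of missing values
        "distinct": df.nunique(),               # cardinality
        "dtype": df.dtypes.astype(str),
    })

# Toy example: a tiny extract with gaps and duplicates.
df = pd.DataFrame({"id": [1, 2, 2, None], "region": ["EU", "EU", None, "APAC"]})
print(profile(df))
```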

Posted 1 week ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Description
Amazon Music is awash in data! To help make sense of it all, the DISCO (Data, Insights, Science & Optimization) team: (i) enables the Consumer Product Tech org to make data-driven decisions that improve customer retention, engagement, and experience on Amazon Music. We build and maintain automated self-service data solutions and data science models, and deep-dive difficult questions that provide actionable insights. We also enable measurement, personalization, and experimentation by operating key data programs ranging from attribution pipelines and north-star weblab metrics to causal frameworks. (ii) delivers exceptional analytics and science infrastructure for DISCO teams, fostering a data-driven approach to insights and decision making. As platform builders, we are committed to constructing flexible, reliable, and scalable solutions to empower our customers. (iii) accelerates and facilitates content analytics and provides independence to generate valuable insights in a fast, agile, and accurate way. This domain provides analytical support for the following topics within Amazon Music: Programming, Label Relations, PR, Stations, Live Sports, Originals, and Case & CAM.

The DISCO team enables repeatable, easy, in-depth analysis of music customer behaviors. We reduce the cost in time and effort of analysis, data set building, model building, and user segmentation. Our goal is to empower all teams at Amazon Music to make data-driven decisions and effectively measure their results by providing high-quality, high-availability data and democratized data access through self-service tools. If you love the challenges that come with big data, then this role is for you. We collect billions of events a day, manage petabyte-scale data on Redshift and S3, and develop data pipelines using Spark/Scala on EMR, SQL-based ETL, Airflow, and Java services.

We are looking for a talented, enthusiastic, and detail-oriented Data Engineer who knows how to take on big data challenges in an agile way. Duties include big data design and analysis, data modeling, and the development, deployment, and operation of big data pipelines. You'll help build Amazon Music's most important data pipelines and data sets, and expand self-service data knowledge and capabilities through an Amazon Music data university.

The DISCO team develops data specifically for a set of key business domains like personalization and marketing, and provides and protects a robust self-service core data experience for all internal customers. We work with AWS technologies like Redshift, S3, EMR, EC2, DynamoDB, Kinesis Firehose, and Lambda. Your team will manage the data exchange store (Data Lake) and the EMR/Spark processing layer using Airflow as the orchestrator. You'll build our data university and partner with Product, Marketing, BI, and ML teams to build new behavioural events, pipelines, datasets, models, and reporting to support their initiatives. You'll also continue to develop big data pipelines.

Key job responsibilities
You bring a deep understanding of data, analytical techniques, and how to connect insights to the business, along with practical experience insisting on the highest operational standards in ETL and big data pipelines. With our Amazon Music Unlimited and Prime Music services, and our top music provider spot on the Alexa platform, providing high-quality, high-availability data to our internal customers is critical to our customer experiences.
- Assist the DISCO team with management of our existing environment, which consists of Redshift and SQL-based pipelines. The activities around these systems are well defined via standard operating procedures (SOPs) and typically involve approving data access requests and subscribing or adding new data to the environment.
- Manage SQL data pipelines, creating new pipelines or updating existing ones (see the sketch after this listing).
- Perform maintenance tasks on the Redshift cluster.
- Assist the team with the management of our next-generation AWS infrastructure. Tasks include infrastructure monitoring via CloudWatch alarms, infrastructure maintenance through code changes or enhancements, and troubleshooting/root-cause analysis of infrastructure issues; in some cases, this resource may also be asked to submit code changes based on infrastructure issues that arise.

About The Team
Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators. From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, we are innovating at some of the most exciting intersections of music and culture. We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all music in shuffle mode, and top ad-free podcasts, included with their membership; customers can upgrade to Music Unlimited for unlimited, on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale.

Basic Qualifications
- 2+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience in Unix
- Experience troubleshooting data and infrastructure issues

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage
- Knowledge of distributed systems as they pertain to data storage and computing
- Experience in building or administering reporting/analytics platforms

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI MAA 15 SEZ
Job ID: A2838395
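The listing names Airflow as the orchestrator for SQL-based pipelines on Redshift. As an Airflow 2.x-style sketch of a daily SQL pipeline (not Amazon's actual DAGs; the DAG ID and task bodies are hypothetical placeholders):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_daily_plays(ds: str, **_) -> None:
    # Placeholder for a SQL-based ETL step, e.g. running an
    # INSERT ... SELECT against the warehouse for the execution date.
    print(f"Loading play events for {ds}")

with DAG(
    dag_id="music_plays_daily",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
):
    extract = PythonOperator(task_id="extract_events", python_callable=load_daily_plays)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_daily_plays)
    extract >> load  # the load task runs only after extraction succeeds
```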

Posted 1 week ago

Apply

2.0 - 5.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance, and delivery risk. Your work will involve continuous improvement and optimization of the managed services process, tools, and services.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect on, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g., specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Role: Associate
Tower: Data, Analytics & Specialist Managed Service
Experience: 2.0 - 5.5 years
Key Skills: AWS
Educational Qualification: BE / B.Tech / ME / M.Tech / MBA
Work Location: India

Job Description
As an Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and address development areas.
- Be flexible to work in stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables; provide status reporting for the project.
- Adhere to SLAs, with experience in incident management, change management, and problem management.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward communication, in a structured way, when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Be a good team player, take up cross-competency work, and contribute to COE activities.
- Handle escalation and risk management.

Position Requirements
Required Skills: AWS Cloud Engineer
The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
- A minimum of 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
- A minimum of 1-3 years of Operate/Managed Services/Production Support experience.
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Experience building efficient ETL/ELT processes using industry-leading tools like AWS, AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. (a minimal PySpark sketch follows this job description).
- Designing, implementing, and maintaining data pipelines for data ingestion, processing, and transformation in AWS.
- Working with data scientists and analysts to understand data needs and create effective data workflows.
- Implementing data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Improving the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitoring and troubleshooting data pipelines, and resolving issues related to data processing, transformation, or storage.
- Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and maintaining data governance solutions (data quality, metadata management, lineage, master data management, and data security) using industry-leading tools.
- Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
- Hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
- Experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc.
- Strong communication, problem-solving, quantitative, and analytical abilities.

Nice To Have
- AWS certification

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.

Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients' data and analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced environment and are capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help-desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
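As the PySpark sketch promised above: a generic job (runnable on Glue or EMR) that extracts raw files, applies the validation and cleansing steps the posting describes, and loads curated, partitioned output. The bucket paths and columns are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw landing-zone files (bucket paths are hypothetical).
raw = spark.read.option("header", True).csv("s3://landing-zone/orders/")

# Transform: cast types, derive the partition column, and apply
# simple validation/cleansing rules before anything reaches BI users.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
       .dropDuplicates(["order_id"])
)

# Load: partitioned Parquet in the curated zone for downstream consumption.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://curated-zone/orders/")
```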

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Source: Naukri

Job Summary
We are seeking a skilled Data Engineer to design, develop, and optimize scalable data pipelines and infrastructure. The ideal candidate will have expertise in relational databases, data modeling, cloud migration, and automation, working closely with cross-functional teams to drive data-driven decision-making.

Key Responsibilities
- Design, develop, and maintain scalable data pipeline architectures for seamless data extraction, transformation, and distribution.
- Optimize data storage, processing, and availability while ensuring scalability and performance.
- Lead the migration of on-premise data platforms to cloud-based infrastructure.
- Collaborate with senior management, technology teams, and clients to support data-related initiatives and troubleshoot complex issues.

Qualifications & Experience
- 5-10 years of hands-on experience in Data Engineering.
- Strong expertise in SQL, data modeling, and relational databases (Sybase, SQL Server, PostgreSQL, DB2, etc.).
- Experience with distributed database systems and handling large, disconnected datasets.
- Proficiency in cloud platforms (Azure, Databricks, Snowflake preferred).
- Experience with ETL tools (Azure Data Factory, Informatica a plus).
- Familiarity with job scheduling tools (Autosys a plus).
- Strong scripting skills (Shell, Python a plus).
- Experience working in agile teams across globally distributed environments.
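The role centers on scalable extraction and cloud migration; one pattern that keeps such pipelines scalable is watermark-based incremental extraction. A minimal sketch, assuming a source table with an updated_at column (table and column names are hypothetical, and sqlite3 stands in for any relational source):

```python
import sqlite3

def incremental_extract(conn: sqlite3.Connection, last_watermark: str):
    """Pull only rows changed since the previous run (watermark pattern),
    keeping daily loads small as source tables grow."""
    cur = conn.execute(
        "SELECT id, payload, updated_at FROM source_table "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    )
    rows = cur.fetchall()
    # Persist the new watermark for the next run; unchanged if no new rows.
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark
```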

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Azurity Pharmaceuticals is a privately held, specialty pharmaceutical company that focuses on innovative products that meet the needs of underserved patients. As an industry leader in providing unique, accessible, and high-quality medications, Azurity leverages its integrated capabilities and vast partner network to continually expand its broad commercial product portfolio and robust late-stage pipeline. The company's patient-centric products span the cardiovascular, neurology, endocrinology, gastro-intestinal, institutional, and orphan markets, and have benefited millions of patients. For more information, visit www.azurity.com.

Azurity Pharmaceuticals is proud to be an inclusive workplace and an Equal Opportunity Employer. Azurity's success is attributable to our incredibly talented, dedicated team that focuses on benefiting the lives of patients by bringing the best science and commitment to quality into everything that we do. We seek highly motivated individuals with the dedication, integrity, and creative spirit needed to thrive in our organization.

Brief Team/Department Description
Our Digital team at Azurity is building new capabilities utilizing cutting-edge Salesforce systems, and we are looking for a dynamic, change-inspired, self-driven, hands-on team member. The Salesforce Developer - Life Sciences is responsible for designing, developing, and optimizing Salesforce solutions to support Azurity's pharmaceutical business. This role focuses on customizing the Salesforce platform, ensuring seamless HCP/HCO engagement, sales rep support, regulatory compliance, and commercial operations. The ideal candidate will collaborate with onshore architects, business analysts, and stakeholders to develop scalable, high-performing Salesforce solutions while maintaining compliance with HIPAA, GDPR, the Sunshine Act, and FDA regulations.

Principal Responsibilities
Salesforce Development & Customization
- Develop and enhance Life Sciences-specific CRM functionality on the Salesforce platform and Sales Cloud to support HCP/HCO engagement, commercial operations, KOL (Key Opinion Leader) management, and field rep journeys.
- Customize Salesforce objects, Apex triggers, Lightning Web Components (LWC), Visualforce pages, and declarative automation (Flows, Process Builder).
- Implement consent tracking, call planning, sample management, and omnichannel engagement workflows for field reps, MSLs (Medical Science Liaisons), and sales teams.
- Ensure territory management, commercial operations, and compliance tracking are seamlessly integrated into Salesforce.
Performance Optimization & Security
- Optimize Apex code, SOQL queries, and Lightning Web Components for scalability and high performance.
- Implement role-based security, audit logs, and field-level encryption to maintain compliance with HIPAA, GDPR, and FDA regulations.
- Conduct code reviews, unit testing, and debugging to ensure high-quality solution delivery.
Collaboration & Agile Development
- Work closely with onshore architects, business analysts, and product owners to gather requirements and translate them into technical solutions.
- Participate in scrum meetings, sprint planning, and UAT (User Acceptance Testing) as part of an Agile team.
- Provide technical documentation and deployment support for Salesforce enhancements.
Continuous Improvement & Best Practices
- Stay updated on Salesforce releases, Life Sciences Cloud advancements, and best practices.
- Implement Salesforce DevOps methodologies, using tools like Gearset, Copado, and Jenkins for CI/CD automation.

Preferred Skills and Experience
- 3+ years of experience in Salesforce development, preferably in Life Sciences or Healthcare.
- Expertise in Salesforce Health Cloud, Sales Cloud, or Veeva CRM.
- Proficiency in Apex, Lightning Web Components (LWC), Visualforce, SOQL, and API development.
- Hands-on experience with Salesforce APIs (REST, SOAP), middleware tools (MuleSoft, Informatica, Boomi), and data migration (see the sketch after this listing).
- Hands-on experience with Salesforce DevOps tools (Gearset, Copado, Jenkins) for release management.
- Understanding of HIPAA, GDPR, the Sunshine Act, FDA regulations, and data security best practices.
- Experience working in a global offshore-onshore collaboration model using Agile methodologies.
- Salesforce Platform Developer I & II certifications required.
- Excellent problem-solving and communication skills in a remote, global team setup.

By applying for this role, you confirm that you are mentally and physically capable of fulfilling the job responsibilities detailed in the job description without any restrictions. If you have any concerns or even the slightest disability that may affect your ability to perform the job, please inform HR in advance.
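The posting calls for hands-on experience with the Salesforce REST API and data migration. A hedged Python sketch of the standard SOQL query endpoint (the instance URL, token, and the Type filter are placeholders; a real integration would authenticate via OAuth rather than a hard-coded token):

```python
import requests

# Hypothetical values; a real integration obtains these via an OAuth flow.
INSTANCE = "https://example.my.salesforce.com"
TOKEN = "REPLACE_WITH_SESSION_TOKEN"

def query_hcp_accounts() -> list:
    """Run a SOQL query through the Salesforce REST API query endpoint."""
    resp = requests.get(
        f"{INSTANCE}/services/data/v59.0/query",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"q": "SELECT Id, Name FROM Account WHERE Type = 'HCP' LIMIT 10"},
    )
    resp.raise_for_status()
    return resp.json()["records"]

if __name__ == "__main__":
    for record in query_hcp_accounts():
        print(record["Id"], record["Name"])
```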

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 9 Lacs

Hyderabad

Work from Office

Source: Naukri

Greetings from Infosys BPM Ltd!

Exclusive Women's Walk-in Drive. We are hiring for MIM, Content & Technical Writer, and Informatica skills. Please walk in for an interview on 16th June 2025 at our Hyderabad location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention your Candidate ID on top of your resume.

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215115

Interview details
Interview Date: 16th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: Infosys STP, Madhava Reddy Colony, Near Wipro Circle, Gachibowli, Hyderabad 500032

Please find the job descriptions below for your reference. These are work-from-office roles, and a minimum of 2 years of project experience is mandatory.

Job Description: MIM
- Strong knowledge of IT service management, including ITIL.
- Respond to reported incidents, identify the cause, and initiate the incident management process.
- Participate in root cause analysis meetings, gather lessons learned, and manage and implement continuous improvement processes.
- Ensure client SLAs/KPIs and customer satisfaction expectations are achieved.
- Restore a failed IT service as quickly as possible.

Job Description: Content and Technical Writer
- Develop high-quality technical documents, including user manuals, guides, and release notes.
- Collaborate with cross-functional teams to gather requirements and create accurate documentation.
- Conduct functional testing and manual testing to ensure compliance with FDA regulations.
- Ensure adherence to ISO standards and maintain a clean, organized document management system.
- Strong understanding of the infrastructure domain; able to convert complex technical concepts into easy-to-consume documents for the target audience.
- Mentor the team on technical writing.

Job Description: Informatica
- Strong experience in ETL development using Informatica and SQL.
- Design, develop, and maintain ETL workflows using Informatica PowerCenter.
- Perform data extraction, transformation, and loading (ETL) from various sources into data warehouses.
- Develop and optimize SQL queries, stored procedures, views, and indexing for performance tuning.
- Ensure data integrity, consistency, and accuracy in all ETL processes.
- Monitor, debug, and troubleshoot ETL failures and performance bottlenecks.

REGISTRATION PROCESS
The Candidate ID and SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. Candidates without registration and assessment will not be allowed to interview.

Candidate ID registration:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to your registered email ID.

SHL Test (AMCAT ID) registration:
This assessment is proctored, and talent is evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the three dots in the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
- Personal details: name, email address, mobile number, PAN number.
- Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
- Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
- Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
- Interview mode: walk-in.

Attempt all questions in the SHL assessment app. The assessment is proctored, so choose a quiet environment and use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you've finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to carry:
- A note of your Candidate ID and AMCAT ID, along with your registered email ID.
- Two sets of your updated resume/CV (hard copy).
- Original ID proof for security clearance.
- Individual headphones/Bluetooth headset for the interview.

Pointers to note:
- Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
- An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team

Posted 1 week ago

Apply

5.0 - 8.0 years

11 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Key Responsibilities:
- Design, develop, and maintain ETL processes using tools such as Talend, Informatica, SSIS, or similar.
- Extract data from various sources, including databases, APIs, and flat files, transforming it to meet business requirements.
- Load transformed data into target systems while ensuring data integrity and accuracy.
- Collaborate with data analysts and business stakeholders to understand data needs and requirements.
- Optimize ETL processes for enhanced performance and efficiency.
- Debug and troubleshoot ETL jobs, providing effective solutions to data-related issues (see the sketch after this listing).
- Document ETL processes, data models, and workflows for future reference and team collaboration.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience in ETL development and data integration.
- Experience with Big Data technologies such as Hadoop or Spark.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud and their ETL services.
- Familiarity with data visualization tools such as Tableau or Power BI.
- Hands-on experience with Snowflake for data warehousing and analytics.
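For the debugging responsibility referenced above, a transform written with explicit reject handling makes ETL failures visible instead of silently dropping rows. A minimal Python sketch (the field names are hypothetical):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def transform_rows(rows: list[dict]) -> tuple[list[dict], list[tuple]]:
    """Transform with explicit reject handling: bad records are collected
    with their error, so failures can be triaged rather than lost."""
    good, rejects = [], []
    for row in rows:
        try:
            good.append({
                "id": int(row["id"]),
                "amount": round(float(row["amount"]), 2),
            })
        except (KeyError, ValueError) as exc:
            rejects.append((row, str(exc)))
    log.info("transformed=%d rejected=%d", len(good), len(rejects))
    return good, rejects

# Example: the second row fails the cast and lands in the reject list.
good, rejects = transform_rows([{"id": "1", "amount": "9.5"}, {"id": "x"}])
```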

Posted 1 week ago

Apply

6.0 - 8.0 years

19 - 25 Lacs

Noida, Chennai, Bengaluru

Hybrid

Source: Naukri

We are seeking a highly skilled and experienced Senior ETL & Reporting QA Analyst to join our dynamic team. The ideal candidate will bring strong expertise in ETL and Report Testing, with a solid command of SQL, and hands-on experience in Informatica, as well as BI Reporting tools. A strong understanding of the Insurance domain is crucial to this role. This position will be instrumental in ensuring the accuracy, reliability, and performance of our data pipelines and reporting solutions. Key Responsibilities: Design, develop, and execute detailed test plans and test cases for ETL processes, data migration, and data warehousing solutions. Perform data validation and data reconciliation using complex SQL queries across various source and target systems. Validate Informatica ETL workflows and mappings to ensure accurate data transformation and loading. Conduct end-to-end report testing and dashboard validations using Cognos (preferred), or comparable BI tools such as Tableau or Power BI. Collaborate with cross-functional teams including Business Analysts, Developers, and Data Engineers to understand business requirements and transform them into comprehensive test strategies. Identify, log, and track defects to closure using test management tools and actively participate in defect triage meetings. Maintain and enhance test automation scripts and frameworks where applicable. Ensure data integrity, consistency, and compliance across reporting environments, particularly in the insurance domain context.
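The data reconciliation this role describes usually starts with control totals compared between source and target. A minimal Python sketch of that check, with hypothetical table names and sqlite3 standing in for the actual source and warehouse databases:

```python
import sqlite3

RECON_SQL = """
SELECT COUNT(*) AS row_cnt,
       COALESCE(SUM(amount), 0) AS total_amount
FROM {table}
WHERE load_date = ?
"""

def reconcile(src_conn: sqlite3.Connection,
              tgt_conn: sqlite3.Connection,
              load_date: str) -> bool:
    """Compare control totals between source and target; a mismatch
    signals lost or duplicated rows in the ETL run under test."""
    src = src_conn.execute(RECON_SQL.format(table="src_policies"), (load_date,)).fetchone()
    tgt = tgt_conn.execute(RECON_SQL.format(table="dw_policies"), (load_date,)).fetchone()
    return src == tgt
```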

Posted 1 week ago

Apply

0.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Source: Indeed

About the Role:
We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you’ll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.

Responsibilities:
- Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM.
- Develop and maintain batch and real-time data integration workflows.
- Collaborate with data architects, business analysts, and stakeholders to understand data requirements.
- Perform data profiling, data quality assessments, and master data matching/merging (see the sketch after this listing).
- Implement governance, stewardship, and metadata management practices.
- Optimize the performance of Informatica MDM Hub, IDD, and associated components.
- Write complex SQL queries and stored procedures as needed.

Senior Software Engineer - Additional Responsibilities:
- Lead design discussions and code reviews; mentor junior engineers.
- Architect scalable data integration solutions using Informatica and complementary tools.
- Drive adoption of best practices in data modeling, governance, and engineering.
- Work closely with cross-functional teams to shape the data strategy.

Required Qualifications:
Software Engineer:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules).
- Strong SQL and data modeling skills.
- Familiarity with ETL concepts, REST APIs, and data integration tools.
- Understanding of data governance and quality frameworks.
Senior Software Engineer:
- Bachelor’s or Master’s in Computer Science, Data Engineering, or a related field.
- 4+ years of experience in Informatica MDM, with at least 2 years in a lead role.
- Proven track record of designing scalable MDM solutions in large-scale environments.
- Strong leadership, communication, and stakeholder management skills.
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice to Have):
- Experience with other Informatica products (IDQ, PowerCenter).
- Exposure to cloud MDM platforms or cloud data integration tools.
- Agile/Scrum development experience.
- Knowledge of industry-standard data security and compliance practices.

Job Type: Full-time
Pay: ₹219,797.43 - ₹1,253,040.32 per year
Benefits: Health insurance
Schedule: Day shift
Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s): Are you willing to start immediately? (Preferred)
Experience: Data warehouse: 3 years (Required); Informatica MDM: 3 years (Required)
Work Location: In person
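Informatica MDM has its own match engine; purely to illustrate the match/merge concept flagged in the responsibilities, here is a toy weighted fuzzy-match score in Python (the weights, fields, and threshold are invented for the example):

```python
from difflib import SequenceMatcher

def match_score(a: dict, b: dict) -> float:
    """Weighted fuzzy score across match columns, in the spirit of MDM
    match rules; real MDM tools use far richer, configurable matching."""
    name = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    city = 1.0 if a["city"].lower() == b["city"].lower() else 0.0
    return 0.7 * name + 0.3 * city

a = {"name": "Acme Industries Ltd", "city": "Pune"}
b = {"name": "ACME Industries Limited", "city": "Pune"}
# Candidate pairs scoring above the threshold go to merge review.
print(match_score(a, b) > 0.8)
```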

Posted 1 week ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description
Are you passionate about data? Does the prospect of dealing with massive volumes of data excite you? Do you want to build data engineering solutions that process billions of records a day in a scalable fashion using AWS technologies? Do you want to create the next-generation tools for intuitive data access? If so, Amazon Finance Technology (FinTech) is for you!

FinTech is seeking a Data Engineer to join the team that is shaping the future of the finance data platform. The team is committed to building the next-generation big data platform that will be one of the world's largest finance data warehouses, supporting Amazon's rapidly growing and dynamic businesses, and using it to deliver BI applications that have an immediate influence on day-to-day decision making. Amazon has a culture of data-driven decision-making and demands data that is timely, accurate, and actionable. Our platform serves Amazon's finance, tax, and accounting functions across the globe.

As a Data Engineer, you should be an expert in data warehousing technical components (e.g., data modeling, ETL, and reporting), infrastructure (e.g., hardware and software), and their integration. You should have a deep understanding of the architecture for enterprise-level data warehouse solutions using multiple platforms (RDBMS, columnar, cloud). You should be an expert in the design, creation, management, and business use of large data sets. You should have excellent business and communication skills, working with business owners to develop and define key business questions and to build data sets that answer those questions. The candidate is expected to be able to build efficient, flexible, extensible, and scalable ETL and reporting solutions. You should be enthusiastic about learning new technologies and be able to implement solutions using them to provide new functionality to users or to scale the existing platform. Excellent written and verbal communication skills are required, as this person will work very closely with diverse teams. Strong analytical skills are a plus. Above all, you should be passionate about working with huge data sets and about bringing data sets together to answer business questions and drive change.

Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional volumes and big data, enjoys the challenge of highly complex business contexts (that are typically being defined in real time), and, above all, is passionate about data and analytics. In this role you will be part of a team of engineers creating the world's largest financial data warehouses and BI tools for Amazon's expanding global footprint.

Key job responsibilities
- Design, implement, and support a platform providing secured access to large datasets.
- Interface with tax, finance, and accounting customers, gathering requirements and delivering complete BI solutions.
- Model data and metadata to support ad-hoc and pre-built reporting (a star-schema sketch follows this listing).
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc., to drive key business decisions.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Tune application and query performance using profiling tools and SQL.
- Analyze and solve problems at their root, stepping back to understand the broader context.
- Learn and understand a broad range of Amazon's data resources and know when, how, and which to use and which not to use.
- Keep up to date with advances in big data technologies and run pilots to design the data architecture to scale with increased data volume using AWS.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
- Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2968106
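As the star-schema sketch promised above: loading a fact row by resolving its surrogate dimension key is the core move in dimensional modeling. A self-contained sqlite3 illustration (table and column names are hypothetical, not Amazon's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_account (account_key INTEGER PRIMARY KEY,
                          account_id  TEXT UNIQUE,
                          region      TEXT);
CREATE TABLE fact_ledger (account_key INTEGER,
                          posting_date TEXT,
                          amount       REAL);
""")

# Ensure the dimension row exists (idempotent thanks to the UNIQUE key).
conn.execute("INSERT OR IGNORE INTO dim_account (account_id, region) VALUES (?, ?)",
             ("ACC-001", "APAC"))

# Load the fact row via a surrogate-key lookup against the dimension.
conn.execute("""
INSERT INTO fact_ledger (account_key, posting_date, amount)
SELECT account_key, ?, ? FROM dim_account WHERE account_id = ?
""", ("2025-01-31", 1250.00, "ACC-001"))
conn.commit()
print(conn.execute("SELECT * FROM fact_ledger").fetchall())
```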

Posted 1 week ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description
Do you have the technical skill to build BI solutions that process billions of rows a day using AWS technologies? Do you want to create next-generation tools for intuitive data access? Do you wake up in the middle of the night with new ideas that will benefit your customers? Are you persistent in bringing your ideas to fruition?

First things first: you know SQL and data modelling like the back of your hand. You also need to know Big Data and MPP systems. You have a history of coming up with innovative solutions to complex technical problems. You are a quick and willing learner of new technologies and have examples to prove your aptitude. You are not tool-centric; you determine what technology works best for the problem at hand and apply it accordingly. You can explain complex concepts to your non-technical customers in simple terms.

Key job responsibilities
- Work with SDE teams and business stakeholders to understand data requirements and design the data ingress flow for the team.
- Lead the design, modeling, and implementation of large, evolving, structured, semi-structured, and unstructured datasets.
- Evaluate and implement efficient distributed storage and query techniques.
- Interact and integrate with internal and external teams and systems to extract, transform, and load data from a wide variety of sources.
- Implement robust and maintainable code with clear and maintained documentation.
- Implement test automation on delivered code through unit testing and integration testing (see the sketch after this listing).
- Work in a tech stack that is a mix of NAWS services and legacy ETL tools within Amazon.

About The Team
The Data Insights, Metrics & Reporting team (DIMR) is the central data engineering team in the Amazon Warehousing & Distribution org, mainly responsible for four things:
- Building and maintaining data engineering and reporting infrastructure using NAWS to support internal/external data use-cases.
- Building data ingestion pipelines from any kind of upstream data source, including (but not limited to) real-time event streaming services, data lakes, and manual file uploads.
- Building mechanisms to vend data to internal team members or external sellers with the right data handling techniques in place.
- Building a robust data mart to support diverse use-cases powered by GenAI tools.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2970459
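For the unit-testing responsibility referenced above, data transforms are natural targets for automated tests. A minimal pytest sketch (the transform and its rules are invented for the example):

```python
# test_transforms.py -- run with `pytest`

def normalise_sku(raw: str) -> str:
    """Example transform under test: canonical SKU format
    (trimmed, uppercased, spaces replaced with hyphens)."""
    return raw.strip().upper().replace(" ", "-")

def test_normalise_sku_strips_and_uppercases():
    assert normalise_sku("  ab 123 ") == "AB-123"

def test_normalise_sku_is_idempotent():
    # Applying the transform twice must give the same result as once,
    # a useful property for reruns of ETL steps.
    assert normalise_sku(normalise_sku(" x 1 ")) == normalise_sku(" x 1 ")
```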

Posted 1 week ago

Apply

4.0 - 7.0 years

20 - 30 Lacs

Noida

Hybrid

Source: Naukri

Role & responsibilities
- Collaborate with customers' business and IT teams to define and gather solution requirements for custom development, B2B/ETL/EAI, and cloud integration initiatives using the Adeptia Integration Platform.
- Analyze, interpret, and translate customer business needs into scalable and maintainable technical solution designs, aligned with best practices and the capabilities of the Adeptia platform.
- Storyboard and present solutions to customers and prospects, ensuring a clear understanding of proposed designs and technical workflows.
- Provide end-to-end project leadership, including planning, tracking deliverables, and coordinating efforts with offshore development teams as required.
- Review implementation designs and provide architectural guidance and best practices to the implementation team to ensure high-quality execution.
- Actively assist and mentor customers in configuring and implementing the Adeptia platform, ensuring alignment with technical and business objectives.
- Offer expert recommendations on design and configuration to ensure successful deployment and long-term maintainability of customer solutions.
- Define clear project requirements, create work breakdown structures, and establish realistic delivery timelines. Delegate tasks effectively and manage progress against daily, weekly, and monthly targets, ensuring the team remains focused and productive.
- Serve as a liaison among customers, internal stakeholders, and offshore teams to maintain alignment, track progress, and ensure delivery meets both quality and timeline expectations.
- Monitor project baselines, identify and mitigate risks, and lead participation in all Agile ceremonies, including sprint grooming, planning, reviews, and retrospectives.
- Maintain a hands-on technical role, contributing to development activities and conducting detailed code reviews to ensure technical soundness and optimal performance.
- Take full ownership of assigned projects, driving them to successful, on-time delivery with high quality standards.

Preferred candidate profile
- Proven experience in designing and developing integration solutions involving Cloud/SaaS applications, APIs, SDKs, and legacy systems.
- Skilled in implementing SOA/EAI principles and integration patterns in B2B, ETL, EAI, and Cloud Integration using platforms such as Adeptia, Talend, MuleSoft, or similar tools.
- Good hands-on experience with Core Java (version 8+) and widely used Java frameworks, including Spring (version 6+) and Hibernate (version 6+).
- Proficient in SOA, RESTful and SOAP web services, and related technologies including JMS, SAAJ, JAXP, and XML technologies (XSD, XPath, XSLT, parsing).
- Strong command of SQL and RDBMS (e.g., Oracle, MySQL, PostgreSQL).
- Solid understanding of Enterprise Service Bus (ESB) concepts and messaging technologies such as Kafka and RabbitMQ (see the sketch after this listing).
- Familiar with transport protocols including HTTPS, Secure FTP, POP/IMAP/SMTP, and JDBC.
- Skilled in working with Windows and Linux operating systems, and experienced with application servers such as JBoss, Jetty, and Tomcat.
- Solid understanding of security best practices, including authentication, authorization, data encryption, and compliance frameworks relevant to enterprise integrations.
- Basic understanding of modern JavaScript frameworks such as React, with the ability to collaborate effectively on front-end and full-stack development scenarios.
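The platform work described here is Java-centric; to keep the sketches on this page in a single language, here is the same publish/consume messaging pattern in Python using the kafka-python client (the broker address and topic are hypothetical):

```python
import json
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

# Producer: publish an integration event (topic and broker are placeholders).
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders.events", {"order_id": "O-1001", "status": "RECEIVED"})
producer.flush()

# Consumer: the EAI counterpart reads and processes the same stream.
consumer = KafkaConsumer(
    "orders.events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for msg in consumer:
    print(msg.value)  # hand off to a mapping/transformation step
    break
```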

Posted 1 week ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
Job Title: Data Engineer
Location: Chennai
Candidate Specification: Any graduate with a minimum of 6+ years of relevant experience.

The role involves spinning up and managing AWS data infrastructure and building data ingestion pipelines in StreamSets and EMR. Candidates must have working experience with a modern ETL tool (PySpark, EMR, Glue, or others).

Other Information
Role: Data Engineer - Chennai
Employment Type: Full Time, Permanent
Key Skills: DBT, Snowflake, AWS, ETL, Informatica, EMR, PySpark, Glue
Job Code: GO/JC/046/2025

Posted 1 week ago

Apply

1.0 years

4 - 6 Lacs

Hyderābād

On-site

Source: GlassDoor

- 1+ years of data engineering experience - Experience with SQL - Experience with data modeling, warehousing and building ETL pipelines - Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting language (e.g., Python, KornShell) Business Data Technologies (BDT) makes it easier for teams across Amazon to produce, store, catalog, secure, move, and analyze data at massive scale. Our managed solutions combine standard AWS tooling, open-source products, and custom services to free teams from worrying about the complexities of operating at Amazon scale. This lets BDT customers move beyond the engineering and operational burden associated with managing and scaling platforms, and instead focus on scaling the value they can glean from their data, both for their customers and their teams. We own the one of the biggest (largest) data lakes for Amazon where 1000’s of Amazon teams can search, share, and store EB (Exabytes) of data in a secure and seamless way; using our solutions, teams around the world can schedule/process millions of workloads on a daily basis. We provide enterprise solutions that focus on compliance, security, integrity, and cost efficiency of operating and managing EBs of Amazon data. Key job responsibilities CORE RESPONSIBILITIES: · Be hands-on with ETL to build data pipelines to support automated reporting · Interface with other technology teams to extract, transform, and load data from a wide variety of data sources · Implement data structures using best practices in data modeling, ETL/ELT processes, and SQL, Redshift. · Model data and metadata for ad-hoc and pre-built reporting · Interface with business customers, gathering requirements and delivering complete reporting solutions · Build robust and scalable data integration (ETL) pipelines using SQL, Python and Spark. · Build and deliver high quality data sets to support business analyst, data scientists, and customer reporting needs. · Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers · Participate in strategic & tactical planning discussions A day in the life As a Data Engineer, you will be working with cross-functional partners from Science, Product, SDEs, Operations and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions. Some of the key activities include: Crafting the Data Flow: Design and build data pipelines, the backbone of our data ecosystem. Ensure the integrity of the data journey by implementing robust data quality checks and monitoring processes. Architect for Insights: Translate complex business requirements into efficient data models that optimize data analysis and reporting. Automate data processing tasks to streamline workflows and improve efficiency. Become a data detective! ensuring data availability and performance Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with any ETL tool like, Informatica, ODI, SSIS, BODI, Datastage, etc. Knowledge of cloud services such as AWS or equivalent Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. 
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
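To make the "pipelines with data quality checks" responsibility above concrete, here is a minimal PySpark sketch of a batch ETL step with a simple quality gate. The bucket paths, column names, and the 1% completeness threshold are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark ETL sketch with a data quality gate.
# Paths, column names, and the 1% threshold are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

# Extract: read a day's worth of raw order events (hypothetical path).
raw = spark.read.parquet("s3://example-bucket/raw/orders/ds=2025-01-01/")

# Transform: keep completed orders and derive a revenue column.
orders = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("revenue", F.col("unit_price") * F.col("quantity"))
)

# Data quality gate: fail the run if too many rows lack a customer_id.
total = orders.count()
missing = orders.filter(F.col("customer_id").isNull()).count()
if total == 0 or missing / total > 0.01:
    raise ValueError(f"DQ check failed: {missing}/{total} rows missing customer_id")

# Load: write an analytics-ready partition (hypothetical target).
orders.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/ds=2025-01-01/")
```

Failing fast inside the pipeline, rather than discovering bad data in a dashboard later, is the usual reason a quality gate sits between the transform and load steps.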

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderābād

On-site


As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world's leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data- and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal customer and employee experience.

About the role: In this opportunity as Application Support Analyst, you will:

Provide Informatica support: the engineer will be responsible for supporting Informatica development, extractions, and loading, fixing data discrepancies, and monitoring performance.
Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes.
Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs.
Bring a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management.
Apply experience in supporting applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure.

About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes:

3 to 8+ years of experience as an Informatica developer and in Informatica support, responsible for implementing ETL methodology in data extraction, transformation, and loading.
Knowledge of ETL design: designing new or changed mappings and workflows with the team and preparing technical specifications.
Experience creating ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation.
Designing and building integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
Performing source system analysis as required.
Working with DBAs and data architects to plan and implement appropriate data partitioning strategies in the Enterprise Data Warehouse.
Implementing versioning of the ETL repository and supporting code as necessary.
Developing stored procedures, database triggers, and SQL queries where needed; implementing best practices and tuning SQL code for optimization.
Loading data from Salesforce via PowerExchange to relational databases using Informatica.
Working with XML, the XML parser, Java, and HTTP transformations within Informatica.
Experience integrating various data sources such as Oracle, SQL Server, DB2, Salesforce, and flat files in formats like fixed width, CSV, and Excel.
In-depth knowledge of and experience implementing best practices for the design and development of data warehouses using star schema and snowflake schema design concepts.
Experience in performance tuning of sources, targets, mappings, transformations, and sessions. Carried out support and development activities in a relational database environment; designed tables, procedures/functions, packages, triggers, and views in relational databases; and used SQL proficiently in database programming, including SNFL (Snowflake). #LI-VGA1

What's in it For You?

Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting?
Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
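A recurring task in an Informatica support role like this one is chasing data discrepancies between a source system and the warehouse target. Below is a minimal, hedged sketch of a row-count reconciliation using Python's DB-API via pyodbc; the DSNs, table names, and load-date predicate are illustrative assumptions, not details from the posting.

```python
# Source-vs-target reconciliation sketch for ETL support work.
# Connection DSNs, table names, and predicates are illustrative assumptions.
import pyodbc

def row_count(conn, table: str, where: str = "1=1") -> int:
    # Count rows matching a predicate; table/where come from a vetted
    # runbook here, not user input, so plain string formatting suffices.
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {where}")
    return cur.fetchone()[0]

src = pyodbc.connect("DSN=source_dw")   # hypothetical ODBC DSNs
tgt = pyodbc.connect("DSN=target_dw")

src_n = row_count(src, "sales.orders", "load_date = '2025-01-01'")
tgt_n = row_count(tgt, "dw.fact_orders", "load_date = '2025-01-01'")

if src_n != tgt_n:
    print(f"Discrepancy: source={src_n}, target={tgt_n}, delta={src_n - tgt_n}")
else:
    print(f"Counts reconcile at {src_n} rows")
```

A count check like this is usually the first triage step; column-level checksums or key-by-key comparisons follow only once a delta is confirmed.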

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Informatica MDM.
- Strong understanding of data integration processes and methodologies.
- Experience with data quality management and data governance practices.
- Familiarity with database management systems and data modeling techniques.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have minimum 2 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
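Informatica MDM centers on matching duplicate records and merging them into golden records. As a rough illustration of that idea only (this is not Informatica's match engine), here is a small Python sketch that flags likely duplicate customers using exact email matches plus a string-similarity threshold; the field names and the 0.85 cutoff are assumptions for the example.

```python
# Toy record-matching sketch to illustrate the MDM match/merge concept.
# NOT Informatica's match engine; fields and threshold are assumptions.
from difflib import SequenceMatcher
from itertools import combinations

customers = [
    {"id": 1, "name": "Asha Verma",  "email": "asha.verma@example.com"},
    {"id": 2, "name": "Asha Vermaa", "email": "asha.verma@example.com"},
    {"id": 3, "name": "Rahul Nair",  "email": "rahul.n@example.com"},
]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Pairwise comparison; real MDM tools block/index records first so they
# never pay the full O(n^2) cost shown here.
for left, right in combinations(customers, 2):
    score = similarity(left["name"], right["name"])
    if left["email"] == right["email"] or score >= 0.85:
        print(f"Likely duplicates: {left['id']} and {right['id']} (name score {score:.2f})")
```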

Posted 1 week ago

Apply

5.0 years

3 - 8 Lacs

Hyderābād

On-site


Applause is raising the bar for digital quality and employee experience. Recognized as a Top Workplace, Applause provides award-winning software testing and UX research solutions to top brands. Our fully managed services leverage a global team and the world's largest independent testing community. We improve digital experiences for global innovators like Google, Microsoft, PayPal, Starbucks, Vodafone, and BMW.

As a Business Intelligence Analyst you will be part of our global Data and Analytics team. This position will play a key role in maintaining and enhancing our enterprise business intelligence environment. This individual will form relationships with subject matter experts and business leaders across the company to help enhance business decisions and reporting capabilities by having a strong data background. The right candidate will exhibit outstanding data understanding, a drive to learn new systems and business data, and the ability to thrive in a fast-paced, and sometimes ambiguous, work environment.

Key Responsibilities:
Available to work until 10:30 PM IST to ensure effective collaboration with global teams.
Collaborating with business users and stakeholders to understand their data analysis and reporting requirements.
Identifying the key metrics, dimensions, and data sources needed for the Qlik applications.
Designing and implementing the data model within the Qlik environment. This includes extracting, transforming, and loading (ETL) data from various sources, creating data connections, and defining relationships between data tables.
Developing interactive dashboards, visualizations, and reports using Qlik's data visualization tools.
Designing and implementing user-friendly interfaces that allow users to explore data, apply filters, and drill down into details.
Writing and maintaining Qlik scripting to load and transform data from different sources. This involves data cleansing, aggregation, joining tables, and implementing complex calculations or business logic.
Writing, modifying, testing, and verifying SQL queries based on business requirements.
Optimizing Qlik applications for performance and efficiency. Identifying and resolving issues related to data model design, data loading, scripting, or visualizations to ensure optimal application responsiveness and speed.
Conducting thorough testing of Qlik applications to ensure data accuracy, functionality, and performance.
Documenting the design, development process, and application functionalities for future reference in Jira and internal training documentation.
Creating user guides and providing training to end-users on how to use the Qlik applications effectively.
Designing and building complex BI solutions that have a global perspective, but can be flexible for regional-specific requirements.
Working with colleagues across the company to obtain requirements, business logic, and technical details for BI solutions.
Determining and scheduling data jobs during optimum business hours.
Working closely and collaborating with team members on initiatives.
Maintaining high standards of data quality and integrity.
Taking the lead on projects while collaborating with team members.

Job Requirements and Preferred Skills:
5+ years working with Qlik Sense, Qlik View, or Qlik Cloud; other BI tool experience may be considered.
5+ years of Business Intelligence experience.
5+ years of SQL writing experience.
Experience with Fivetran, Snowflake, Hightouch, Informatica, or other related tools is a plus.
Strong analytical skills to troubleshoot databases and data issues and identify solutions.
A clear sense of urgency and a desire to learn.
Ability to manage communications effectively with various cultures and across multiple time zones across the globe.
Excellent organizational, analytical, problem-solving and communication skills.
Team player with solid communication and presentation skills.

Why Applause?

We're proud to cultivate an inspiring, engaging employee culture that's consistently reflected in high employee retention rates and satisfaction. Our talented team – known as Applause Nation – is set up for success with the latest collaboration and learning tools, opportunities for career advancement, and more.

We have a flexible work environment with top talent from across the globe.
Collaborate with an international team of 450+ passionate, talented co-workers.
Expand your portfolio with exciting, hands-on projects providing exposure to well-known, global brands.
Learn and grow through structured onboarding, in-house knowledge sessions and access to thousands of virtual courses available on demand.
Incorporate AI and other exciting technologies into your work, to help you prioritize and boost productivity.
Experience a supportive culture that emphasizes teamwork, innovation and transparency.
Share your voice! Contribute and integrate creative and innovative ideas across roles and departments.

Applause Core Values: As a global employee community, we strive to uphold the following core values, which are critical to business success and how we measure individual and team performance. Do you share our core values?

Be Accountable: You love to take ownership, and hold yourself and others accountable to increase empowerment and success.
Celebrate Authenticity: You love bringing your true self to work and creating genuine and trustful relationships within a diverse environment.
In It Together: You have a team-first mindset and love collaborating with your peers.
Create Value for Our Customers: You love delivering meaningful business impact and being a release partner for all aspects of digital quality.
Crush Your Goals: You always strive for excellence and constantly seek ways to be better, more effective and more efficient.

#LI-DA1
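Much of the BI work described above reduces to writing and verifying SQL before it feeds a dashboard. Here is a small, hedged sketch of sanity-checking an aggregation query against a known control total; it uses an in-memory SQLite table so the example runs standalone, and the table, columns, and control figure are invented for illustration.

```python
# Sanity-checking a BI aggregation query against a control total.
# Uses in-memory SQLite so the sketch is self-contained; the table,
# columns, and the 225.0 control figure are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('NA', 100.0), ('NA', 50.0), ('EU', 75.0);
""")

query = "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region"
rows = conn.execute(query).fetchall()

# Control total from the source system (assumed known in advance).
grand_total = sum(total for _, total in rows)
assert abs(grand_total - 225.0) < 1e-9, f"Aggregation drifted: {grand_total}"

for region, total in rows:
    print(region, total)
```

Tying every aggregate back to an independently sourced control figure is what keeps a dashboard trustworthy as upstream data and load scripts change.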

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Informatica Data Quality
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data profiling and data cleansing methodologies.
- Familiarity with database management systems and SQL.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have minimum 3 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
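Since this role emphasizes data profiling and cleansing, here is a minimal pandas sketch of the profile-then-cleanse pass that a data quality tool automates; the column names and rules are illustrative assumptions, and this is the general technique rather than Informatica Data Quality itself.

```python
# Profile-then-cleanse sketch illustrating common data quality steps.
# Column names and rules are illustrative; this is not Informatica IDQ.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@x.com", "B@X.COM", "b@x.com", "not-an-email"],
})

# Profile: null counts, duplicate keys, and a simple validity rule.
print(df.isna().sum())
print(df.duplicated(subset="customer_id").sum(), "duplicate keys")
valid_email = df["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", regex=True)
print((~valid_email).sum(), "invalid emails")

# Cleanse: standardize case, drop rows failing hard rules, dedupe on key.
clean = (
    df.assign(email=df["email"].str.lower())
      .loc[lambda d: d["customer_id"].notna() & valid_email]
      .drop_duplicates(subset="customer_id", keep="first")
)
print(clean)
```

Profiling first and cleansing second matters: the profile quantifies how bad the data is, which determines whether a rule should reject rows, repair them, or merely flag them.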

Posted 1 week ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site


We are looking for a passionate and skilled Azure Data Engineer to join our team and help design, build, and maintain scalable data solutions on the Azure cloud platform. If you're experienced in Azure Data Factory, Synapse, and Databricks and enjoy solving complex data problems, we'd love to connect!

Key Responsibilities
Develop and maintain data pipelines using Azure Data Factory, Databricks, and Azure Synapse Analytics
Design and implement robust data lake and data warehouse architectures on Azure
Write complex SQL and Python scripts for data transformation and analysis
Enable CI/CD for data pipelines and monitor pipeline performance
Collaborate with data analysts and business stakeholders to build data models and reports
Leverage tools like Azure Monitor and Log Analytics for proactive monitoring and debugging

Required Qualifications
3+ years of hands-on experience as a Data Engineer working in Azure cloud environments
Proficiency in Azure Data Factory, Synapse Analytics, Azure Data Lake (Gen2), Azure SQL, Databricks, and Microsoft Fabric
Strong programming skills in SQL, Python, and Spark
Experience in implementing CI/CD pipelines for data projects
Solid understanding of data modeling, warehousing, and data architecture principles
Familiarity with Power BI, Azure Monitor, and Log Analytics
Excellent communication and problem-solving skills

Preferred Qualifications
Microsoft Certified: Azure Data Engineer Associate (DP-203) or similar certification
Experience with real-time data processing tools like Azure Stream Analytics or Kafka
Exposure to big data platforms and large-scale analytics systems
Understanding of data governance and experience with tools such as Azure Purview, Informatica, or Data Catalog

Why Join Us?
Work with cutting-edge Azure technologies
Opportunity to be part of impactful data-driven projects
Collaborative and innovation-focused culture
Competitive salary and flexible work environment

📩 Apply now or reach out to us directly to learn more about this exciting opportunity!
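As a rough illustration of the pipeline work this role describes, here is a hedged PySpark sketch of a transformation step as it might run in a Databricks notebook: read raw JSON from an ADLS Gen2 landing path and write a partitioned Delta table. The storage account, container, paths, and column names are invented for the example.

```python
# PySpark transformation step as it might run on Databricks.
# Storage account, container, paths, and column names are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events_to_silver").getOrCreate()

# Read raw JSON events from an ADLS Gen2 landing zone (hypothetical path).
bronze = spark.read.json(
    "abfss://landing@examplestorage.dfs.core.windows.net/events/2025/01/01/"
)

# Standardize types and derive a date column for partitioning.
silver = (
    bronze.withColumn("event_ts", F.to_timestamp("event_time"))
          .withColumn("event_date", F.to_date("event_ts"))
          .dropDuplicates(["event_id"])
)

# Write as Delta, partitioned by date (Delta Lake is standard on Databricks).
(silver.write.format("delta")
       .mode("append")
       .partitionBy("event_date")
       .save("abfss://curated@examplestorage.dfs.core.windows.net/events_silver/"))
```

In a full pipeline, Azure Data Factory typically orchestrates a notebook or job running code like this, with Azure Monitor alerting on failures.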

Posted 1 week ago

Apply

3.0 years

7 - 10 Lacs

Chennai

On-site


Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key and whether you are an expert in a legacy software system or are fluent in a variety of coding languages you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients.

Full-time | Entry, Mid, Senior | Yes (occasional), Minimal (if any)

Requisition ID: R-10358182
Date posted: 06/11/2025
End Date: 06/26/2025
City: Chennai
State/Region: Tamil Nadu
Country: India
Location Type: Onsite

Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Professional, Software Development Engineering

What does a successful Professional, Data Conversions do at Fiserv? A Conversion Professional is responsible for timely and accurate conversion of new and existing Bank/Client data to Fiserv systems, from both internal and external sources. This role is responsible for providing data analysis for client projects and accommodating other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping in data to support project initiatives for new and existing banks. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines.

What will you do: When stepping in as the backup on a project, you will review the specification history, then review and understand the code that was being developed to resolve the issue or change; the same hand-off occurs on the switch back to the original developer. Today, the associate handling the project would log back in to support the effort and address the issue or change.
What you will need to have:
Bachelor's degree in programming or a related field
Minimum 3 years' relevant experience in data processing (ETL) conversions or the financial services industry
3–5 years' experience and strong knowledge of MS SQL/PSQL, MS SSIS and data warehousing concepts
Strong communication skills and the ability to provide technical information to non-technical colleagues
Team player with the ability to work independently
Experience in the full software development life cycle using agile methodologies
Good understanding of Agile methodologies and the ability to handle agile ceremonies
Efficient in reviewing, coding, testing, and debugging of application/bank programs
Able to work under pressure while resolving critical issues in the Prod environment
Good communication skills and experience in working with clients
Good understanding of the banking domain

What would be great to have:
Experience with Informatica, Power BI, MS Visual Basic, Microsoft Access and Microsoft Excel
Experience with card management systems; debit card processing is a plus
Strong communication skills and ability to provide technical information to non-technical colleagues
Ability to manage and prioritize a work queue across multiple workstreams
Team player with ability to work independently
Highest attention to detail and accuracy

Thank you for considering employment with Fiserv. Please:
Apply using your legal name
Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable)

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
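Bank data conversions like these often start with parsing legacy fixed-width extracts into rows fit for a relational staging table. Here is a small, hedged Python sketch of that step; the field layout, widths, and file content are invented for illustration.

```python
# Fixed-width bank extract -> rows ready for a staging table.
# The field layout (offsets/widths) and sample record are invented.
from dataclasses import dataclass

# (name, start, end) character offsets for each field in the record.
LAYOUT = [("account_no", 0, 10), ("holder_name", 10, 40), ("balance", 40, 52)]

@dataclass
class AccountRow:
    account_no: str
    holder_name: str
    balance: float

def parse_line(line: str) -> AccountRow:
    fields = {name: line[start:end].strip() for name, start, end in LAYOUT}
    # Balances arrive as zero-padded cents in this hypothetical layout.
    fields["balance"] = int(fields["balance"]) / 100.0
    return AccountRow(**fields)

# Build a sample record programmatically so the widths are exact.
sample = "0012345678" + "JOHN DOE".ljust(30) + "000000123456"
print(parse_line(sample))
# AccountRow(account_no='0012345678', holder_name='JOHN DOE', balance=1234.56)
```

The same layout table doubles as documentation of the source spec, which is exactly the kind of artifact a backup developer reviews when taking over a conversion mid-flight.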

Posted 1 week ago

Apply

8.0 - 11.0 years

6 - 9 Lacs

Noida

On-site


Snowflake - Senior Technical Lead
Full-time

Company Description

About Sopra Steria: Sopra Steria, a major Tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion. The world is how we shape it.

Job Description

Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/Bangalore
Education: B.E./B.Tech./MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good to have Skills: Snowpark, Data Build Tool, Finance Domain

Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
Experience in data warehousing, with at least 2 years focused on Snowflake.
Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
Familiarity with data security, compliance requirements, and governance best practices.
Experience in Python, Scala, or Java for Snowpark development is good to have.
Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities
Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
Define and enforce role-based access control (RBAC), masking policies, and object tagging.
Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
Establish best practices for dimensional modeling, data vault architecture, and data quality.
Create and maintain data dictionaries, lineage documentation, and governance standards.
Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.

Qualifications
BTech/MCA

Additional Information
At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
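To ground a few of the Snowflake features named above, here is a hedged sketch that sets up a stream-and-task pair for change capture and a column masking policy, issued through the snowflake-connector-python driver. The account parameters, object names, and column list are invented; the SQL follows standard Snowflake DDL for these features.

```python
# Snowflake Streams/Tasks and a masking policy via snowflake-connector-python.
# Account, credentials, and object names are invented; the SQL mirrors
# standard Snowflake DDL for streams, tasks, and masking policies.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Change capture: a stream records inserts/updates/deletes on the source.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# A task periodically moves captured changes into the curated table.
cur.execute("""
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO curated_orders (order_id, email, amount)
      SELECT order_id, email, amount
      FROM orders_stream
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks are created suspended

# Column-level security: mask emails except for privileged roles.
cur.execute("""
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END
""")
cur.execute(
    "ALTER TABLE curated_orders MODIFY COLUMN email SET MASKING POLICY email_mask"
)
```

The stream-plus-task pattern is Snowflake's native incremental-load loop, and attaching the masking policy at the column level is how the RBAC responsibilities above are typically enforced without changing any consuming query.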

Posted 1 week ago

Apply

0 years

3 - 7 Lacs

Noida

On-site


Req ID: 328454

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Specialist to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Mandatory Skills: Oracle SQL, PL/SQL, Informatica, Linux Scripting, Tidal
Desired Skills: Python scripting, Autosys

- Responsible for workload prioritization and management of resources, both on service requests and small projects.
- Maintaining and providing status updates to management and onshore leads.
- Expert in the architecture, design, development, and delivery of ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux scripting.
- Hands-on code development, source code control, specification writing, and production implementation.
- Participate in requirement gathering sessions; guide design and development by providing insights into data sources and through peer reviews.
- Participate in and guide data integration solutions needed to fill data gaps.
- Debug data quality issues by analyzing the upstream sources and provide guidance to the data integration team on resolutions.
- Work closely with DBAs to fix performance bottlenecks.
- Participate in technology governance groups that define policies and best practices and make design decisions in ongoing projects.
- Mentor junior developers regarding best practices and the technology stacks used to build the application.
- Work closely with Operations and Teradata administration teams for code migrations and production support.
- Provide resource and effort estimates for EDW - ETL & Extract projects.
- Experience working as part of a global development team.
- Should be able to bring innovation to provide value adds to the customer.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
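Roles that pair Informatica with Linux scripting and a scheduler such as Tidal often wrap ETL workflows in small guard scripts. Below is a minimal, hedged Python sketch of a pre-load check that a scheduler might run before triggering the workflow; the landing directory and file-naming convention are invented for the example.

```python
# Pre-load guard: verify today's source extract arrived and is non-empty
# before the scheduler starts the ETL workflow. The landing directory
# and file-naming convention are invented for illustration.
import sys
from datetime import date
from pathlib import Path

landing = Path("/data/landing")                      # hypothetical landing dir
expected = landing / f"orders_{date.today():%Y%m%d}.csv"

if not expected.exists() or expected.stat().st_size == 0:
    print(f"Missing or empty extract: {expected}", file=sys.stderr)
    sys.exit(1)  # non-zero exit tells the scheduler to hold the workflow

print(f"Extract present ({expected.stat().st_size} bytes); safe to start load")
```

Failing the job before the load starts keeps partial or empty files from propagating bad data downstream, which is cheaper than debugging the resulting quality issues after the fact.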

Posted 1 week ago

Apply

5.0 years

5 - 10 Lacs

Noida

On-site


Country/Region: IN
Requisition ID: 26208
Work Model:
Position Type:
Salary Range:
Location: INDIA - NOIDA - BIRLASOFT OFFICE
Title: Technical Specialist - Data Engg

Description: Area(s) of responsibility

Must have at least 5+ years of working experience in ETL Informatica tools.
Responsible for designing, developing, and maintaining complex data integration solutions using Informatica PowerCenter 10.x/9.5, including the PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
Strong experience in extraction, transformation and loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as the ETL tool on Oracle and SQL Server databases.
Hands-on experience in IICS and IDQ.
Extensive experience in developing stored procedures, views, and complex SQL queries using SQL Server and Oracle PL/SQL.
Gather requirements from the business, create detailed technical design documents, and model data flows for complex ETL processes using Informatica PowerCenter.
Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing and documentation.
Proficient in the integration of various data sources with multiple relational databases like Oracle 11g/10g/9i, MS SQL Server, XML, and flat files into the staging area, ODS, data warehouse and data mart.
Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, and OLAP.
Expertise in data warehouse/data mart, ODS, OLTP and OLAP implementations, teamed with project scope, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
Ability and experience in managing and coordinating with onshore-offshore teams.

Skills with M/O flag are part of Specialization
Programming/Software Development - PL3 (Functional)
Systems Integration And Build - PL3 (Functional)
Help the tribe - PL2 (Behavioural)
Think Holistically - PL2 (Behavioural)
Database Design - PL1 (Functional)
Win the Customer - PL2 (Behavioural)
Data Visualisation - PL2 (Functional)
One Birlasoft - PL2 (Behavioural)
Data Management - PL2 (Functional)
Results Matter - PL2 (Behavioural)
Data Governance - PL1 (Functional)
Get Future Ready - PL2 (Behavioural)
Requirements Definition And Management - PL2 (Functional)
Test Execution - PL2 (Functional)
Data Engineering - PL3 (Functional)
Data Modelling And Design - PL2 (Functional)
MS SQL - PL3 (Mandatory)
Python - PL3 (Mandatory)
Informatica IICS - PL3 (Mandatory)
Spark - PL2 (Optional)
Oracle SQL - PL2 (Optional)
Oracle PL/SQL - PL2 (Optional)
Informatica Power Center - PL3 (Mandatory)
Unix Shell Scripting - PL2 (Optional)
Unix - PL2 (Optional)

Posted 1 week ago

Apply

0 years

0 Lacs

Noida

On-site


Req ID: 328451

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica PowerCenter Lead with IICS to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Mandatory Skills: Oracle SQL, PL/SQL, Informatica, Linux Scripting, Tidal
Desired Skills: Python scripting

- Responsible for workload prioritization and management of resources, both on service requests and small projects.
- Maintaining and providing status updates to management and onshore leads.
- Expert in the architecture, design, development, and delivery of ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux scripting.
- Hands-on code development, source code control, specification writing, and production implementation.
- Participate in requirement gathering sessions; guide design and development by providing insights into data sources and through peer reviews.
- Participate in and guide data integration solutions needed to fill data gaps.
- Debug data quality issues by analyzing the upstream sources and provide guidance to the data integration team on resolutions.
- Work closely with DBAs to fix performance bottlenecks.
- Participate in technology governance groups that define policies and best practices and make design decisions in ongoing projects.
- Mentor junior developers regarding best practices and the technology stacks used to build the application.
- Work closely with Operations and Teradata administration teams for code migrations and production support.
- Provide resource and effort estimates for EDW - ETL & Extract projects.
- Experience working as part of a global development team.
- Should be able to bring innovation and automation to provide value adds to the customer.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here.
For Pay Transparency information, please click here.

Posted 1 week ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

  • Junior Developer
  • Informatica Developer
  • Senior Developer
  • Informatica Tech Lead
  • Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced; see the Type 2 sketch after this list)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
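
To make the slowly-changing-dimension question above concrete, here is a minimal pandas sketch of Type 2 logic (expire the current row, insert the new version). In Informatica this is typically built with lookup and update-strategy transformations or the SCD wizard rather than pandas; the column names here are illustrative assumptions.

```python
# Minimal SCD Type 2 sketch: expire the current row, add a new version.
# Column names are illustrative; Informatica implements this with lookup
# and update-strategy transformations rather than pandas.
import pandas as pd

dim = pd.DataFrame([
    {"cust_id": 1, "city": "Pune", "eff_from": "2024-01-01", "eff_to": None, "current": True},
])
incoming = {"cust_id": 1, "city": "Mumbai", "load_date": "2025-06-01"}

mask = (dim["cust_id"] == incoming["cust_id"]) & dim["current"]
changed = mask.any() and dim.loc[mask, "city"].iloc[0] != incoming["city"]

if changed:
    # Expire the existing version...
    dim.loc[mask, ["eff_to", "current"]] = [incoming["load_date"], False]
    # ...and append the new version as the current row.
    dim = pd.concat([dim, pd.DataFrame([{
        "cust_id": incoming["cust_id"], "city": incoming["city"],
        "eff_from": incoming["load_date"], "eff_to": None, "current": True,
    }])], ignore_index=True)

print(dim)
```

The key property of Type 2, unlike Type 1's overwrite, is that history survives: both the Pune and Mumbai rows remain queryable by effective date.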

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
