
2257 Informatica Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Build Your Career at Informatica
We're looking for a diverse group of collaborators who believe data has the power to improve society; adventurous minds who value solving some of the world's most challenging problems. Here, employees are encouraged to push their boldest ideas forward, united by a passion to create a world where data improves the quality of life for people and businesses everywhere.

Product Specialist
We're looking for a Product Specialist with experience in Java/C++, troubleshooting, and debugging to join our team in Bangalore. You will report to the Senior Manager, Product Specialist.

Technology You'll Use
- Troubleshooting skills for issues such as unexpected terminations, memory leaks, and performance problems
- Proficiency in reading Java and C++ source code, with exposure to analyzing heap dumps, thread dumps, jstack, pstack, strace, and GDB
- Proficiency with troubleshooting tools such as tcpdump/Wireshark and Java runtime injection, DB performance tuning, and memory profilers/analyzers

Your Role Responsibilities: Here's What You'll Do
- Work with internal development staff to solve technical product problems and provide third-line support
- Troubleshoot complex technical problems, propose probable fixes, and work with the customer and Informatica development to resolve technical roadblocks
- Support the customer's implementation process, working with the customer's implementation teams and Informatica technical support
- Knowledge of CFCN and the serverless paradigm, with the ability to understand issues in cloud-native applications, is a must-have skill
- Troubleshoot issues not only in the product but also in generic stacks such as systems, networks, and cloud ecosystems (Azure/GCP/AWS)
- The ability to develop tools for multiple endpoints (ODBC, message queues, REST, ecosystem endpoints) would be an added advantage
- Manage customer support technical issues daily, including verifying issues, isolating and diagnosing the problem, and resolving the issue

What We'd Like to See
- You are knowledgeable in cloud-architecture-based applications, with an understanding of how these applications are designed and developed for scale and performance
- You are comfortable programming in Java and C++ and have in-depth knowledge of systems (OS) programming

Role Essentials
- 3+ years of industry experience in a Java/C/C++ environment or in enterprise software sustenance/support
- Bachelor's degree in Computer Engineering/Technology

Perks & Benefits
- Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans)
- Flexible time-off policy and hybrid working practices
- Tuition reimbursement program to support your personal growth
- Equity opportunities and an employee stock purchase program (ESPP)
- Comprehensive mental health and Employee Assistance Program (EAP) benefits

We're guided by our DATA values and passionate about building and delivering solutions that accelerate data innovations. We do that by creating an inclusive culture that celebrates and supports diversity. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture.

Informatica (NYSE: INFA), an Enterprise Cloud Data Management leader, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud, which manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in over 100 countries and 85 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with LinkedIn, Twitter, and Facebook. Informatica. Where data and AI come to life.
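The thread-dump analysis skill this role calls for can be illustrated with a minimal sketch. The dump excerpt below is hypothetical sample text in the format `jstack <pid>` produces; a real investigation would capture the dump from a live JVM:

```python
from collections import Counter

# Hypothetical excerpt in the format produced by `jstack` (sample data only).
dump = """\
"main" #1 prio=5 os_prio=0 nid=0x1 runnable
   java.lang.Thread.State: RUNNABLE
"worker-1" #12 prio=5 os_prio=0 nid=0x2 waiting on condition
   java.lang.Thread.State: WAITING (parking)
"worker-2" #13 prio=5 os_prio=0 nid=0x3 waiting for monitor entry
   java.lang.Thread.State: BLOCKED (on object monitor)
"""

def thread_states(dump_text):
    """Count threads by state; many BLOCKED threads hint at lock contention."""
    states = Counter()
    for line in dump_text.splitlines():
        line = line.strip()
        if line.startswith("java.lang.Thread.State:"):
            # e.g. "java.lang.Thread.State: WAITING (parking)" -> "WAITING"
            states[line.split()[1]] += 1
    return states

print(thread_states(dump))  # Counter({'RUNNABLE': 1, 'WAITING': 1, 'BLOCKED': 1})
```

A triage script like this turns a multi-megabyte dump into a one-line summary before deeper analysis with jstack, GDB, or a memory profiler.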

Posted 4 days ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Schneider Digital is seeking an Informatica developer who will be an excellent addition to the Business Intelligence team.

Responsibilities
This role is for BI KPI portal operations and development. Domains include data integration, data extraction from legacy systems, and data warehousing, with efficient Extract/Transform/Load (ETL) workflows (Informatica PowerCenter and Informatica Cloud services).
- Strong experience in design, development, and testing of Informatica-based applications (PowerCenter 10.2 and Informatica Cloud services)
- Strong knowledge of Oracle database, PL/SQL development, and UNIX scripting
- Should understand the overall system landscape, including upstream and downstream systems
- Excellent knowledge of debugging, tuning, and optimizing the performance of database queries
- Good experience in data integration: data extraction from legacy systems and loading into Redshift and Redshift Spectrum
- Supports the module in production, resolves hot issues, and implements and deploys enhancements to the application/package
- Should be proficient in the end-to-end software development life cycle, including requirement analysis, design, development, code review, and testing
- Responsible for ensuring defect-free and on-time delivery
- Responsible for issue resolution along with corrective and preventive measures
- Should be able to manage a diverse set of stakeholders and report on key project metrics/KPIs
- Lead brainstorming sessions, provide guidance to team members, identify value-creation areas, and be responsible for quality control
- Establish standard processes and procedures and promote team collaboration and self-improvement
- Should be able to work with agile methodologies (Jira)

Qualifications
- Education: graduate engineering degree (B.E./B.Tech)
- 4+ years of experience working with Informatica, Informatica Cloud, data warehousing, and UNIX scripting
- 4+ years of experience working in agile teams
- Demonstrates a strong ability to articulate technical concepts and their implications to business partners
- Excellent communication skills

Posted 4 days ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Noida, Gurugram, Bengaluru

Work from Office

Source: Naukri

">Informatica Developer 5-10 Years Noida, Gurgaon, Bengaluru Informatica Cloud IICS Experience Required : 5-10 Years Location : Delhi NCR Job Summary: We are seeking a skilled Application Integration Developer to design, develop, and maintain robust integration solutions using Informatica Intelligent Cloud Services (IICS). This role will be pivotal in connecting critical business systems such as Salesforce, ERP platforms, RevPro, and other SaaS applications to ensure seamless data flow, quality, integrity, and high performance across all integration points. Key Responsibilities: Design, develop, and maintain scalable and reliable integration solutions using IICS (Cloud Data Integration - CDI and Cloud Application Integration - CAI). Develop complex mappings, taskflows, and processes to support diverse integration needs. Integrate multiple data sources and targets including databases, APIs, and flat files. Collaborate with business and technical teams to understand requirements and translate them into efficient integration designs. Monitor and optimize the performance of integration workflows to meet business SLAs. Ensure data quality, integrity, and consistency across all integrated systems. Troubleshoot and resolve integration issues in a timely manner. Document integration processes, configurations, and best practices. Qualifications: Proven experience with Informatica Intelligent Cloud Services (IICS), specifically CDI and CAI modules. Strong knowledge of application integration concepts and techniques. Experience integrating Salesforce, ERP systems, RevPro, and other SaaS applications. Proficient with various data sources and targets including relational databases, REST/SOAP APIs, and file-based systems. Ability to develop complex mappings, workflows, and taskflows. Strong analytical, problem-solving, and communication skills.

Posted 4 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in Azure, DWH, ETL, and presales. The role is based in Chennai, with a shift from 2.00pm to 11.00pm IST.

Key qualifications we seek in candidates include:
- Solid understanding of SQL and data modeling
- Proficiency in DWH architecture, including EDW/DM concepts and Star/Snowflake schemas
- Experience in designing and building data pipelines on the Azure Cloud stack
- Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse
- Knowledge of Azure DevOps and CI/CD pipelines
- Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects
- Exposure to on-premise transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres
- Ability to lead enterprise data strategies, including data lake delivery
- Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python
- Strong problem-solving skills with a track record of deriving business insights from large datasets
- Excellent communication skills and the ability to provide strategic direction to technical and business teams
- Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory
- Capability to explain complex data solutions clearly to senior management
- Experience in implementing, managing, and supporting data warehouse projects or applications
- Track record of leading full-cycle implementation projects related to Business Intelligence
- Strong team and stakeholder management skills
- Attention to detail, accuracy, and ability to meet tight deadlines
- Knowledge of application development, APIs, microservices, and integration components

Tools & Technology Experience Required:
- Strong hands-on experience in SQL or PL/SQL
- Proficiency in Python
- ETL: SSIS or Informatica (one of the tools is mandatory)
- BI: Power BI or Tableau (one of the tools is mandatory)

Looking for immediate joiners. Contact: Sriranjini.rammohan@dexian.com
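The star-schema concept this role asks for can be sketched with an illustrative in-memory SQLite example (the table and column names are hypothetical): one fact table holds the measures, and each dimension table holds descriptive attributes it references.

```python
import sqlite3

# A minimal star schema: one fact table referencing two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);

INSERT INTO dim_date    VALUES (1, 'Jan'), (2, 'Feb');
INSERT INTO dim_product VALUES (10, 'Widget'), (11, 'Gadget');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# A typical analytical query: join the fact to a dimension and aggregate.
rows = con.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_id = f.date_id
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(rows)  # [('Feb', 75.0), ('Jan', 150.0)]
```

A snowflake schema differs only in that the dimension tables are further normalized into sub-dimensions; the fact table and join pattern stay the same.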

Posted 4 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Introduction
We believe that every candidate brings something special to the table, including you! So, even if you feel that you're close but not an exact match, we encourage you to apply. We'd be thrilled to receive applications from exceptional individuals like yourself.

Gallagher, a global industry leader in insurance, risk management, and consulting services, boasts a team of over 50,000 professionals worldwide. Our culture, known as The Gallagher Way, is driven by shared values and a passion for excellence. At the heart of our global operations, the Gallagher Center of Excellence (GCoE) in India, founded in 2006, upholds the values of quality, innovation, and teamwork. With 10,000+ professionals across five India locations, GCoE is where knowledge-driven individuals make a significant impact and build rewarding, long-term careers.

Overview
Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data and cleansing and transforming data into insights that drive business value, through the use of data analytics, data visualization, and data modeling techniques.

How you'll make an impact
- Strong MS Power BI skills
- Hands-on experience with MS Azure SQL Database
- Excellent knowledge of ETL packages using Visual Studio or Informatica
- Data analysis and business analysis
- Database management and reporting (SQL, MS Azure SQL)
- Critical thinking and problem-solving
- Excellent verbal and written communication skills
- Review and validate customer data as it is collected
- Oversee the deployment of data to the data warehouse
- Cooperate with the IT department to deploy software and hardware upgrades that make it possible to leverage big data use cases
- Monitor analytics and metrics results
- Implement new data analysis methodologies
- Perform data profiling to identify and understand anomalies
- Good to have: Python/R

About you
- 2 to 5 years of experience in Power BI
- Technical bachelor's degree; non-technical degree holders should have 3+ years of relevant experience

Additional Information
We value inclusion and diversity. Inclusion and diversity (I&D) is a core part of our business, and it's embedded into the fabric of our organization. For more than 95 years, Gallagher has led with a commitment to sustainability and to supporting the communities where we live and work. Gallagher embraces our employees' diverse identities, experiences, and talents, allowing us to better serve our clients and communities. We see inclusion as a conscious commitment and diversity as a vital strength. By embracing diversity in all its forms, we live out The Gallagher Way to its fullest.

Gallagher believes that all persons are entitled to equal employment opportunity and prohibits any form of discrimination by its managers, employees, vendors, or customers based on race, color, religion, creed, gender (including pregnancy status), sexual orientation, gender identity (which includes transgender and other gender non-conforming individuals), gender expression, hair expression, marital status, parental status, age, national origin, ancestry, disability, medical condition, genetic information, veteran or military status, citizenship status, or any other characteristic protected (herein referred to as "protected characteristics") by applicable federal, state, or local laws. Equal employment opportunity will be extended in all aspects of the employer-employee relationship, including, but not limited to, recruitment, hiring, training, promotion, transfer, demotion, compensation, benefits, layoff, and termination. In addition, Gallagher will make reasonable accommodations to known physical or mental limitations of an otherwise qualified person with a disability, unless the accommodation would impose an undue hardship on the operation of our business.
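The data-profiling responsibility in the posting above ("perform data profiling to identify and understand anomalies") can be sketched in plain Python. The column name, records, and 2-standard-deviation outlier threshold are all illustrative choices, not Gallagher's method:

```python
def profile(rows, column):
    """Basic profile for one column: count, missing values, and outliers,
    where an outlier is more than 2 standard deviations from the mean."""
    values = [r[column] for r in rows]
    present = [v for v in values if v is not None]
    mean = sum(present) / len(present)
    std = (sum((v - mean) ** 2 for v in present) / len(present)) ** 0.5
    outliers = [v for v in present if std and abs(v - mean) > 2 * std]
    return {
        "count": len(values),
        "missing": len(values) - len(present),
        "mean": round(mean, 2),
        "outliers": outliers,
    }

# Illustrative records with one missing value and one anomalous amount.
records = [{"amount": a}
           for a in [100, 102, 98, 101, None, 99, 100, 97, 103, 100, 10000]]
print(profile(records, "amount"))
```

In practice a tool like Power BI or an ETL package would surface the same statistics, but the logic - count, missingness, dispersion, threshold - is exactly this.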

Posted 4 days ago

Apply

3.0 - 8.0 years

9 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Introduction
We believe that every candidate brings something special to the table, including you! So, even if you feel that you're close but not an exact match, we encourage you to apply. We'd be thrilled to receive applications from exceptional individuals like yourself.

Gallagher, a global industry leader in insurance, risk management, and consulting services, boasts a team of over 50,000 professionals worldwide. Our culture, known as The Gallagher Way, is driven by shared values and a passion for excellence. At the heart of our global operations, the Gallagher Center of Excellence (GCoE) in India, founded in 2006, upholds the values of quality, innovation, and teamwork. With 10,000+ professionals across five India locations, GCoE is where knowledge-driven individuals make a significant impact and build rewarding, long-term careers.

How you'll make an impact
Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data and cleansing and transforming data into insights that drive business value, through the use of data analytics, data visualization, and data modeling techniques.

About you
- Strong MS Power BI skills
- Good experience in the HR Talent Data and Analytics domain
- Excellent knowledge of SQL
- Good MS Excel skills
- Data analysis and business analysis
- Database management and reporting
- Critical thinking and problem-solving
- Excellent verbal and written communication skills
- Review and validate customer data as it is collected
- Oversee the deployment of data to the data warehouse
- Cooperate with the IT department to deploy software and hardware upgrades that make it possible to leverage big data use cases
- Monitor analytics and metrics results
- Good knowledge of ETL packages using Visual Studio or Informatica
- Implement new data analysis methodologies
- Perform data profiling to identify and understand anomalies
- Python/R
- 3+ years of relevant experience

Additional Information
We value inclusion and diversity. Inclusion and diversity (I&D) is a core part of our business, and it's embedded into the fabric of our organization. For more than 95 years, Gallagher has led with a commitment to sustainability and to supporting the communities where we live and work. Gallagher embraces our employees' diverse identities, experiences, and talents, allowing us to better serve our clients and communities. We see inclusion as a conscious commitment and diversity as a vital strength. By embracing diversity in all its forms, we live out The Gallagher Way to its fullest.

Gallagher believes that all persons are entitled to equal employment opportunity and prohibits any form of discrimination by its managers, employees, vendors, or customers based on race, color, religion, creed, gender (including pregnancy status), sexual orientation, gender identity (which includes transgender and other gender non-conforming individuals), gender expression, hair expression, marital status, parental status, age, national origin, ancestry, disability, medical condition, genetic information, veteran or military status, citizenship status, or any other characteristic protected (herein referred to as "protected characteristics") by applicable federal, state, or local laws. Equal employment opportunity will be extended in all aspects of the employer-employee relationship, including, but not limited to, recruitment, hiring, training, promotion, transfer, demotion, compensation, benefits, layoff, and termination. In addition, Gallagher will make reasonable accommodations to known physical or mental limitations of an otherwise qualified person with a disability, unless the accommodation would impose an undue hardship on the operation of our business.

Posted 4 days ago

Apply

12.0 - 17.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

- Bachelor's or Master's degree in Computer Science, Engineering, Applied Mathematics, or a related field
- 12+ years of data engineering, data design, and/or enterprise data management and analytics experience
- Should be able to architect large enterprise analytics projects with optimal solutions on the big data platform
- Should have designed, developed, and deployed complex big data ingestion jobs with contemporary data ingestion tools like Spark and Informatica PowerCenter/BDM on Databricks/Hadoop/NoSQL/MPP platforms
- Experience with dimensional modeling, data warehousing, and data mining
- 4 years of experience in building advanced analytics solutions with data from enterprise systems like ERPs (SAP, Oracle, etc.), RDBMSs, CRMs, marketing and logistics tools, etc.
- 4 years of hands-on experience with Spark, Pig/Hive, etc., and automation of data flows using Informatica BDM/DEI, NiFi, and/or Airflow/Oozie

As a Lead Data Engineer, you will be part of a team that delivers contemporary analytics solutions for all Honeywell business groups and functions. You will build strong relationships with leadership to effectively deliver contemporary data analytics solutions that contribute directly to business success. You will develop solutions on various database systems (Databricks, Snowflake, Hive, Hadoop, PostgreSQL, etc.). You will identify and implement process improvements, and you don't like to do the same thing twice, so you will automate it if you can. You are always keeping an eye on scalability, optimization, and process. You have worked with big data before: IoT data, SAP, SQL, Azure, AWS, and a bunch of other acronyms. You will work on a team including scrum masters, product owners, data architects, data engineers/designers, data scientists, and DevOps. You and your team collaborate to build products from the idea phase through launch and beyond. The software you write makes it to production in a couple of sprints. Your team will be working on creating a new platform using your experience with APIs, microservices, and platform development.

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Description Kenvue is currently recruiting for- Data Quality & Governance This position reports into the Manager, Data Quality & Governance and is based in Bengaluru, India. About Kenvue- Kenvue is the world’s largest pure-play consumer health company by revenue. Built on more than a century of heritage, our iconic brands, including Aveeno®, Johnson’s®, Listerine®, and Neutrogena® are science-backed and recommended by healthcare professionals around the world. At Kenvue, we believe in the extraordinary power of everyday care and our teams work every day to put that power in consumers’ hands and earn a place in their hearts and homes. What You Will Do ? The Data Quality & Governance Lead is responsible for overseeing and ensuring the quality, consistency, and integrity of an organization's data. This role involves developing and implementing data governance frameworks, policies, and procedures to maintain data standards across the enterprise. The position requires close collaboration with various stakeholders, including IT, business units, and senior management, to promote a culture of data excellence. Key Responsibilities- Drive the overall data quality function of the Global Supply Chain to include data profiling, data cleansing, and data enrichment of data required to support the programs goals. Leverages data management knowledge to maintain and define data quality and metadata processes, participates in development of data quality rules, thresholds, and standard metrics/quality-expectations for data elements that support critical business processes. Coordinate with global and regional data stewards, business analysts and business SMEs and data owners to define, develop, and maintain data quality rules and metadata. Continue to enhance the in-house DQ framework to include additional advanced DQ capability combined with our data platforms. Deliver DQ solutions and capabilities to data stewards, including DQ dashboards to measure their business-critical data. 
Co-Lead implementation of active data governance controls together with Data Governance Lead. Assist in the definition of governance structure, policies, processes, metrics to improve and maintain the quality and ownership of data (including data owners, stewards and administrators) Provide recommendations and best practices for data element, data definitions, business rule writing, and metadata management procedures. Ensure compliance with relevant data protection regulations (e.g., GDPR, CCPA). Identify and mitigate data-related risks. Conduct risk assessments and implement controls to protect sensitive data. Collaborate with business and IT stakeholders to ensure alignment of data governance initiatives with business objectives. Facilitate cross-functional data governance committees and working groups. Provide guidance and support to data stewards and data owners. What We Are Looking For ? Minimum 5-8 years of relevant work experience. Advanced experience in data analysis, business requirement gathering, dashboarding and reporting, preferably with Microsoft Azure, SQL. Strong IT literacy with experience in SAP ERP Systems, SAP Hana is plus. In supply chain Project experience as either project lead or core project team member. CDMA certification will be a plus. Data Quality Specialist will partner closely with data creators, data stewards, data consumers, and IT to ensure our data is usable, timely, robust, trustworthy, and compliant. This role looks at the end-to-end data flow and lifecycle, from generation/acquisition to exploitation and value, to establish global and local data quality dimensions, rules, and indicators. 
Support global and regional data owners, data stewards and SME's for data standardization, cleansing, and data migration activities Align Data Quality rules and policies to evolving data landscape and domains Leverage data quality best practices to design and maintain policies, methodologies, guidelines around data quality, data profiling, data cleansing, including KPI/metrics definitions Collaborate and support stakeholders responsible to implement relevant data quality rules and policies and ensure those rules and policies are implemented Evaluate, implement, and manage governance tool to track and manage data quality and compliance, support reconciliation and validation processes. Collaborate with business users to identify attributes that require data quality and corresponding business rules. Profile data, analyze user requirements, and translate & apply business rules to data quality rules. Extensive experience applying data quality principles to deliver high-quality data assets Experience in data quality and management platforms, query and scripting languages Experience in identifying and resolving data quality issues Experience in performing root cause analysis on data quality issues. Experience with data integration and data profiling tools, automation of processes Experience with tools such as Informatica, Colibra, DeeQu, etc. Interest and passion for working in a complex data domain and dynamic business environment, solving challenging problems, and supporting transformation Ability to work in complex, matrix environments doing hands-on work as the situation requires Ability to work with business and analytics leaders to identify solutions to data quality and successfully champion the role of data quality to preserve data integrity. Understanding of the data governance process, data stewardship, data cataloguing, data engineering, data integration, business rules management, etc. 
Ability to cultivate relationships with business partners to analyse and implement solutions to improve user experience. Ability to improve users' trust and confidence in the data products. Ability to demonstrate the direct impact of data quality checks and balances on business decisions. Ability to provide cost-saving and efficiency gain due to data quality improvement Desired Qualifications- Results driven with strong ownership and accountability. Ability to analyse problems as well as propose and implement own solutions. Strong organizational skills and an aptitude for planning and prioritization. Accuracy with attention to detail Experience to work autonomously and handle multiple projects and tasks in parallel. Ability to translate complex situations into simple working solutions. Ability to build effective partnering relationships. Excellent English written/spoken skills. Strong Team Player Framing and communicating of complex data and process. Data Quality & Governance This position reports into the Manager, Data Quality & Governance and is based in Bengaluru, India. About Kenvue- Kenvue is the world’s largest pure-play consumer health company by revenue. Built on more than a century of heritage, our iconic brands, including Aveeno®, Johnson’s®, Listerine®, and Neutrogena® are science-backed and recommended by healthcare professionals around the world. At Kenvue, we believe in the extraordinary power of everyday care and our teams work every day to put that power in consumers’ hands and earn a place in their hearts and homes. What You Will Do ? The Data Quality & Governance Lead is responsible for overseeing and ensuring the quality, consistency, and integrity of an organization's data. This role involves developing and implementing data governance frameworks, policies, and procedures to maintain data standards across the enterprise. 
The position requires close collaboration with various stakeholders, including IT, business units, and senior management, to promote a culture of data excellence. Key Responsibilities- Drive the overall data quality function of the Global Supply Chain to include data profiling, data cleansing, and data enrichment of data required to support the programs goals. Leverages data management knowledge to maintain and define data quality and metadata processes, participates in development of data quality rules, thresholds, and standard metrics/quality-expectations for data elements that support critical business processes. Coordinate with global and regional data stewards, business analysts and business SMEs and data owners to define, develop, and maintain data quality rules and metadata. Continue to enhance the in-house DQ framework to include additional advanced DQ capability combined with our data platforms. Deliver DQ solutions and capabilities to data stewards, including DQ dashboards to measure their business-critical data. Co-Lead implementation of active data governance controls together with Data Governance Lead. Assist in the definition of governance structure, policies, processes, metrics to improve and maintain the quality and ownership of data (including data owners, stewards and administrators) Provide recommendations and best practices for data element, data definitions, business rule writing, and metadata management procedures. Ensure compliance with relevant data protection regulations (e.g., GDPR, CCPA). Identify and mitigate data-related risks. Conduct risk assessments and implement controls to protect sensitive data. Collaborate with business and IT stakeholders to ensure alignment of data governance initiatives with business objectives. Facilitate cross-functional data governance committees and working groups. Provide guidance and support to data stewards and data owners. What We Are Looking For ? Minimum 5-8 years of relevant work experience. 
Advanced experience in data analysis, business requirement gathering, dashboarding and reporting, preferably with Microsoft Azure, SQL. Strong IT literacy with experience in SAP ERP Systems, SAP Hana is plus. In supply chain Project experience as either project lead or core project team member. CDMA certification will be a plus. Data Quality Specialist will partner closely with data creators, data stewards, data consumers, and IT to ensure our data is usable, timely, robust, trustworthy, and compliant. This role looks at the end-to-end data flow and lifecycle, from generation/acquisition to exploitation and value, to establish global and local data quality dimensions, rules, and indicators. Support global and regional data owners, data stewards and SME's for data standardization, cleansing, and data migration activities Align Data Quality rules and policies to evolving data landscape and domains Leverage data quality best practices to design and maintain policies, methodologies, guidelines around data quality, data profiling, data cleansing, including KPI/metrics definitions Collaborate and support stakeholders responsible to implement relevant data quality rules and policies and ensure those rules and policies are implemented Evaluate, implement, and manage governance tool to track and manage data quality and compliance, support reconciliation and validation processes. Collaborate with business users to identify attributes that require data quality and corresponding business rules. Profile data, analyze user requirements, and translate & apply business rules to data quality rules. Extensive experience applying data quality principles to deliver high-quality data assets Experience in data quality and management platforms, query and scripting languages Experience in identifying and resolving data quality issues Experience in performing root cause analysis on data quality issues. 
- Experience with data integration and data profiling tools, and with process automation.
- Experience with tools such as Informatica, Collibra, Deequ, etc.
- Interest and passion for working in a complex data domain and dynamic business environment, solving challenging problems, and supporting transformation.
- Ability to work in complex, matrix environments, doing hands-on work as the situation requires.
- Ability to work with business and analytics leaders to identify solutions to data quality issues and to successfully champion the role of data quality in preserving data integrity.
- Understanding of the data governance process, data stewardship, data cataloguing, data engineering, data integration, business rules management, etc.
- Ability to cultivate relationships with business partners to analyse and implement solutions that improve user experience.
- Ability to improve users' trust and confidence in the data products.
- Ability to demonstrate the direct impact of data quality checks and balances on business decisions.
- Ability to quantify cost savings and efficiency gains from data quality improvements.

Desired Qualifications
- Results-driven with strong ownership and accountability.
- Ability to analyse problems as well as propose and implement solutions.
- Strong organizational skills and an aptitude for planning and prioritization.
- Accuracy with attention to detail.
- Experience working autonomously and handling multiple projects and tasks in parallel.
- Ability to translate complex situations into simple working solutions.
- Ability to build effective partnering relationships.
- Excellent written and spoken English.
- Strong team player.
- Skilled at framing and communicating complex data and processes.

Primary Location: Asia Pacific-India-Karnataka-Bangalore Job Function: Digital Product Development Job Qualifications: Kenvue is currently recruiting for-
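The data-quality responsibilities this role describes (profiling, rule definition, KPI metrics) can be illustrated with a minimal sketch. Everything below is hypothetical: the record fields, rule names, and thresholds are invented, and no specific platform (Informatica, Collibra, Deequ) is assumed.

```python
# Minimal data-quality sketch: each rule is a predicate applied per record,
# and the per-rule pass rate becomes a simple DQ metric/KPI.

def completeness(field):
    # Rule: the field must be present and non-empty.
    return lambda rec: bool(rec.get(field))

def in_range(field, lo, hi):
    # Rule: a numeric field must fall within [lo, hi].
    return lambda rec: rec.get(field) is not None and lo <= rec[field] <= hi

def profile(records, rules):
    # Pass rate (0.0-1.0) for each named rule across all records.
    return {name: sum(rule(r) for r in records) / len(records)
            for name, rule in rules.items()}

# Hypothetical supply-chain material records.
records = [
    {"material_id": "M-001", "plant": "DE01", "lead_time_days": 14},
    {"material_id": "M-002", "plant": "",     "lead_time_days": 7},
    {"material_id": "M-003", "plant": "US02", "lead_time_days": 400},
]

rules = {
    "plant_populated": completeness("plant"),
    "lead_time_plausible": in_range("lead_time_days", 0, 365),
}

scores = profile(records, rules)  # e.g. feeds a DQ dashboard
```

A real DQ platform evaluates such rules at scale and trends the pass rates on dashboards; the point here is only the rule-as-predicate, pass-rate-as-KPI shape.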

Posted 4 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
We are seeking a skilled and experienced Cognos TM1 Developer with a strong background in ETL processes and Python development. The ideal candidate will be responsible for designing, developing, and supporting TM1 solutions, integrating data pipelines, and automating processes using Python. This role requires strong problem-solving skills, business acumen, and the ability to work collaboratively with cross-functional teams.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- 4+ years of hands-on experience with IBM Cognos TM1 / Planning Analytics.
- Strong knowledge of TI processes, rules, dimensions, cubes, and TM1 Web.
- Proven experience building and managing ETL pipelines (preferably with tools like Informatica, Talend, or custom scripts).
- Proficiency in Python programming for automation, data processing, and system integration.
- Experience with REST APIs, JSON/XML data formats, and data extraction from external sources.

Preferred Technical And Professional Experience
- Strong SQL knowledge and the ability to work with relational databases.
- Familiarity with Agile methodologies and version control systems (e.g., Git).
- Excellent analytical, problem-solving, and communication skills.
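The extraction side of this role (REST APIs and JSON feeding TM1 / Planning Analytics) can be sketched roughly as below. The payload shape, field names, and the flat (period, entity, account, value) row layout are all invented for illustration; this is not a real Planning Analytics API.

```python
import json

# Sketch: flatten a JSON API response into flat rows, the cell-by-cell
# shape a TM1 TurboIntegrator process would typically load.
# The payload structure here is hypothetical.

payload = json.loads("""
{
  "period": "2024-Q1",
  "entities": [
    {"code": "E100", "accounts": {"Revenue": 120.5, "COGS": 80.0}},
    {"code": "E200", "accounts": {"Revenue": 95.25, "COGS": 60.75}}
  ]
}
""")

def to_rows(doc):
    # One row per (period, entity, account) cell.
    return [(doc["period"], ent["code"], account, value)
            for ent in doc["entities"]
            for account, value in ent["accounts"].items()]

rows = to_rows(payload)
```

In practice the payload would come from an HTTP call rather than a literal string, but the flattening step is the part a TI load process cares about.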

Posted 4 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Lead Data Engineer – C12 / Assistant Vice President (India)

The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Develop and support scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 8 to 12 years’ experience implementing data-intensive solutions using agile methodologies
- Experience with relational databases and using SQL for data querying, transformation, and manipulation
- Experience modelling data for analytical consumers
- Ability to automate and streamline the build, test, and deployment of data pipelines
- Experience in cloud-native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills
- An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have)
- ETL: Hands-on experience building data pipelines.
- Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica
- Big Data: Experience with ‘big data’ platforms such as Hadoop, Hive, or Snowflake for data storage and processing
- Data Warehousing & Database Management: Expertise in data warehousing concepts, and in relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
- Data Governance: A strong grasp of principles and practice, including data quality, security, privacy, and compliance

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of the underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Experience using a job scheduler, e.g., Autosys. Exposure to Business Intelligence tools, e.g., Tableau, Power BI

Certification on any one or more of the above topics would be an advantage.
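As a rough illustration of the SQL transformation and analytical-modelling work described above, the sketch below stages raw rows and derives a small aggregate model. SQLite stands in for whichever warehouse is actually used, and the table and column names are hypothetical.

```python
import sqlite3

# ELT-style sketch: land raw records, then build an analytical model
# (daily totals per desk) with a SQL aggregate.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE raw_trades (trade_date TEXT, desk TEXT, notional REAL);
INSERT INTO raw_trades VALUES
  ('2024-05-01', 'RATES',  1000000),
  ('2024-05-01', 'RATES',   250000),
  ('2024-05-01', 'CREDIT',  500000);

-- The analytical model: one row per (date, desk).
CREATE TABLE daily_desk_totals AS
SELECT trade_date, desk, SUM(notional) AS total_notional
FROM raw_trades
GROUP BY trade_date, desk;
""")

totals = dict(con.execute("SELECT desk, total_notional FROM daily_desk_totals"))
```

The same stage-then-aggregate pattern carries over to Spark, Snowflake, or an Informatica mapping; only the execution engine changes.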
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


A career within Salesforce Consulting services, will provide you with the opportunity to help our clients leverage Salesforce technology to enhance their customer experiences, enable sustainable change, and drive results. We focus on understanding our client’s challenges and developing custom solutions powered by Salesforce to transform their sales, service and marketing capabilities by exploring data and identifying trends, managing customer life cycles, strategically building and leveraging online communities, driving employee engagement and collaboration, and connecting directly with channel partners to share goals, objectives, and activities in a secure, branded location. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional; our global leadership development framework. It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future. As a Senior Associate, You'll Work As Part Of a Team Of Problem Solvers, Helping To Solve Complex Business Issues From Strategy To Execution. PwC Professional Skills And Responsibilities For This Management Level Include But Are Not Limited To: Use feedback and reflection to develop self awareness, personal strengths and address development areas. Delegate to others to provide stretch opportunities, coaching them to deliver results. Demonstrate critical thinking and the ability to bring order to unstructured problems. Use a broad range of tools and techniques to extract insights from current industry or sector trends. Review your work and that of others for quality, accuracy and relevance. Know how and when to use tools available for a given situation and can explain the reasons for this choice. 
Seek and embrace opportunities which give exposure to different situations, environments and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. The Opportunity When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax and Business Services. In our innovative hubs, you’ll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You’ll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Business Application Consulting team, you translate customer requirements into functional configurations of Salesforce.com. As a Senior Associate, you analyze complex problems, mentor others, and maintain rigorous standards. You focus on building client relationships and developing a deeper understanding of the business context, while navigating increasingly complex situations and growing your personal brand and technical knowledge. 
Responsibilities
- Translate customer requirements into functional Salesforce configurations
- Analyze and address complex issues within client projects
- Mentor and support junior team members
- Foster and strengthen client relationships
- Gain a thorough understanding of business context
- Manage and navigate intricate scenarios
- Enhance personal brand and technical skills
- Uphold exceptional standards and quality in deliverables

What You Must Have
- Bachelor's Degree
- 4 years of experience
- 2-3 years of experience in Salesforce CPQ & Billing projects
- Experience with configuration and implementation of Salesforce CPQ Cloud
- 1-3 successful full-cycle CPQ and Billing implementations
- Thorough understanding of the Quote-to-Cash process
- Hands-on experience on the Force.com platform using Apex and Flows
- Experience working with LWC - Lightning Web Components
- Experience working with the Advanced Approvals process
- Experience with SOAP/REST/Platform Events/Streaming APIs and 3rd-party integrations

What Sets You Apart
- Bachelor of Technology preferred
- Proficient experience in Salesforce configuration, security, and mapping features to business requirements
- Experience implementing integration solutions between CRM, ERP, and financial systems (for example, Zuora, NetSuite)
- Advanced RDBMS knowledge and experience building SQL queries
- Proficient written and verbal communication skills
- Proficient experience handling large data volumes
- Producing and delivering technical and integrated solutions involving different Salesforce clouds (including but not limited to Sales, Service, Revenue, Platform) and a variety of middleware products (MuleSoft, Informatica, etc.), establishing quality and schedule

Posted 4 days ago

Apply

10.0 years

5 - 7 Lacs

Hyderābād

On-site


Manager – Global Data Management, Global Financial Services Key Responsibilities: Platforms & Technologies Serve as a subject matter expert for Global Data Management platforms, business value, product requirements, features and design Ensure alignment with leadership of programs who both provide and rely on the consumption and quality of Entity, Client, Vendor, People and Taxonomy master data Oversee delivery of product roadmaps in conjunction with business, technology, data governance and client delivery lifecycle teams Oversee successful development of feature sheets/PRDs, business requirements, user stories, KPIs and effective testing/validation strategies Confirm that business priorities are enabled by data architecture designs, proposed data workflows, and overall product functionality Ensure consistent stakeholder activities across product releases, e.g., project status updates, knowledge transfer sessions, deployment enablement, training development, communications Review key metrics to proactively assess product usage and value; communicate with leaders to escalate areas of concern and recommend corrective action Maintain knowledge of industry trends and best practices to drive continuous improvement through strategic thinking; emphasizing emerging technologies to improve product capabilities and adoption/ROI Manage and counsel team leaders and members to expand their knowledge, optimize their contributions and enhance their professional development Build and maintain strong relationships with technical teams to execute day to day responsibilities and projects Leadership Deep experience in a lead business role over large-scale technology projects across all aspects of the product development and delivery life cycle Proven effectiveness at stakeholder management, including exerting influence through eminence development, facilitation and effective collaboration with a positive attitude and presence Ability to face and deal with ambiguous problems/issues in 
a mature and professional manner. Ability to demonstrate strategic thinking and provide effective direction to team members to generate innovative ideas as part of proposed solutions. Excellent oral and written communication skills, with a focus on presenting at the executive level. Excellent organizational skills for leading multiple platforms and programs simultaneously. Make decisions independently, demonstrate executive presence, and maintain a strong command of the team. Lead recruitment, mentoring, and administrative management of high-performing leaders and individual contributors, including performance assessments. Handle the entire cycle of performance management, e.g., regular coaching sessions, due diligence, performance metrics and reviews, presentation of performance reviews, and the ability to influence stakeholders. Lead operational initiatives, e.g., enhancing roles and responsibilities on the team, developing career paths, defining retention strategies. Experience and comfort working virtually with global, cross-geography teams. Experience in a large professional services organization preferred. Qualification Required: Education and Experience Master’s degree in computer information/data management/analytics/business administration or a related field. 10+ years of experience (minimum 7+ years in a platform delivery and data management leadership role). Technical Skills AGILE methodology, Scrum, and SAFe. Master Data Management platforms (e.g. Informatica IICS, IDMC). Data Governance & Quality platforms (e.g. SAP MDG, Informatica CDGC). ERP platforms (e.g. SAP S/4HANA). CRM platforms (e.g. Salesforce). HCM platforms (e.g. SAP SuccessFactors). Taxonomy/Ontology management platforms. Data Distribution/ETL services (e.g. 
Informatica CDI, SAP Data Services) SQL and/or Oracle Generative AI, LLMs, Machine Learning a plus Proficiency in Microsoft Outlook, Teams, PowerPoint, Word (advanced expertise in Excel) Location: Hyderabad Shift Timings: 02:00 PM – 11:00 PM How you will grow At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning Center in the Hyderabad offices is an extension of the Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. 
Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. 
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 304446

Posted 4 days ago

Apply

0 years

4 - 7 Lacs

Hyderābād

On-site


Experience: 8+ years. Location: Hyderabad. Notice period: Only immediate joiners.
- 8+ years of strong ETL Informatica experience.
- Oracle, Hadoop, and MongoDB experience.
- Strong SQL/Unix knowledge.
- Experience working with RDBMS; Teradata preferred.
- Good to have Big Data/Hadoop experience.
- Good to have Python or other programming knowledge.

Your future duties and responsibilities
Required qualifications to be successful in this role: B.Tech

Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.

Posted 4 days ago

Apply

8.0 years

3 - 7 Lacs

Pune

Remote


Entity: Technology Job Family Group: IT&S Group Job Description: Enterprise Technology Engineers in bp: bp is reinventing itself, and digital capability is at the core of this vision. As a Senior Enterprise Technology Engineer, you are a digital expert bringing deep specialist expertise to bp. Enterprise Technology Engineers work on the strategic technology platforms we exploit from the market, or come with deep skills in the implementation and integration of market solutions into our overall technology landscape. You will bring a broad base of digital technical knowledge and a strong understanding of software delivery principles. You will be familiar with lifecycle methods, with Agile delivery and the DevOps approach at the core. You will be skilled in the application of approaches such as Site Reliability Engineering in the delivery and operation of the technologies you deliver, working as part of multidisciplinary squads. You thrive in a culture of continuous improvement within teams, encouraging and empowering innovation and the delivery of changes that optimise operational efficiency and user experience. You are curious and improve your skills through continuous learning of new technologies, trends and methods, applying the knowledge gained to improve bp standards and the capabilities of the Engineering Community. You coach others in the field to drive improved performance across our business. You embrace a culture of change and agility, evolving continuously and adapting to our changing world. You are an effective teammate, looking beyond your own area and organizational boundaries to consider the bigger picture and the perspective of others, while understanding cultural differences. You continually enhance your self-awareness and seek guidance from others on your impact and effectiveness. Well organized, you balance proactive and reactive approaches and multiple priorities to complete tasks on time.
You apply judgment and common sense – you use insight and good judgment to inform actions and respond to situations as they arise. Key Accountabilities Technical lead for the invoice processing application eBilling. Managing reliability of service and delivering to agreed SLAs. Collaborating with platform and security teams on patching and vulnerability management. The safety of our people and our customers is our highest priority; the role will advocate and lead in this and promote security and safety in everything that we do. Work as part of evolving multidisciplinary teams which may include Software Engineers, Enterprise Technology Engineers, Designers, SecOps, and Product Owners to deliver value through the application of specialist skills. Work with vendors and partners providing market solutions to optimize the usage and value which can be delivered from the appropriate technology platform. Ensure operational integrity of what you build, assuring operational compliance with architectural and security standards, as well as compliance and policy controls defined by Strategy. Mentor others and become a conduit to connect the broader organization. Define and document standard run books and operating procedures. Create and maintain system information and architecture diagrams. Education A first degree from a recognized institute of higher learning, ideally computer science or engineering based.
Essential Experience and Job Requirements
- Total 8+ years' experience
- Good knowledge of the Order to Cash process (preferably within the Aviation domain)
- Informatica ETL
- MS SQL
- Data Integration Patterns (preferably with XML invoice processing)
- Experience leading teams
- Demonstrable knowledge of modern Service Delivery methods, from Site Reliability Engineering to traditional ITIL, and understanding of product-based delivery
- Strong communication skills and a high ‘EQ’, with the ability to operate across complex business environments and with collaborators up to senior executive level

Desirable criteria
- Project Management experience delivering IT-led projects
- Broad experience contributing and collaborating to design, plan, implement, maintain, and document services and solutions
- Development experience in one or more object-oriented or applicable programming languages (e.g. Python, Go, Java, C/C++)

Skills that set you apart
- Passion for mentoring and coaching engineers in both technical and soft skills
- A focus on delighting customers with outstanding user experiences and customer service
- Comfort operating in an environment that is loosely coupled but tightly aligned toward a shared vision

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.
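The XML invoice processing this eBilling role centres on can be sketched minimally as below; the invoice schema, element names, and amounts are invented for illustration and are not the actual eBilling format.

```python
import xml.etree.ElementTree as ET

# Sketch: parse an XML invoice and total its line amounts, the kind of
# validation step an invoice-processing pipeline runs before posting.
INVOICE_XML = """
<invoice number="INV-1001" currency="USD">
  <line sku="FUEL-A1" qty="2" unitPrice="150.00"/>
  <line sku="FEE-APT" qty="1" unitPrice="75.50"/>
</invoice>
"""

def invoice_total(xml_text):
    root = ET.fromstring(xml_text)
    return round(sum(float(line.get("qty")) * float(line.get("unitPrice"))
                     for line in root.findall("line")), 2)

total = invoice_total(INVOICE_XML)
```

A production integration would validate against a schema and reconcile the computed total against a declared total element before loading, but the parse-and-aggregate core is the same.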
Travel Requirement Up to 10% travel should be expected with this role Relocation Assistance: This role is eligible for relocation within country Remote Type: This position is a hybrid of office/remote working Skills: Agility core practices, Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more} Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 4 days ago

Apply

0 years

4 - 5 Lacs

Bengaluru

On-site


Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data management focus on organising and maintaining data to enable accuracy and accessibility for effective decision-making. These individuals handle data governance, quality control, and data integration to support business operations. In data governance at PwC, you will focus on establishing and maintaining policies and procedures to optimise the quality, integrity, and security of data. You will be responsible for optimising data management processes and mitigating risks associated with data usage. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities / Expected Skills
· Hands-on experience in IICS
· Shell scripting and PL/SQL experience
· Experience in Oracle DB and Linux
· Exposure to Azure cloud services
· Knowledge of SAP BusinessObjects

Mandatory skill sets: Informatica IICS, PL/SQL, Shell Scripting
Preferred skill sets: BusinessObjects (preferred)
Years of experience required: 7-12 years
Education qualification: BE, B.Tech, ME, M.Tech, MCA (60% and above)
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: ETL (Informatica)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Process Management (BPM), Coaching and Feedback, Communication, Corporate Governance, Creativity, Data Access Control, Database Administration, Data Governance Training, Data Processing, Data Processor, Data Quality, Data Quality Assessment, Data Quality Improvement Plans (DQIP), Data Stewardship, Data Stewardship Best Practices, Data Stewardship Frameworks, Data Warehouse Governance, Data Warehousing Optimization, Embracing Change, Emotional Regulation, Empathy {+ 22 more}
Desired Languages (If blank, desired languages not specified)

Posted 4 days ago

Apply

2.0 years

6 - 8 Lacs

Bengaluru

On-site


Job Description Analyst / Specialist I - EDB Solutions, Comms Others The Future Begins Here At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, India’s epicenter of innovation, has been selected as home to Takeda’s recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement. At Takeda’s ICC we Unite in Diversity Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for the backgrounds and abilities they bring to our company. We are continuously improving our collaborators’ journey in Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team. OBJECTIVES/ PURPOSE As a Data Platform Solutions Support Engineer you will be responsible for designing, managing, and optimizing data integration and Data Platform Solutions. You will be part of a team of data engineers, ensuring best practices and fostering development within the team. You will work closely with our data engineering and analytics teams to ensure seamless data flow and integration across various data sources and destinations. ACCOUNTABILITIES Integrate various data sources, including on-premises and cloud-based systems.
Optimize and troubleshoot data integration processes to ensure efficiency and reliability. Collaborate with data engineers, analysts, and other stakeholders to understand business requirements and translate them into technical solutions. Ensure data quality and integrity through rigorous testing and validation processes. Perform data mapping, transformation, and cleansing to meet business needs. Participate in strategic planning and decision-making processes related to data integration and engineering initiatives. EDUCATION, BEHAVIOURAL COMPETENCIES AND SKILLS : Essential Bachelor’s / Master’s degree in Computer Science, Information Technology, or a related field. 2+ years of proven experience working with data integration tools, including supporting Data Platform Solutions Proficiency in SQL and experience with relational databases such as SQL Server, Oracle, or MySQL. Strong understanding of data integration concepts, ETL processes, and data warehousing. Experience with cloud platforms such as AWS, Azure, or Google Cloud. Excellent problem-solving skills and attention to detail. Strong leadership skills with experience leading and managing teams. Excellent communication and collaboration skills. Ability to work independently and drive strategic initiatives. ADDITIONAL INFORMATION ( Preferred Skills ) Experience with Informatica PowerCenter and other Data tools Knowledge of scripting languages such as Python or Shell Scripting. Experience in Agile development methodologies. What Takeda Can Offer You Takeda is certified as a Top Employer, not only in India, but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bangalore will give you access to high-end technology, continuous training and a diverse and inclusive network of colleagues who will support your career growth. 
Benefits It is our priority to provide competitive compensation and a benefit package that bridges your personal life with your professional career. Amongst our benefits are: Competitive Salary + Performance Annual Bonus Flexible work environment, including hybrid working Comprehensive Healthcare Insurance Plans for self, spouse, and children Group Term Life Insurance and Group Accident Insurance programs Health & Wellness programs Employee Assistance Program 3 days of leave every year for Voluntary Service in additional to Humanitarian Leaves Broad Variety of learning platforms Diversity, Equity, and Inclusion Programs Reimbursements – Home Internet & Mobile Phone Employee Referral Program Leaves – Paternity Leave (4 Weeks) , Maternity Leave (up to 26 weeks), Bereavement Leave (5 days) About ICC in Takeda Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization. Locations IND - Bengaluru Worker Type Employee Worker Sub-Type Regular Time Type Full time
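The data mapping, transformation, and cleansing duties described above can be sketched in plain Python. This is a minimal illustration only; the field names, default values, and date format are hypothetical, not Takeda's actual schema:

```python
from datetime import datetime

def cleanse_record(raw: dict) -> dict:
    """Normalize one raw source record before loading downstream.
    Field names and the source date format are invented for illustration."""
    return {
        "patient_id": raw["patient_id"].strip().upper(),           # trim + canonical case
        "country": raw.get("country", "UNKNOWN").strip().title(),  # default missing values
        "visit_date": datetime.strptime(raw["visit_date"], "%d/%m/%Y").date().isoformat(),
    }

cleaned = cleanse_record({"patient_id": " ab123 ", "country": "india", "visit_date": "05/06/2024"})
```

In a real pipeline, the same kind of function would be wrapped in the rigorous testing and validation the posting calls for, so that bad records are caught before they reach consumers.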

Posted 4 days ago

Apply

5.0 - 7.0 years

3 - 6 Lacs

Noida

On-site


Req ID: 327439 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Technical Solns.Arch. Specialist Advisor to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence and our ability to help our clients stay ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here. NTT DATA, Inc. currently seeks an Enterprise Architect to join our team in India. Reporting to the Director of Software Architecture, with matrix reporting to the client's Architecture Leadership, the Architects work closely with IT and business leadership to align technical solutions with business needs. This role will bridge the gap between systems, design, technical architecture, business processes, support, and operations. The Technical Architect will be responsible for providing end-to-end conceptual solutions to business case ideas and present them to stakeholders. In doing so, the solutions architect will partner closely with business stakeholders to understand the vision and translate it to a conceptual architecture. The client for this engagement is a large healthcare insurance payer in New England. NTT DATA has an established and preferred vendor relationship with the client. 
Duties/Responsibilities: Provide leadership for planning, designing, and implementing end-to-end applications in enterprise architecture Acquire a working knowledge of the architecture of existing on-premises systems, and assist in documenting them in the SharePoint repository Contribute to enterprise architecture strategy and road map Evaluate emerging technologies, prototype solutions, and evaluate processes for inclusion in the technology road map. Identify system inter-dependence and develop migration plans. Provide architectural input to projects, ensuring requirements are aligned with IT and business architecture. Identify opportunities to reduce overall IT costs and evaluate ongoing investment in technology to promote innovation and advancement of the client's strategic goals and mission. Other duties and projects as assigned. Basic Qualifications: 5-7 years' experience as an IT professional with a deep understanding of n-Tier system architecture Minimum 1 year of experience designing end-to-end Solution/Technical Architecture Skills Required: Strong working knowledge of portals, Java EE, and Angular/Ionic technologies, SOAP/REST Services and knowledge of microservices Working knowledge of JMS based applications integration concepts, tools, and technologies Working knowledge of data integration concepts and relational databases including Oracle, SQL Server Experience with US Healthcare Payor industry and health care data standards (HIPAA, HL7) is strongly preferred Solid understanding of big data concepts, tools, and technologies Solid understanding of security concepts across the tiers including applications, integration, and data Demonstrated ability in setting corporate architectural directions aligned with strategic business need Working knowledge of EA modelling tools Nice to have: Azure or AWS cloud integration with on-prem systems UX and UI design experience AI/ML technology understanding Architecture Governance rationalization and toolsets Strong knowledge in
microservices architecture Working knowledge of application integration tools such as TIBCO, Mulesoft, Oracle SOA etc Working knowledge of Data integration tools such as Informatica, Datastage About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 4 days ago

Apply

10.0 years

0 Lacs

West Bengal

On-site


Requirements 10+ years of strong experience with data transformation & ETL on large data sets Experience with designing customer-centric datasets (i.e., CRM, Call Center, Marketing, Offline, Point of Sale, etc.) 5+ years of Data Modeling experience (i.e., Relational, Dimensional, Columnar, Big Data) 5+ years of complex SQL or NoSQL experience Experience in advanced Data Warehouse concepts Experience in industry ETL tools (i.e., Informatica, Unifi) Experience with Business Requirements definition and management, structured analysis, process design, use case documentation Experience with Reporting Technologies (i.e., Tableau, PowerBI) Experience in professional software development Exceptional organizational skills and the ability to multi-task across simultaneous customer projects Strong verbal & written communication skills to interface with the Sales team and lead customers to successful outcomes Must be self-managed, proactive and customer focused Degree in Computer Science, Information Systems, Data Science, or related field
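The dimensional-modeling and SQL requirements above come down to querying fact tables against dimensions. A minimal, self-contained star-schema sketch using SQLite; the table and column names are invented for illustration, not any client's schema:

```python
import sqlite3

# Star-schema sketch: one fact table joined to a dimension, aggregated per channel.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_channel (channel_id INTEGER PRIMARY KEY, channel_name TEXT);
    CREATE TABLE fact_contact (contact_id INTEGER, channel_id INTEGER, revenue REAL);
    INSERT INTO dim_channel VALUES (1, 'Call Center'), (2, 'Point of Sale');
    INSERT INTO fact_contact VALUES (10, 1, 50.0), (11, 1, 70.0), (12, 2, 30.0);
""")
rows = conn.execute("""
    SELECT d.channel_name, SUM(f.revenue) AS total_revenue
    FROM fact_contact f
    JOIN dim_channel d ON d.channel_id = f.channel_id
    GROUP BY d.channel_name
    ORDER BY total_revenue DESC
""").fetchall()
```

The same fact/dimension shape scales up to the CRM, call-center, and point-of-sale datasets the posting mentions, whether the warehouse is relational or columnar.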

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


As a Data Engineer , you are required to: Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments while ensuring data integrity, consistency, and accuracy across the entire data pipeline. Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation. Design the structure of databases and data storage systems, including the design of schemas, tables, and relationships between datasets to enable efficient querying. Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable. Stay up-to-date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes. Qualification : Bachelor's or Master's in Computer Science & Engineering, or equivalent. Professional Degree in Data Science, Engineering is desirable. Experience level : At least 3 - 5 years hands-on experience in Data Engineering, ETL. 
Desired Knowledge & Experience : Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming Knowledge of Spark internals: Catalyst/Tungsten/Photon Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot Test: pytest, Great Expectations CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction Languages: Python/Functional Programming (FP) SQL: T-SQL/Spark SQL/HiveQL Storage: Data Lake and Big Data Storage Design Additionally, it is helpful to know the basics of: Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow Languages: Scala, Java NoSQL: Cosmos, Mongo, Cassandra Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model SQL Server: T-SQL, Stored Procedures Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka Data Catalog: Azure Purview, Apache Atlas, Informatica Required Soft skills & Other Capabilities : Great attention to detail and good analytical abilities. Good planning and organizational skills Collaborative approach to sharing ideas and finding solutions Ability to work independently and also in a global team environment.
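The "Data Skew" item in the Big Data Design list can be illustrated without a Spark cluster: check how evenly keys hash across partitions. This is a toy stand-in for Spark's hash partitioner, not Databricks tooling; the keys and partition count are made up:

```python
from collections import Counter

def partition_skew(keys, num_partitions):
    """Ratio of the largest hash partition to the average partition size.
    ~1.0 means well balanced; large values flag hot keys (data skew)."""
    sizes = Counter(hash(k) % num_partitions for k in keys)
    return max(sizes.values()) / (len(keys) / num_partitions)

skew = partition_skew([1] * 90 + [2] * 10, num_partitions=4)  # one dominant key
```

In Spark itself the usual remedies for a high ratio are salting the hot key or repartitioning, which is what the "Distribution" and "Compaction" items in the list refer to.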

Posted 4 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


🚨 Urgent Hiring: Informatica IDMC Developers | 8–24 LPA | 30-Day Joiners Preferred We are urgently looking for Informatica IDMC (Informatica Data Management Cloud) Developers to join a fast-paced data integration and transformation initiative for a leading insurance client. This is a high-priority requirement to be fulfilled within 30 days. 📍 Pay Scale: ₹8 LPA – ₹24 LPA 📧 Apply Now: careers@vidunaya.com 🔧 Required Skills: 4+ years of experience with Informatica tools Minimum 1 year of hands-on experience with Informatica IDMC Strong understanding of cloud platforms – AWS, Azure, or GCP Proficient in REST APIs, data warehousing, and SQL Familiarity with insurance domain data models is a plus Strong communication and analytical skills ✅ Preferred Qualifications: Informatica IDMC certification Experience with Agile methodologies and DevOps tools 📢 Immediate joiners or those available within 30 days will be prioritized. Don’t miss this opportunity to work on a high-impact project with a dynamic team! 📨 To apply, send your resume to: careers@vidunaya.com #Informatica #IDMC #HiringNow #DataIntegration #CloudJobs #InsuranceTech #UrgentHiring #DevOps #SQL #DataEngineering #AWS #Azure #GCP

Posted 4 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Sanofi We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people’s lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions. Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives. Who You Are: You are a dynamic Data Engineer interested in challenging the status quo to design and develop globally scalable solutions that are needed by Sanofi’s advanced analytics, AI and ML initiatives for the betterment of our global patients and customers. You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business use needs. You have a keen eye for improvement opportunities while continuing to fully comply with all data quality, security, and governance standards. Our vision for digital, data analytics and AI Join us on our journey in enabling Sanofi’s Digital Transformation through becoming an AI first organization. This means: AI Factory - Versatile Teams Operating in Cross Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
Leading Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack. World Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skillsets There are multiple vacancies across our Digital profiles and NA region. Further assessments will be completed to determine specific function and level of hired candidates. Job Highlights Propose and establish technical designs to meet business and technical requirements Develop and maintain data engineering solutions based on requirements and design specifications using appropriate tools and technologies Create data pipelines / ETL pipelines and optimize performance Test and validate developed solutions to ensure they meet requirements Create design and development documentation based on standards for knowledge transfer, training, and maintenance Work with business and product teams to understand requirements, and translate them into technical needs Adhere to and promote best practices and standards for code management, automated testing, and deployments Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases Develop automated tests for CI/CD pipelines Gather/organize large & complex data assets, and perform relevant analysis Conduct peer reviews for quality, consistency, and rigor for production level solutions Actively contribute to the Data Engineering community and define leading practices and frameworks Communicate results and findings in a clear, structured manner to stakeholders Remain up to date on the company’s standards, industry practices and emerging technologies Key Functional Requirements & Qualifications Experience working with cross-functional teams to solve complex data architecture and engineering problems Demonstrated ability to learn new data and software engineering technologies in a short amount of time Good understanding of agile/scrum development
processes and concepts Able to work in a fast-paced, constantly evolving environment and manage multiple priorities Strong technical analysis and problem-solving skills related to data and technology solutions Excellent written, verbal, and interpersonal skills with ability to communicate ideas, concepts and solutions to peers and leaders Pragmatic and capable of solving complex issues, with technical intuition and attention to detail Service-oriented, flexible, and approachable team player Fluent in English (Other languages a plus) Key Technical Requirements & Qualifications Bachelor’s Degree or equivalent in Computer Science, Engineering, or relevant field 4 to 5+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or comparable role with relevant technologies and tools, such as Spark/Scala, Informatica/IICS/Dbt Understanding of data structures and algorithms Working knowledge of scripting languages (Python, Shell scripting) Experience in cloud-based data platforms (Snowflake is a plus) Experience with job scheduling and orchestration (Airflow is a plus) Good knowledge of SQL and relational databases technologies/concepts Experience working with data models and query tuning Nice To Haves Experience working in life sciences/pharmaceutical industry is a plus Familiarity with data ingestion through batch, near real-time, and streaming environments Familiarity with data warehouse concepts and architectures (data mesh a plus) Familiarity with Source Code Management Tools (GitHub a plus) Pursue Progress Discover Extraordinary Better is out there. Better medications, better outcomes, better science. But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com! 
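The "automated tests for CI/CD pipelines" and data-pipeline items above often hinge on one property: load steps should be idempotent, so that replaying a batch after a failure cannot duplicate data. A minimal sketch with an invented schema, not Sanofi's actual pipelines:

```python
def upsert(target: dict, batch: list) -> dict:
    """Keyed merge into the target: re-running the same batch is a no-op,
    so a failed-and-retried pipeline run cannot duplicate rows."""
    for row in batch:
        target[row["id"]] = row
    return target

state = {}
batch = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
upsert(state, batch)
upsert(state, batch)  # replay: state is unchanged
```

In practice the same property is implemented with MERGE/upsert statements in Snowflake or the warehouse of choice, and asserted by exactly this kind of replay test in CI.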

Posted 4 days ago

Apply

7.0 - 15.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape. Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics. Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases. Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes. Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security. Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes. Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions. Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces. Performance Tuning: Optimize database performance through tuning, indexing, and query optimization. Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA) Requirements Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance. Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server). Minimum 7 to 15 years of experience in data architecture or related roles. Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow). Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of data integration tools (e.g., Informatica, Talend, FiveTran, Meltano).
Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery). Experience with data governance frameworks and tools.
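The performance-tuning responsibility above (tuning, indexing, query optimization) can be demonstrated end-to-end with SQLite's query planner; the same principle carries over to MySQL, PostgreSQL, Oracle, or SQL Server, though the tooling differs. Table and index names here are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

# Without an index the planner must scan the whole table...
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

# ...after indexing the filter column it switches to an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
```

Reading the plan before and after a change is the basic loop of query optimization, regardless of the engine.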

Posted 4 days ago

Apply

4.0 - 6.0 years

0 Lacs

Kapra, Telangana, India

On-site


Note: Candidates should be ready to travel to Johannesburg (South Africa) for an initial period of 3-4 months and must have a valid passport. Job Description We are seeking a skilled Data Masking Engineer with 4-6 years of experience in SQL Server and Redgate tools to design, implement, and manage data masking solutions. The ideal candidate will ensure sensitive data is protected while maintaining database usability for development, testing, and analytics. Key Responsibilities Design and implement data masking strategies for SQL Server databases to comply with security and privacy regulations (GDPR, HIPAA, etc.). Use Redgate Data Masker and other tools to anonymize sensitive data while preserving referential integrity. Develop and maintain masking rules, scripts, and automation workflows for efficient data obfuscation. Collaborate with DBAs, developers, and security teams to identify sensitive data fields and define masking policies. Validate masked data to ensure consistency, usability, and compliance with business requirements. Troubleshoot and optimize masking processes to minimize performance impact on production and non-production environments. Document masking procedures, policies, and best practices for internal teams. Stay updated with Redgate tool updates, SQL Server features, and data security trends. Required Skills & Qualifications 4-6 years of hands-on experience in SQL Server databases Strong expertise in Redgate Data Masker or similar data masking tools (e.g., Delphix, Informatica). Proficiency in T-SQL, PowerShell, or Python for scripting and automation. Knowledge of data privacy laws (GDPR, CCPA) and secure data handling practices. Experience with SQL Server security features (Dynamic Data Masking, Always Encrypted, etc.) is a plus. Familiarity with DevOps/CI-CD pipelines for automated masking in development/test environments. Strong analytical skills to ensure masked data remains realistic for testing.
Preferred Qualifications Redgate or Microsoft SQL Server certifications. Experience with SQL Server Integration Services (SSIS) or ETL processes. Knowledge of cloud databases (Azure SQL, AWS RDS) and their masking solutions (ref:hirist.tech)
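The core idea behind "anonymize sensitive data while preserving referential integrity" is deterministic masking: the same input always yields the same token, so masked keys still join across tables. A minimal Python sketch, not Redgate Data Masker's actual algorithm; the salt and prefix are illustrative:

```python
import hashlib

def mask_id(value: str, salt: str = "demo-salt") -> str:
    """Deterministic pseudonym: identical inputs always produce the same token,
    so masked foreign keys still join across tables. Salt and prefix are invented."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "CUST_" + digest[:10].upper()

a = mask_id("4111-1111")
b = mask_id("4111-1111")  # same source value appearing in another table
c = mask_id("4222-2222")
```

A production rule set would also keep the salt secret and rotate it per environment, since a leaked salt allows dictionary attacks against low-entropy identifiers.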

Posted 4 days ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site


Overview Join a global organization with 82,000+ employees around the world, as an ETL Databricks Developer based at IQVIA Bangalore. You will be part of IQVIA’s world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products. The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred. Candidates with Databricks certification are preferred. Key Responsibilities: Design, develop, and maintain data integration solutions using Databricks. Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions. Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes. Optimize and troubleshoot Databricks workflows and performance issues. Ensure data quality and integrity throughout the data lifecycle. Provide technical guidance and mentorship to junior developers. Stay updated with the latest industry trends and best practices in data integration and Databricks. Required Qualifications: Bachelor’s degree in Computer Science or equivalent. Minimum of 5 years of hands-on experience with Databricks. Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS). Well-versed in data warehousing and data lake concepts. Proficient in SQL and Python for data manipulation and analysis. Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services. Excellent problem-solving skills. Strong communication and collaboration skills. Preferred Qualifications: Certified Databricks Engineer. Experience in the life sciences domain. Familiarity with Reltio or similar MDM (Master Data Management) tools.
Experience with data governance and data security best practices. IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Join us as a Data Engineering Lead This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital first customer experiences You’ll be simplifying the bank through developing innovative data driven solutions, inspiring to be commercially successful through insight, and keeping our customers’ and the bank’s data safe and secure Participating actively in the data engineering community, you’ll deliver opportunities to support our strategic direction while building your network across the bank We’re recruiting for multiple roles across a range of levels, up to and including experienced managers What you'll do We’ll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You’ll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering, leading a team of data engineers.
We’ll Also Expect You To Be Working with Data Scientists and Analytics Labs to translate analytical model code to well tested production ready code Helping to define common coding standards and model monitoring performance best practices Owning and delivering the automation of data engineering pipelines through the removal of manual stages Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight Leading and delivering data engineering strategies to build a scalable data architecture and customer feature rich dataset for data scientists Leading and developing solutions for streaming data ingestion and transformations in line with streaming strategy The skills you'll need To be successful in this role, you’ll need to be an expert level programmer and data engineer with a qualification in Computer Science or Software Engineering. You’ll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large scale data. 
We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure You’ll Also Demonstrate Knowledge of core computer science concepts such as common data structures and algorithms, profiling or optimisation An understanding of machine-learning, information retrieval or recommendation systems Good working knowledge of CI/CD tools Knowledge of programming languages in data engineering such as Python or PySpark, SQL, Java, and Scala An understanding of Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow Knowledge of messaging, event or streaming technology such as Apache Kafka Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling Extensive experience using RDBMS, ETL pipelines, Python, Hadoop and SQL
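The streaming ingestion and transformation responsibilities above typically mean windowed aggregation over an event stream. A toy tumbling-window count in plain Python, standing in for what Spark Structured Streaming or Kafka Streams would do at scale; the event tuples and window size are invented:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count (timestamp, key) events per fixed, non-overlapping window."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # bucket the timestamp
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "login"), (42, "login"), (61, "logout"), (65, "login")]
result = tumbling_window_counts(events, window_seconds=60)
```

Real streaming engines add what this sketch omits: watermarks for late-arriving events, state checkpointing, and exactly-once delivery from sources like Kafka.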

Posted 4 days ago


Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

  • Junior Developer
  • Informatica Developer
  • Senior Developer
  • Informatica Tech Lead
  • Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis
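These skills typically come together in a simple extract-transform-load flow. As a minimal, illustrative sketch (not Informatica itself), the following Python snippet uses the standard-library sqlite3 module; the table and column names are hypothetical:

```python
import sqlite3

# Minimal ETL sketch: extract raw staged rows, transform (normalise casing,
# default null amounts), and load into a target table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source (staging) table with raw data, including a null and inconsistent casing.
cur.execute("CREATE TABLE stg_orders (order_id INT, customer TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, "alice", 100.0), (2, "BOB", None), (3, "carol", 250.5)],
)

# Target (warehouse) table.
cur.execute("CREATE TABLE dw_orders (order_id INT, customer TEXT, amount REAL)")

# Transform + load in one SQL statement: uppercase names, COALESCE null amounts.
cur.execute(
    """
    INSERT INTO dw_orders
    SELECT order_id,
           UPPER(customer),
           COALESCE(amount, 0.0)
    FROM stg_orders
    """
)
conn.commit()

loaded = cur.execute("SELECT * FROM dw_orders ORDER BY order_id").fetchall()
print(loaded)  # [(1, 'ALICE', 100.0), (2, 'BOB', 0.0), (3, 'CAROL', 250.5)]
```

In a real Informatica mapping the same cleansing would be configured as transformations between source and target definitions rather than hand-written SQL.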

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
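One distinction the questions above probe, filter versus router, can be sketched conceptually in plain Python (Informatica transformations are configured in the Designer GUI, so this is only an analogy with made-up data): a filter applies one condition and discards non-matching rows, while a router evaluates several condition groups and sends each row to a matching output group, with a default group for the rest.

```python
rows = [
    {"id": 1, "region": "APAC", "amount": 120},
    {"id": 2, "region": "EMEA", "amount": 80},
    {"id": 3, "region": "APAC", "amount": 40},
    {"id": 4, "region": "AMER", "amount": 200},
]

# Filter transformation analogue: one condition, one output; rejects are dropped.
filtered = [r for r in rows if r["amount"] > 100]

# Router transformation analogue: several groups plus a default group.
def route(row):
    if row["region"] == "APAC":
        return "apac_group"
    if row["amount"] > 150:
        return "high_value_group"
    return "default_group"

groups = {}
for r in rows:
    groups.setdefault(route(r), []).append(r)

print([r["id"] for r in filtered])  # rows 1 and 4 pass the filter
print({g: [r["id"] for r in rs] for g, rs in groups.items()})
```

Note that unlike this sketch, an Informatica Router can capture unmatched rows in its default group, whereas a Filter silently drops them unless rejects are handled separately.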

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!
