4985 Data Governance Jobs - Page 20

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

1.0 - 2.0 years

6 - 7 Lacs

Pune

Work from Office

We're seeking a future team member for the role of Associate, Data Management & Quantitative Analysis II to join our BNY Institute team, BNY's macro strategy insights platform. This role is located in Pune, MH (hybrid). In this role, you'll make an impact in the following ways: Generation of the shared data toolkit, which comprises BNY's proprietary iFlow indicators and Investment & Wealth Management (IWM) forecasts. Help own the health and evolution of our Shared Data Toolkit, a modular data library that ingests, pre-processes, and delivers clean time-series data and model components to research, trading, and client-facing teams. Toolkit Development & Maintenance: extend core utilities for data cleansing, feature engineering, and model integration; enforce documentation, versioning, and unit-test standards across the toolkit; refactor legacy code into reusable, production-ready modules. R&D Collaboration: partner with quants to onboard new macro, market-microstructure, and behavioral datasets; prototype and benchmark forecasting models; monitor drift and align output with research intent; conduct code reviews and coach analysts on best practices. Data Governance: automate quality checks and anomaly alerts for high-frequency updates; coordinate with Data Ops to ensure secure, traceable data lineage. To be successful in this role, we're seeking the following technical skills: SQL / SQL Server, Python, Oracle, ETL orchestration; experience with AI/ML is a plus. High school/secondary school or the equivalent combination of education and experience is required.
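For illustration only, and not BNY's actual toolkit code, the sketch below shows one way the "automate quality checks and anomaly alerts for high-frequency updates" duty could look in Python: a rolling z-score flag over a time series. The window size, threshold, column names, and synthetic data are assumptions.

```python
# Illustrative sketch only: a rolling z-score check of the kind a shared data
# toolkit might run on a high-frequency time series. Column names and the
# threshold are assumptions, not part of the job description.
import numpy as np
import pandas as pd

def flag_anomalies(series: pd.Series, window: int = 60, threshold: float = 4.0) -> pd.DataFrame:
    """Flag points whose deviation from a rolling mean exceeds `threshold` sigmas."""
    rolling_mean = series.rolling(window, min_periods=window // 2).mean()
    rolling_std = series.rolling(window, min_periods=window // 2).std()
    zscore = (series - rolling_mean) / rolling_std
    out = pd.DataFrame({"value": series, "zscore": zscore})
    out["is_anomaly"] = out["zscore"].abs() > threshold
    return out

if __name__ == "__main__":
    # Hypothetical minute-level indicator values (random walk) for demonstration.
    idx = pd.date_range("2024-01-01", periods=1_000, freq="min")
    values = pd.Series(np.random.default_rng(0).normal(0, 1, len(idx)).cumsum(), index=idx)
    values.iloc[500] += 50  # inject an obvious spike
    report = flag_anomalies(values)
    print(report[report["is_anomaly"]])
```

A toolkit of the kind described would typically run such a check during ingestion and route flagged points to an alerting channel.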

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Roles and Responsibilities: - Create and maintain efficient and scalable data models as per business needs; - Create and maintain optimal data pipelines against multiple data sources such as SQL and Big Data on Azure / AWS cloud; - Assemble and process large, complex data sets to meet both functional and non-functional business requirements; - Analyze and improve existing data models, pipelines, related infrastructure and processes; - Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics; - Monitor and improve data quality and data governance policies; - Collaborate with stakeholders including the executive, product, data and design teams to assist with data-related technical issues and support their data infrastructure needs; - Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader; - Work with data and analytics experts to strive for greater functionality in our data systems. Must Have Skills: - 5+ years of experience working with distributed data technologies (e.g., Hadoop, MapReduce, Spark, Kafka, Flink) for building efficient, large-scale 'big data' pipelines; - Strong software engineering experience with proficiency in at least one of the following programming languages: Java, Scala, Python or equivalent; - Ability to implement data ingestion pipelines, both real-time and batch, using best practices; - Experience building stream-processing applications using Apache Flink, Kafka Streams or others; - Experience with cloud computing platforms like Azure, Amazon AWS, Google Cloud etc.; - Experience supporting and working with cross-functional teams in a dynamic environment; - Experience with relational SQL and NoSQL databases, including Postgres and Cassandra; - Experience with the ELK stack; - Ability to work in a Linux environment. Nice to Have: - Experience in building distributed, high-volume data services; - Experience with the big data processing and analytics stack in AWS: EMR, S3, EC2, Athena, Kinesis, Lambda, QuickSight etc.; - Knowledge of data science tools and their integration with data lakes; - Experience with container technologies like Docker/Kubernetes. Qualification: - Bachelor of Science in Computer Science or equivalent technical training and professional experience.
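As a rough, hedged sketch of the batch-pipeline pattern this posting describes (Spark-based ingestion, cleansing, and aggregation into an analytics-ready table), consider the PySpark fragment below. The bucket paths, column names, and schema are hypothetical placeholders, not details from the role.

```python
# Minimal PySpark sketch of a batch pipeline: ingest raw events, apply basic
# cleansing, and write a partitioned table. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-batch-pipeline").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3a://example-bucket/raw/events/")          # assumed source location
)

clean = (
    raw.dropDuplicates(["event_id"])                  # basic de-duplication
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_ts").isNotNull())            # drop malformed rows
)

daily_counts = (
    clean.groupBy(F.to_date("event_ts").alias("event_date"), "customer_id")
    .agg(F.count("*").alias("events"))
)

daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/daily_counts/"      # assumed target location
)
spark.stop()
```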

Posted 1 week ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Hyderabad

Work from Office

- Define ownership and accountability of our critical data assets to ensure they are effectively managed and maintain integrity throughout PepsiCo's systems - Leverage data as a strategic enterprise asset enabling data-based decision analytics - Improve productivity and efficiency of daily business operations. Position Overview: The Customer Data Steward IBP role is responsible for working within the global data governance team and with their local businesses to maintain alignment to the Enterprise Data Governance (EDG) processes, rules and standards set to ensure data is fit for purpose. Responsibilities - Primary Accountabilities: - Deliver key elements of Data Discovery, Source Identification, Data Quality Management and cataloging for program & Customer Domain data. - Ensure data accuracy and adherence to PepsiCo-defined global governance practices, as well as driving acceptance of PepsiCo's enterprise data standards and policies across the various business segments. - Maintain and advise relevant stakeholders on data governance-related matters in the customer domain and with respect to Demand Planning in IBP, with a focus on the business use of the data. - Define Data Quality Rules from source systems, within the Enterprise Data Foundation and through to the end-user systems, to enable end-to-end Data Quality management and deliver a seamless user experience. - Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated and managed to ensure adherence with the established Enterprise Data Governance standards. - Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices. Data Governance Business Standards: - Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models. - Champions the single set of Enterprise-level data standards & the repository of key elements pertaining to their in-scope data domain (e.g., Customer, Material, Vendor, Finance, Consumer), promoting their use throughout the PepsiCo organization. - Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated and managed to ensure adherence with the established Enterprise Data Governance standards. Data Domain Coordination and Collaboration: - Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements. - Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains. - Accountable for driving organizational acceptance of EDG-established data standards, policies, definitions and process standards for critical / related enterprise data. - Promotes and champions PepsiCo's Enterprise Data Governance Capability and data management program across the organization. Qualifications: 8+ years of experience working in Customer Operations, Demand Planning, Order to Cash, Commercial Data Governance or Data Management within a global CPG.
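The accountabilities above include defining Data Quality Rules from source systems through to end-user systems. As a hedged, minimal illustration (not PepsiCo's actual framework), such rules can be expressed as named predicates evaluated against each record; the rule set, field names, and sample record below are assumptions.

```python
# Illustrative sketch of rule-based data-quality checks for customer master
# records. Rules, fields, and the sample record are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class DataQualityRule:
    name: str
    check: Callable[[dict], bool]  # returns True when the record passes

RULES = [
    DataQualityRule("customer_id_present", lambda r: bool(r.get("customer_id"))),
    DataQualityRule("country_is_iso2", lambda r: len(r.get("country", "")) == 2),
    DataQualityRule("payment_terms_known", lambda r: r.get("payment_terms") in {"NET30", "NET60", "NET90"}),
]

def evaluate(record: dict) -> list[str]:
    """Return the names of the rules this record fails."""
    return [rule.name for rule in RULES if not rule.check(record)]

if __name__ == "__main__":
    sample = {"customer_id": "C001", "country": "IND", "payment_terms": "NET45"}
    print(evaluate(sample))  # -> ['country_is_iso2', 'payment_terms_known']
```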

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 15 Lacs

Hyderabad

Work from Office

The domain Data Steward role is responsible for working within the global data governance team and with their local businesses to maintain alignment to the Enterprise Data Governance (EDG) processes, rules and standards set to ensure data is fit for purpose. This will be achieved through the EDG Data Steward operating as the single point of contact for those creating and consuming data within their respective data domain(s). Additionally, they will be driving the team to interact directly with key domain and project stakeholders, the EDG Lead, the Governance Council, other data stewards across the organization and relevant SMEs throughout the organization as necessary. This position collaborates with and advises PepsiCo's Governance Council, which is accountable for the success of PepsiCo's EDG program. Responsibilities - Primary Accountabilities: Partner closely with the PepsiCo Financial Planning & Analysis (FP&A) team to ensure data requirements are met to enable timely, accurate and insightful reporting and analysis in support of FP&A digitization initiatives. Promote data accuracy and adherence to PepsiCo-defined global governance practices, as well as driving acceptance of PepsiCo's enterprise data standards and policies across the various business segments. Maintain and advise relevant stakeholders on data governance-related matters in the relevant data domains with a focus on the business use of the data. Monitor operational incidents, support root cause analysis and, based on the recurrence, propose ways to optimize the Data Governance framework (processes, Data Quality Rules, etc.). Provide recommendations and supporting documentation for new or proposed data standards, business rules and policy (in conjunction with the Governance Council as appropriate). Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence with the established Enterprise Data Governance standards. Represent market-specific needs in Sector data councils and above, ensuring local user needs are heard/met/addressed; voice opinions on why proposals will or will not work for the market you represent and provide alternative solutions. Coordinate across the Sector (with fellow Market Data Stewards and the EDG Steward; strategic initiatives, Digital Use Cases and the federated data network) in order to maintain consistency of PepsiCo's critical enterprise, digital, operational and analytical data. Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices. Data Governance Business Standards: Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models. Champions the single set of Enterprise-level data standards & the repository of key elements pertaining to the finance domain, promoting their use throughout the PepsiCo organization. Owns one or multiple domain perspectives in defining and continually evolving the roadmap for enterprise data governance based upon strategic business objectives, existing capabilities/programs, cultural considerations and a general understanding of emerging technologies and governance models/techniques.
Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence with the established Enterprise Data Governance standards. Data Domain Coordination and Collaboration: Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements. Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains. Accountable for driving organizational acceptance of EDG-established data standards, policies, definitions and process standards for critical / related enterprise data. Promotes and champions PepsiCo's Enterprise Data Governance Capability and data management program across the organization. Qualifications: 5+ years of experience working in Data Governance or Data Management within a global CPG (Consumer Packaged Goods) company; strong data management background with an understanding of data, how to ingest data, proper data use/consumption, data quality, and stewardship. 7+ years of experience working with data across multiple domains (with a particular focus on Finance data), associated processes, involved systems and data usage. Minimum of 5+ years of functional experience working with and designing standards for data cataloging processes and tools. Ability to partner with both business and technical subject matter experts to ensure standardization of operational information and that enterprise-wide data governance policies and procedures are defined and implemented. Matrix management skills and business acumen. Competencies: Strong knowledge and understanding of master data elements and processes related to data across multiple domains. Understanding of the operational usage of transactional data as it relates to financial planning. Strong communication skills; able to persuade and influence others at all organization levels, with the ability to foster lasting partnerships. Ability to translate business requirements into critical data dependencies and requirements. Ability to think beyond the current state (processes, roles and tools) and work towards an unconstrained, optimized design. An ability to solicit followership from the functional teams to think beyond the way things work today. Able to align various stakeholders to a common set of standards and the ability to sell the benefits of the EDG program. Foster lasting relationships across varying organizational levels and business segments, with the maturity to interface with all levels of management. Self-starter who welcomes responsibility, along with the ability to thrive in an evolving organization and an ability to bring structure to unstructured situations. Ability to arbitrate on difficult decisions and drive consensus through a diplomatic approach. Matrix management skills and business acumen. Excellent written & verbal communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

22 - 25 Lacs

Hyderabad

Work from Office

This role will assist the AMESA Supplier Quality Assurance team by conducting annual re-approvals for low-risk designated suppliers located in AMESA that supply ingredients to PepsiCo. This role will contact suppliers, request and review returned pre-requisite data and documents, and re-approve the supplier site (3-year re-approval cycle). The role will update the required database(s) to support the re-approval. The expectation is ~200 supplier sites per year. The role will also support ancillary programs in AMESA SQA, such as gathering program evidence from suppliers and brokers related to Food Fraud, Quality and Food Safety Certifications and Contaminants programs, as well as updating the Master Supplier Ingredient database when new ingredients are added to existing supplier sites or when ingredients or supplier sites are obsoleted. In addition, the role may be required to create short PowerPoint reviews of supplier non-conformances from specified material groups on a quarterly basis (~5) to support the AMESA SQA agenda. There will be an ongoing requirement to cleanse and update supplier/ingredient data in the database/portals with information supplied by suppliers, GP or Pep manufacturing facilities. Responsibilities - Functional Responsibilities: Utilize the SQA Trackwise database to determine supplier site re-approvals required over a defined time (12 months); the SQA AMESA team can construct a list/query to enable this, and specific records can be assigned to a named person. Using a pre-determined script by material class/sub-class, and/or a specific questionnaire, send a request by email to the supplier for the documentation and certifications needed to support the re-approval. The pre-determined documents are listed in the script and in the body of the specific questionnaire. The process can benefit from a brief Zoom call with the supplier to re-educate them on the request; this has proven to help receive all the documentation at once and avoids reconnecting / following up on missing documentation. Typical documents requested are quality and food safety certifications (GFSI/ISO9001/others), 3rd-party audit reports, process flow diagrams, HACCP plans, specific contaminants data, pathogen sensitive ingredient program, etc. Because the re-approval frequency for low-risk suppliers is approximately 3 years, it is likely that supplier contacts may have changed, so it may be necessary at times to find a new contact by connecting with a GP or Pep site representative to determine who buys the material and ascertain a new contact. Review all documents received from the supplier site, create an audit record in the Trackwise database, and complete the requisite record fields based on the information supplied; a check of the FDA database or horizon-scanning database is also required. Attach all the received supplier site documents and the questionnaire, assign a status and next audit date, and complete the audit, verifying that materials are correct. The person will also update progress metrics monthly showing the number of sites completed / outstanding / overdue / on plan for the balance of the year; this can be a simple dashboard. Maintain data governance of the information supplied by the R&D / specifications teams to update all ingredient records in the Global Trackwise Database (additions/changes ~5-10 per week), cross-checking the database and, if needed, the specifications.
In addition, continue to cleanse Supplier-Ingredient Master data in Trackwise to ensure records are in the correct status and are compliant, as required, e.g., update any supplier site name changes or contact name changes; these are ad hoc and not frequent but important to update. Add GFSI certificates to the supplier audit record when received; upload certificates to the TW site audit record from the Portal. Support the Supplier Performance Management agenda (primarily supplier non-conformance incidents registered by plants in a database); the task is primarily the preparation of short presentations for cross-functional meetings held quarterly, targeting ~5 material category metrics. Help prepare data for Europe sector manufacturing sites' GFSI audits; the request is generally to check supplier data and documents received before an audit to ensure it is all present and correct, or to collect data during the audit if issues of conformance are found. This may be once or twice per month. Support ancillary programs in SQA such as sending food fraud questionnaires to suppliers and reviewing them, requests for contaminants data, creating ad-hoc requests to suppliers resulting from an incident or from regulatory affairs requests, and sending and reviewing/aligning Pathogen Sensitive Ingredient Program (PSIP) questionnaires as part of a formal AMESA PSIP program. Some of these requests may be automated in the medium term but will still require governance to check compliance and close out requests / follow up on outstanding information yet to be furnished. Qualifications: Bachelor's degree in a science discipline (e.g., Food Science, Food Engineering, Food Chemistry, Microbiology, Biochemical Engineering or Chemical Engineering) or equivalent relevant experience required. Lead Auditor training is a requirement, and the person will need to complete training either in advance or shortly after joining the team. The SQA Europe team can act as a proxy to quickly review and approve the records in the interim, and Europe SQA can assist in organizing the training as applicable. 1-2 years of experience in Regulatory, QA/QC, or another food-related FMCG area in a technical role. Ability to interpret technical information, escalate issues and seek alignment towards workable solutions. Proficient with the Microsoft Office Suite. Ability to rapidly learn computer applications/programs, interrogate databases such as specification systems/SAP, and navigate systems. Basic understanding of Food Safety, Microbiology and Regulatory Affairs. Collaborative skills and strong interest in working with others across time zones; ability to build relationships and work closely with both internal and external partners. Good communication skills, oral and written (e.g., communication on the telephone, external and internal to PepsiCo, managing with other support group functions, etc.). Ability to manage time effectively across multiple priorities and projects, requiring a high degree of organizational and communication skill to ensure requests are delivered in a timely manner. Self-motivated and demonstrated ability to take initiative on projects. Fluent in English (HBS).

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 19 Lacs

Hyderabad

Work from Office

Overview: The Master Data Workflow Specialist will be a key contributor to the functional designs, configuration, and implementation of SAP workflow processes to manage SAP master data. This specialist, working with the S4 Functional Leads, will be responsible for remediating the data maintenance tools (workflow solutions, LSMW, Winshuttle, etc.) to ensure a seamless transition from ECC to S4. Responsibilities: This SAP data maintenance tool expert is responsible for remediating the existing tools to work seamlessly in S4. Lead delivery/remediation of SAP workflow solutions for Business Partners. Establish design patterns for existing data maintenance tools that enable reuse across multiple markets. Consult with architecture resources, the data conversion team, process teams and the governance team to redesign the data tools. Partner with the data capability delivery teams (i.e. Conversion Readiness, Master Data Governance) to ensure that data design changes are incorporated. Ability to understand complex functional and IT requirements, and to identify and offer multiple solution options to facilitate the best outcome. Ability to quickly adapt to changes in timelines and sequences, deal with ambiguity, and succeed in a high-pressure environment. Qualifications: 8+ years of functional design, delivery, and sustainment of SAP ERP (ECC) Business Workflow solutions with a focus on master data - Material, Customer, Vendor and/or Finance. Experience delivering global workflow solutions across multiple PepsiCo businesses and geographies. Deep functional and technical experience architecting, designing and delivering complex, re-usable business process workflow solutions, with particular emphasis on employing flexible, configured solutions. Experience with SAP workflow/BRF integration and analyst-coded business rules.

Posted 1 week ago

Apply

2.0 - 7.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Azure Data Engineer - Position Overview: We are seeking a talented Azure Data Engineer to join our dynamic team. This role involves designing, building, and managing data solutions using the Azure ecosystem. The ideal candidate will work on data integration, transformation, and visualization while ensuring high-quality, secure, and scalable data pipelines. Responsibilities: Design, develop, and maintain data pipelines using Azure Data Factory. Manage and optimize data storage using Azure Data Lake Gen 2. Build and process large-scale data solutions with Azure Databricks and Apache Spark. Create interactive reports and dashboards in Power BI for business insights. Collaborate with cross-functional IT and business teams to translate requirements into data solutions. Ensure data quality, security, and compliance in line with organizational standards. Core Skills Required: Azure Data Factory: experience in building and managing data pipelines. Azure Data Lake Gen 2: proficiency in data storage and management within Azure's data lake environment. Azure Databricks / Apache Spark: hands-on skill with distributed data processing, transformations, and analytics. Power BI: expertise in data visualization and reporting. Nice-to-Have Skills: Basic SQL performance tuning: ability to write and optimize SQL queries. Data Governance & Unity Catalog: understanding of data governance principles and experience with Unity Catalog for data management. Certification: Microsoft DP-203 (Azure Data Engineer Associate). CI/CD pipelines: experience implementing CI/CD pipelines for Azure Data Factory or Databricks projects. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). 2+ years of professional experience with Azure data services. Strong analytical and problem-solving skills. Effective communication and team collaboration abilities. Preferred Traits: Continuous learner, staying updated with Azure and data engineering advancements. Experience working in Agile environments. Passion for optimizing data workflows and enabling data-driven decisions. If you are enthusiastic about working with advanced Azure data services and enabling impactful analytics, we encourage you to apply!
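For illustration only, here is a minimal Databricks-style PySpark sketch of the pipeline pattern named above: read raw files from Azure Data Lake Storage Gen2, standardize a few columns, and write a Delta table. The storage account, container names, paths, and columns are assumptions rather than details from the posting.

```python
# Hedged illustration of a Databricks-style transformation: read raw JSON from
# ADLS Gen2, standardize columns, and write a Delta table. All names assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-to-delta-example").getOrCreate()

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"                 # assumed
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/sales_clean/"   # assumed

sales = spark.read.format("json").load(raw_path)

sales_clean = (
    sales.withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropna(subset=["order_id", "order_date"])
)

# Delta Lake is available by default on Databricks clusters.
sales_clean.write.format("delta").mode("overwrite").save(curated_path)
```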

Posted 1 week ago

Apply

10.0 - 14.0 years

32 - 40 Lacs

Pune

Work from Office

MISSION OF THE POSITION: The Lead Workday Administrator will be responsible for the implementations, configurations, maintenance, and support related to the Workday human capital management (HCM) system and other systems in the HR space. This includes the design and development of business processes, security administration, integrations with other systems, data management, reporting and analytics, and end-user support. The Lead Workday Administrator will also be responsible for relationship management with internal HR customers, coordination and management of work between members of the HR Systems team, and training of new and existing HR Systems resources. RESPONSIBILITIES: Strategic Implementation & Configuration: Lead the deployment and customization of Workday HCM modules, aligning technology solutions with organizational objectives and best practices. Enterprise Integrations: Architect and maintain robust integrations between Workday and critical systems (e.g., payroll, benefits, LMS), ensuring data integrity and system interoperability. Data Governance: Oversee comprehensive audits and validation protocols to maintain high standards for data accuracy, consistency, and compliance. Security & Compliance: Define and administer complex security structures, ensuring appropriate access controls and regulatory alignment. Business Intelligence & Reporting: Design and deliver advanced Workday reports and dashboards that support strategic decisions and organizational visibility. Continuous Improvement: Monitor Workday feature releases, assess potential impact, and implement enhancements that drive efficiency and user satisfaction. Cross-Functional Leadership: Collaborate closely with HR, IT, and business leaders to streamline processes, enhance system performance, and align Workday capabilities with evolving business strategies. Team Up-skilling: Train new and existing HR Systems team members in internal processes and technical skills. REQUIREMENTS: Bachelor's degree in Information Technology, Human Resources, or a related field, or equivalent practical experience. Over 6 years of progressive experience administering, configuring, and optimizing Workday systems, with a strong emphasis on scalable solutions and continuous improvement. Advanced expertise across multiple Workday modules, including HCM, Benefits, Compensation, Payroll, and Integrations. Proven track record of developing and maintaining complex Workday integrations to support enterprise-level operations. Deep understanding of HR and payroll business processes, compliance standards, and regulatory frameworks. Workday certifications strongly preferred, reflecting a commitment to industry best practices and professional development. Summary: We are looking for a Lead Workday Administrator with strong expertise in Workday Core HCM and modules including Compensation, Benefits, and Payroll. The ideal candidate will be responsible for configuring and maintaining the Workday platform, as well as developing and supporting complex integrations using Workday Studio, EIBs, and APIs to meet enterprise-wide operational needs. This role requires hands-on experience with Workday configuration, cross-functional collaboration, and delivering scalable, compliant solutions aligned with business goals.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune

Work from Office

MISSION OF THE POSITION: The Lead Workday Administrator will be responsible for the implementations, configurations, maintenance, and support related to the Workday human capital management (HCM) system and other systems in the HR space. This includes the design and development of business processes, security administration, integrations with other systems, data management, reporting and analytics, and end-user support. The Lead Workday Administrator will also be responsible for relationship management with internal HR customers, coordination and management of work between members of the HR Systems team, and training of new and existing HR Systems resources. RESPONSIBILITIES: Strategic Implementation & Configuration: Lead the deployment and customization of Workday HCM modules, aligning technology solutions with organizational objectives and best practices. Enterprise Integrations: Architect and maintain robust integrations between Workday and critical systems (e.g., payroll, benefits, LMS), ensuring data integrity and system interoperability. Data Governance: Oversee comprehensive audits and validation protocols to maintain high standards for data accuracy, consistency, and compliance. Security & Compliance: Define and administer complex security structures, ensuring appropriate access controls and regulatory alignment. Business Intelligence & Reporting: Design and deliver advanced Workday reports and dashboards that support strategic decisions and organizational visibility. Continuous Improvement: Monitor Workday feature releases, assess potential impact, and implement enhancements that drive efficiency and user satisfaction. Cross-Functional Leadership: Collaborate closely with HR, IT, and business leaders to streamline processes, enhance system performance, and align Workday capabilities with evolving business strategies. Team Up-skilling: Train new and existing HR Systems team members in internal processes and technical skills. REQUIREMENTS: Bachelor's degree in Information Technology, Human Resources, or a related field, or equivalent practical experience. Over 6 years of progressive experience administering, configuring, and optimizing Workday systems, with a strong emphasis on scalable solutions and continuous improvement. Advanced expertise across multiple Workday modules, including HCM, Benefits, Compensation, Payroll, and Integrations. Proven track record of developing and maintaining complex Workday integrations to support enterprise-level operations. Deep understanding of HR and payroll business processes, compliance standards, and regulatory frameworks. Workday certifications strongly preferred, reflecting a commitment to industry best practices and professional development. Summary: We are looking for a Lead Workday Administrator with strong expertise in Workday Core HCM and modules including Compensation, Benefits, and Payroll. The ideal candidate will be responsible for configuring and maintaining the Workday platform, as well as developing and supporting complex integrations using Workday Studio, EIBs, and APIs to meet enterprise-wide operational needs. This role requires hands-on experience with Workday configuration, cross-functional collaboration, and delivering scalable, compliant solutions aligned with business goals.

Posted 1 week ago

Apply

15.0 - 20.0 years

22 - 27 Lacs

Bengaluru

Work from Office

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry. MAKE AN IMPACT: Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage. Job Summary: As a Delivery Head at Capco you will be working with Data teams to develop data solutions for the world's largest financial services firms. Roles & Responsibilities: Lead the Data & Analytics portfolio/programs in the areas of Data Engineering, Data Management/Governance and Advanced Analytics. Build solutions, accelerators, and capabilities in line with the dynamic technology landscape and advancements. Collaborate with global practices in creating & building business use cases and go-to-market offerings. Work with account leadership in aligning Data & Analytics best practices and standards across delivery programs. Bring thought leadership and innovative ideas to build trust and relationships with internal and external stakeholders. Hire, train, mentor, motivate and lead a pool consisting of data engineers, data analysts, data architects, BI consultants and AI engineers. Lead pre-sales efforts by owning the solutions and technology landscape required for RFI/RFP responses. Skills & Experience: 15+ years of progressive experience in the Data & Analytics space with a focus on Data Engineering, Data Management/Governance or Advanced Analytics (including GenAI). Well-versed with Digital/SMACI integration points and their applicability within the Data & Analytics technology/platform/tool stack. Experienced consultant who has defined strategy, created roadmaps, developed PoCs/pilots, built architecture designs and built assessment frameworks for diverse data-driven business use cases on both legacy and modern data platforms. Hands-on experience building enterprise-level solutions with varied combinations and patterns of data sources and storage on legacy as well as modern data platforms (including cloud). Should have a good understanding of the AI lifecycle and related advancements, including GenAI. Good to have BI & visualization experience. Should be able to navigate through Data Management and Data Governance requirements. Should have supported large sales pursuits and presales activities. Should be able to manage ambiguous situations with a structured approach to evaluate risks/issues and related mitigation plans. Excellent stakeholder management and organizational skills. If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Posted 1 week ago

Apply

8.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

Role: Master Data Team Lead (Projects), reporting to the Global MDM Manager. Technical Competencies: Should have hands-on experience in master data creation and maintenance (Material/Vendor/Pricing/Customer/PIRs/Source List/BOM data, etc.). Hands-on experience with SAP toolsets in the data space, including data extraction programs from SAP, SQVIs, ETL processes, load programs, LSMW, LTMC, BODS, and data quality maintenance and cleansing. Knowledge of request management tools, e.g., SNOW, Remedy. Knowledge of key database concepts, data models, and relationships between different types of data. An understanding of the end-to-end set-up and business impact of master data key fields. Knowledge of SAP, S/4 HANA, SAP-MDG, Ariba, SFDC, MW (Informatica, etc.) or additional ERP platforms, IT tools and technologies is desirable. Experience in data management processes (data profiling & cleansing, workflows, data quality, governance processes, relationships & dependencies with IT teams, etc.), or functional knowledge in SAP MM/PP or OTC modules, will be an added advantage. Should have prior experience of handling a team. Primary Responsibilities / Role Expectations: As an MDM Team Lead, the role would involve: Getting adept with the MDM process and gaining knowledge by initially working on daily business requests for master data objects - creation/update/obsolete/reactivate. Responsible for holding key business stakeholder interactions for feedback and business requirements, and to maintain data governance and data quality. Testing master data creations/updates across tools and interfaces. Participation in key projects - S/4 HANA global implementation support as a functional master data management expert. Work with the project team on definitions, design, build and implementation of in-scope technology solutions. Support MDM 360/MDG workflow design and deployment: ensure integrity of the solution, user acceptance testing, end-user tools and documentation, training, and support. Multiple stakeholder interaction and leading a team to meet the above deliverables. Keep the in-house team engaged and updated on MDM projects (S4, MDG, MDM360, etc.) and key activities (Data Quality, etc.). Get the team and the allocated region ready for future additional tasks/requirements/projects as and when needed. Responsible for maintaining data governance and data quality, and for data cleansing activities. Mentoring the team members on topics of expertise. Strong ownership focus, drive to excel and deliver. Flexibility to work in shifts. Other Skills & Experience: Professional experience of ~8-10 years. Good communication skills, stakeholder alignment, and experience of interaction with end clients/international colleagues across geographies. Ability to resolve conflicts, share, collaborate and work as a leader for the allocated team.
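As a hedged illustration of the data profiling and cleansing work mentioned above (not an SAP tool, and with a hypothetical extract file and column names), a simple pandas check over a vendor master extract might look like this:

```python
# Illustrative pandas sketch of a master-data quality profile (duplicates and
# missing mandatory fields). File name, columns, and rules are hypothetical.
import pandas as pd

MANDATORY_FIELDS = ["vendor_id", "name", "country", "payment_terms"]  # assumed rule set

def profile_vendor_master(path: str) -> dict:
    df = pd.read_csv(path, dtype=str)
    return {
        "rows": len(df),
        "duplicate_vendor_ids": int(df["vendor_id"].duplicated().sum()),
        "missing_by_field": {
            col: int(df[col].isna().sum()) for col in MANDATORY_FIELDS if col in df.columns
        },
    }

if __name__ == "__main__":
    print(profile_vendor_master("vendor_master_extract.csv"))  # hypothetical file
```

In practice such a profile would feed the data quality and cleansing activities listed in the responsibilities, with remediation handled through the SAP load tools named above.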

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Hyderabad

Work from Office

We are looking for a highly skilled and experienced BI Engineer to join our team at ABC Fitness Solutions, LLC. The ideal candidate will have 6-10 years of experience in the field. Roles and Responsibilities: Design and develop data visualizations and reports using various tools and technologies. Collaborate with cross-functional teams to identify business requirements and develop solutions. Develop and maintain databases and data systems to support business intelligence initiatives. Analyze complex data sets to identify trends and patterns, and provide insights to stakeholders. Develop and implement data governance policies and procedures to ensure data quality and integrity. Provide technical guidance and support to junior team members on BI tools and technologies. Job Requirements: Strong understanding of software services and IT industry trends. Experience with payment operations and financial analysis is desirable. Proficiency in BI tools such as Tableau, Power BI, or QlikView. Excellent communication and interpersonal skills, with the ability to work effectively with stakeholders at all levels. Strong analytical and problem-solving skills, with attention to detail and the ability to meet deadlines. Ability to work in a fast-paced environment and adapt to changing priorities and deadlines. Educational qualifications: B.Tech/B.E., MCA, MBA/PGDM, PG Diploma.

Posted 1 week ago

Apply

5.0 - 7.0 years

8 - 12 Lacs

Chennai

Work from Office

This role is for a Business Analyst focused on designing and implementing a Data Governance strategy and Master Data Management (MDM) framework. The role will support the high-level design and detailed design phases of a transformative project involving systems such as 3DS PLM, SAP, Team Centre, and Blue Yonder. The ideal candidate will bring a blend of business analysis expertise, data governance knowledge, and automotive/manufacturing domain experience to drive workshops, map processes, and deliver actionable recommendations. Working closely with the GM of Master Data and MDM technical resources, you will play a pivotal role in aligning people, processes, and technology to achieve M&M's data governance and MDM objectives. Key Responsibilities: - Requirements Gathering & Workshops: Lead and facilitate workshops with business and IT stakeholders to elicit requirements, define data governance policies, and establish MDM strategies for automotive-specific data domains (e.g., parts, engineering data, bill of material, service parts, supplier and dealer master data). - Process Mapping & Design: Document and design master data-related processes, including data flows between systems such as 3DS, SAP, Talend, and Blue Yonder, ensuring alignment with business needs and technical feasibility. - Analysis & Recommendations: Analyse existing data structures, processes, and system integrations to identify gaps and opportunities; provide clear, actionable recommendations to support the Data Governance and MDM strategy. - Stakeholder Collaboration: Act as a bridge between business units, IT teams, and technical resources (e.g., 3DS specialists) to ensure cohesive delivery of the project objectives. - Documentation & Communication: Create high-quality deliverables, including process maps, requirement specifications, governance frameworks, and summary reports, tailored to both technical and non-technical audiences. - Support Detailed Design: Collaborate with the 3DS/Talend technical resource to translate high-level designs into detailed MDM solutions, ensuring consistency across people, process, and technology components. - Project Support: Assist the MDM leadership in planning, tracking, and executing project milestones, adapting to evolving client needs. Required Skills & Qualifications: - 5+ years of experience as a Business Analyst, with a focus on data governance and master data management (MDM) tools such as Talend, Informatica, Reltio, etc. - Proven track record of working on auto/manufacturing industry projects, ideally with exposure to systems like 3DS, Team Centre, SAP S/4HANA, MDG, or Blue Yonder. Technical Knowledge: - Strong understanding of MDM concepts, data flows, and governance frameworks. - Familiarity with auto-specific data domains (e.g., ECCMA/E-Class Schema). - Experience with process modelling tools (e.g., Visio, Lucidchart, or BPMN) and documentation standards. Soft Skills: - Exceptional communication and facilitation skills, with the ability to engage diverse stakeholders and drive consensus in workshops. - Methodical and structured approach to problem-solving and project delivery. - Ability to summarize complex information into clear, concise recommendations. - Education: Bachelor's degree in Business, Information Systems, or a related field (or equivalent experience). - Certifications: Relevant certifications (e.g., CBAP, PMP, or MDM-specific credentials) are a plus but not required. Preferred Qualifications: - Prior consulting experience in a client-facing role. - Hands-on experience with MDG, Talend, Informatica, Reltio, etc., or similar MDM platforms. - Exposure to data quality analysis or profiling (not required to be at a Data Analyst level).

Posted 1 week ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

bengaluru

Work from Office

We are looking for a Senior Analyst to join our Medtech Data team at Clarivate Bangalore. This is an amazing opportunity to work on market research (including both primary and secondary research), competitive analysis, clinical trial analysis, financial analysis, forecasting, and insight generation, to produce comprehensive market research reports, and to conduct custom consulting projects for medical devices and healthcare markets.

What will you be doing in this role?
As part of the Partnerships team, you will have a key role in ensuring that we maintain high standards of data governance and visibility on commercial ROI across commercially licensed and non-commercially licensed sources. You will work within a team of commercial negotiators and licensing analysts to design, build and maintain the core workflow that will enable the team. You will be the primary administrator of a SharePoint-based database that houses all information related to data partnerships and the related Power BI dashboards that communicate key governance requirements and commercial outcomes to management. You will interface with colleagues in other departments, notably our technology and data teams, to understand existing available data relating to the use of third-party assets and work with those teams to create a reliable, consistent workflow that i) captures requests to utilize third-party assets, ii) manages the approval process, and iii) logs all such requests. This role will touch all aspects of the Life Science and Health business, providing a rich opportunity to gain insights into the wider commercial business.

Three primary business objectives:
- Support a licensing manager who is constructing a global workflow to manage all aspects of our non-commercially licensed sources.
- Support a senior licensing manager who is managing commercial ROI, working with various business leaders in product management and consulting.
- Support the head of partnerships in rolling out a series of insightful dashboards that will inform business leadership on investment decisions.

About you - experience, education, skills:
- At least 5 years of rich experience working with both SharePoint and Power BI.
- Experience in the MS 365 platform, including related tools: SharePoint, Power Apps, Power Automate and Power BI.
- Graduate/Postgraduate in any discipline.
- Validated experience of successfully working with other business systems to ingest/export/transfer data to enable the creation of an efficient workflow, including interfaces with systems such as JIRA.
- Proven ability to work independently, take ownership, and solve problems relating to the data pipeline.

About the Team
The team consists of around 40 colleagues spread across teams in India, reporting to the respective team managers. We have a great skill set in the market research and consulting domain related to medical devices & healthcare markets, and we would love to speak with you if you have skills or interest in the same.

Hours of Work
Hybrid work mode, working days Monday to Friday, working hours 12:00 PM IST to 9:00 PM IST.
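As an illustration of the kind of workflow automation described above, the sketch below pulls items from a SharePoint list through the Microsoft Graph API so that data-partnership requests could be logged or fed into downstream Power BI reporting. It is a minimal sketch under stated assumptions: the site ID, list ID, and access token are hypothetical placeholders, and a production workflow would more likely be built in Power Automate or with the official Graph SDK.

```python
# Minimal sketch: read rows from a SharePoint list via Microsoft Graph.
# SITE_ID, LIST_ID and the bearer token are placeholders (assumptions),
# not values from the job description.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<your-sharepoint-site-id>"   # hypothetical
LIST_ID = "<your-list-id>"              # hypothetical list of licensing requests
TOKEN = "<access-token-from-azure-ad>"  # obtain via MSAL / an app registration

def fetch_list_items(site_id: str, list_id: str, token: str) -> list:
    """Return the 'fields' payload of every item in the list, following paging."""
    url = f"{GRAPH}/sites/{site_id}/lists/{list_id}/items?expand=fields"
    headers = {"Authorization": f"Bearer {token}"}
    items = []
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        items.extend(row["fields"] for row in payload.get("value", []))
        url = payload.get("@odata.nextLink")  # Graph returns this while more pages exist
    return items

if __name__ == "__main__":
    for item in fetch_list_items(SITE_ID, LIST_ID, TOKEN):
        # e.g. log each third-party data request for governance reporting
        print(item.get("Title"), item.get("ApprovalStatus"))
```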

Posted 1 week ago

Apply

4.0 - 7.0 years

5 - 9 Lacs

kochi, hyderabad, bengaluru

Work from Office

Summary: We are looking for an experienced Oracle EPM Functional Consultant with deep expertise in Enterprise Data Management (EDM) and Account Reconciliation Cloud Service (ARCS). The ideal candidate will play a key role in implementing, configuring, and supporting Oracle EPM Cloud solutions, ensuring alignment with business goals and compliance standards.

Key Responsibilities:
Implementation & Configuration:
- Strong experience in the implementation and configuration of Oracle EPM modules, especially EDMCS and ARCS.
- Design and manage metadata, hierarchies, and mappings in EDMCS, ensuring data accuracy, completeness, and consistency.
- Configure reconciliation formats, rules, and workflows in ARCS.
Integration & Support:
- Ensure seamless integration between EDMCS, ARCS, and other Oracle EPM modules (e.g., FCCS, EPBCS).
- Provide ongoing support, troubleshoot issues, and optimize system performance.
- Monitor KPIs and ensure continuous improvement.
Training & Documentation:
- Prepare training materials and deliver end-user training sessions.
- Maintain detailed documentation for configurations, processes, and user guides.

Qualifications:
- Bachelor's degree in Finance, Information Systems, or a related field.
- 4-7 years of experience in Oracle EPM Cloud, including EDMCS and ARCS.
- Proven track record of at least two end-to-end Oracle EPM Cloud implementations.
- Strong understanding of financial close, reconciliation, and data governance processes.
- Proficiency with EPM Automate and integration tools.
- Excellent communication and stakeholder management skills.

Preferred Skills:
- Experience with other Oracle EPM modules like FCCS, EPBCS, PCMCS.
- Familiarity with Agile or Waterfall methodologies.
- Oracle certifications in EPM Cloud modules.

Location: Bengaluru, Kochi, Hyderabad, Chennai

Posted 1 week ago

Apply

9.0 - 14.0 years

35 - 40 Lacs

pune

Work from Office

Job Description: Job Title: Senior Python & GCP Data Engineering SME
Location: Pune, India
Corporate Title: AVP

Role Description
Deutsche Bank has set for itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation and Corporate Sustainability. As climate change throws up new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skillset, predominantly in cloud hybrid architecture. As part of this role, we are seeking a highly experienced GCP Data & BI Subject Matter Expert (SME) to join our growing team. In this senior role, you will be a trusted advisor, providing technical expertise and strategic direction across all things data and BI on GCP.

Your key responsibilities
Technical Expertise
- In-depth knowledge of GCP data services (BigQuery, Cloud Storage, Composer, Pub/Sub, Cloud Run, Cloud Logging, etc.).
- In-depth knowledge of Python coding for data engineering and experience building APIs.
- In-depth knowledge of databases (such as Postgres and BigQuery) and complex queries/procedures.
- Design and optimize complex data pipelines for efficient data ingestion, transformation, and analysis.
- Partner with the product management group and other business stakeholders to gather requirements, translate them into technical specifications, and design effective end-to-end data solutions.
- Design and develop complex data models, leveraging expertise in relational and dimensional modeling techniques.
- Advocate for best practices in data governance, security, and compliance on GCP.
Collaboration & Mentorship
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and drive data-driven decision-making.
- Mentor and guide junior team members on GCP technologies and BI best practices.
- Foster a culture of innovation and continuous improvement within the data and BI domain.
Staying Current
- Track emerging trends and innovations in GCP, BI tools, and data analytics methodologies.
- Proactively research and recommend new technologies and solutions to enhance our data and BI capabilities.

Your skills and experience
- 9+ years of experience in data warehousing, data management, and business intelligence.
- Proven expertise in Google Cloud Platform (GCP) and its data services (BigQuery, Cloud Storage, Composer, Pub/Sub, Cloud Run, Cloud Logging, IAM, etc.).
- Proven expertise in Python and SQL (data engineering and API build knowledge).
- Strong understanding of data governance, security, and compliance principles on GCP.
- Experience designing and implementing complex data pipelines.
- In-depth knowledge of relational and dimensional modeling techniques for BI.
- Experience with T-SQL, PL/SQL, or ANSI SQL.
- Excellent communication, collaboration, and problem-solving skills.
- Ability to translate technical concepts into clear, actionable insights for business stakeholders.
- Strong leadership presence and ability to influence and inspire others.
- Knowledge of Sustainable Finance, ESG Risk, and CSRD regulatory reporting will be a plus.
- Knowledge of cloud infrastructure and data governance best practices will be a plus.
- Knowledge of Terraform will be a plus.
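To illustrate the Python-plus-BigQuery skill set this role calls for, here is a minimal, hedged sketch of one pipeline step using the official google-cloud-bigquery client: it runs a parameterized query and appends the result to a destination table. The project, dataset, table and column names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of a BigQuery transformation/load step.
# Project, dataset, table and column names are hypothetical placeholders.
import datetime
from google.cloud import bigquery

def load_daily_esg_scores(project: str = "my-project",
                          run_date: datetime.date = datetime.date(2024, 1, 31)) -> None:
    client = bigquery.Client(project=project)

    # Parameterized query: avoids string interpolation of runtime values.
    sql = """
        SELECT counterparty_id,
               AVG(esg_score) AS avg_esg_score,
               @run_date AS as_of_date
        FROM `my-project.raw.esg_scores`
        WHERE DATE(ingest_ts) = @run_date
        GROUP BY counterparty_id
    """
    job_config = bigquery.QueryJobConfig(
        destination=bigquery.TableReference.from_string(
            f"{project}.curated.daily_esg_scores"
        ),
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", run_date)],
    )
    job = client.query(sql, job_config=job_config)
    job.result()  # block until the job finishes; raises on error

if __name__ == "__main__":
    load_daily_esg_scores()
```

In an orchestrated setup such as Composer/Airflow, a step like this would typically be wrapped in a task and scheduled per partition date rather than run ad hoc.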

Posted 1 week ago

Apply

1.0 - 6.0 years

10 - 12 Lacs

jaipur

Work from Office

Role & responsibilities:
- Experience in developing, implementing or architecting data privacy and information governance policies and frameworks.
- Great working knowledge of data protection and privacy laws and regulations (e.g. EU GDPR) and industry standards and frameworks, such as GAPP and BCR.
- Able to efficiently understand client organizations and their business models and to tailor relevant processes to data privacy requirements.
- Basic knowledge of project management tools and methodologies.
- Creative thinking ability and good analytical skills.
- Ability to keep confidential and sensitive information safe.
- Strong research and communication skills.
- Strong understanding of information security regulatory requirements and compliance issues.
- Good working knowledge of computers and common software packages, including analytical tools.
- Good verbal and business communication skills.
- Notable customer consulting and collaboration skills.
- Experience in service delivery, team handling and working with all levels of staff.

Preferred candidate profile:
- Identify existing privacy gaps at our clients' businesses which they may need to focus on.
- Create custom privacy programs to help clients with their business and provide implementation support.
- Conduct Privacy Impact Assessments for clients across the globe in diverse industries.
- Assist clients in understanding and dealing with new data privacy regulations such as GDPR, DPDPA, CPRA, etc.
- Help implement policies and procedures to help clients protect their data (e.g. PII, personal sensitive data, etc.).
- Create an incident response and forensic plan to tackle data breaches that could make clients susceptible to privacy regulation penalties.
- Wherever needed, render services on behalf of clients to help them with privacy regulations.
- From an information governance perspective, help clients identify and classify their information using various frameworks.
- Provide assistance in creating data maps and advise on the redundancy of data within their organizations.
- Define technical and business requirements for data privacy and information governance solutions.
- Define information security processes and policies which secure and enable the business.
- Certifications such as ISO 27701, CIPM, CIPT, CIPP/E, CISM, CISSP, and/or HCISSP, as well as involvement in industry-related organizations (e.g. IAPP, ISACA, (ISC)²), are added advantages.

Posted 1 week ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

gurugram

Work from Office

Project Role: Security Architect
Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations.
Must have skills: Security Data Privacy
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Security Architect, you will define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Your typical day will involve collaborating with various teams to assess security needs, documenting the implementation of cloud security controls, and transitioning to cloud security-managed operations. You will engage in discussions to refine security strategies and ensure compliance with industry standards, all while adapting to the evolving landscape of cloud technologies and security threats.

Roles & Responsibilities:
- Maintaining the integrity of data and processes in OneTrust or Securiti.ai.
- Hands-on support using OneTrust or Securiti.ai in data discovery, classification & data governance.
- Hands-on support using OneTrust or Securiti.ai in performing data security posture management.
- Supporting the team with OneTrust or Securiti.ai privacy assessments.
- Hands-on support using OneTrust or Securiti.ai for Policy & Notice Management and DPIA.
- Hands-on support using OneTrust or Securiti.ai for cookie compliance, including scanning and banners.
- Hands-on support using OneTrust or Securiti.ai for consent compliance and maintaining records of consent.
- Hands-on support using OneTrust or Securiti.ai for Data Subject Requests, automating request-to-fulfilment to meet regulatory deadlines.
- Hands-on data retention & deletion: manage and enforce retention policies and data deletion.
- Evaluating PIA/DPIA assessments for Risk Management, including vendors.

Professional and Technical Skills:
- 3-4 years of hands-on experience as a OneTrust or Securiti.ai administrator.
- 3 years of work experience with data privacy regulations such as GDPR, CCPA, DPDP (mandatory).
- 2 years of work experience in defining & managing DSAR, DPIA, Consent, Cookie, TPRM & RoPA lifecycles.
- 2 years of work experience in performing Data Discovery, Classification, Data Governance, Data Mapping & Cataloging, and Data Security Posture Management.
- Excellent communication skills in English, both written and verbal.
- OneTrust or Securiti.ai Certified Professional (required).

Additional Information:
- The candidate should have a minimum of 12 years of experience in Security Data Privacy.
- A 15-year full time education is required.
- This will be work from office on all 5 days, and the resource needs to work from the client location only.

Qualification: 15 years full time education

Posted 1 week ago

Apply

6.0 - 9.0 years

12 - 16 Lacs

bengaluru

Work from Office

Title: Data Architect - ISS
Department: ISS Enterprise Architecture
Location: Bangalore
Level: 6

About your team
Fidelity is investing in its Enterprise Architecture team with the purpose of serving the needs of Fidelity by enabling the Business, Technology and Data strategies to flourish and realise the targeted business opportunities and value. Overall, the function's aim is to:
- Unite Business, Technology and Data strategies into a technology roadmap
- Lead the transformation of Fidelity's technology estate
- Align innovation with strategy and business need
- Provide strategic thought-leadership for Fidelity
- Commission key strategy change and drive adoption
- Ensure efficient use of change investment, leading to maximum business value return
- Operate across technical, organisation and process challenges to drive design of enterprise solutions
- Instigate and direct Delivery's regulatory and compliance engagement, verifying success through governance checks
- Educate technology and business on solutions and technologies that are relevant for the strategies

About your role
This role is for a data architect who can understand, guide, and solve complex functional, technical and architectural issues around data. You must understand the implications associated with the chosen technical strategy, well aligned with the business context. You will interact with various levels of the business and technical community to build more context, explain the underlying problems, and present solutions. You will handle the logical and physical data architecture landscape and need to ensure that our data solutions are optimal and compliant with EOL policies, security policies, etc. You will act as a guardian of the data landscape across the data lifecycle, understand technical data lineage and data definitions, and embed data into ISS deliveries.

About you
The candidate should be an expert in data architecture concepts, in particular the data product model and data modelling, with a good understanding of metadata management, data governance, data risk, and data issue management. The role also involves intensive interaction with the business and other systems groups, so good communication skills and the ability to work under pressure are an absolute must.
- Experience in Application Architecture; should have played an Application Architect role in multiple projects
- Experience or understanding of Data Governance, Data Quality, Data Issue Management
- Experience or understanding of the Data Product model and the required supporting data capabilities
- Experience in data modelling techniques and creating various data models
- Experience in AWS cloud services and Snowflake and the associated architecture patterns
- Excellent communication (verbal and written) and interpersonal skills suitable for a diverse audience, with the ability to communicate in a positive, friendly and effective manner with technical or non-technical users/customers
- Ability to guide/mentor juniors in the team and review code artefacts
- The ability to convey complex, abstract technical concepts to technical and non-technical audiences
- Ability to work closely with cross-functional teams
- Ability to prioritise own activities and work under hard deadlines
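As a flavour of the hands-on side of such a role, the sketch below uses the Snowflake Python connector to create a tiny dimensional model and run a referential-integrity check of the kind a data architect might embed as a governance guardrail. It is only an illustrative sketch under stated assumptions: the connection parameters, schema, table and column names are hypothetical, not taken from the posting.

```python
# Illustrative sketch: a tiny dimensional model plus a referential-integrity check
# in Snowflake. Connection details and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",  # hypothetical
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",
    database="ISS_DEMO",
    schema="CURATED",
)

DDL = [
    """CREATE TABLE IF NOT EXISTS dim_security (
           security_id NUMBER PRIMARY KEY,
           isin        STRING,
           asset_class STRING
       )""",
    """CREATE TABLE IF NOT EXISTS fact_position (
           position_id NUMBER,
           security_id NUMBER,
           quantity    NUMBER(18,4),
           as_of_date  DATE
       )""",
]

# Snowflake declares but does not enforce PK/FK constraints, hence an explicit
# governance-style check: every fact row must reference a known security.
ORPHAN_CHECK = """
    SELECT COUNT(*)
    FROM fact_position f
    LEFT JOIN dim_security d ON f.security_id = d.security_id
    WHERE d.security_id IS NULL
"""

cur = conn.cursor()
try:
    for stmt in DDL:
        cur.execute(stmt)
    cur.execute(ORPHAN_CHECK)
    orphans = cur.fetchone()[0]
    if orphans:
        print(f"Data quality issue: {orphans} positions reference unknown securities")
finally:
    cur.close()
    conn.close()
```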

Posted 1 week ago

Apply

6.0 - 9.0 years

16 - 20 Lacs

hyderabad

Work from Office

Job Description Summary
As Technical Product Manager for our Data Products, you will join our GridOS Data Fabric product management team, who are delivering solutions designed to accelerate decarbonization by managing DERs at scale and proactively managing disruptions from climate change. Specifically, you will be accountable for managing technical product lifecycle activities around our core Data Products, in partnership with our own and partner development teams, to build trusted data products for our GridOS ADMS applications.

Roles and Responsibilities
- Technical product management, responsible for delivering Data Products in partnership with both GE Vernova and partner development teams. Includes all activities related to sprint planning, backlog grooming, testing and release management.
- Collaborate with data engineers, data scientists, analysts, and business stakeholders to prioritize product epics & features.
- Ensure data products are reliable, scalable, secure, and aligned with regulatory and compliance standards.
- Advocate for data governance, data quality, and metadata management as part of product development.
- Evangelize the use of data products across the organization to drive data you can trust to fuel AI/ML predictive workflows.
- Accountability for functional, business, and broad company objectives. Integrate and develop processes that meet business needs across the organization, be involved in long-term planning, manage complex issues within the functional area of expertise, and contribute to the overall business strategy.
- Develop specialized knowledge of the latest commercial developments in own area and the communication skills to influence others. Contribute towards strategy and policy development and ensure delivery within area of responsibility.
- In-depth knowledge of best practices and how own area integrates with others; working knowledge of the competition and the factors that differentiate them in the market.
- Bring the right balance of tactical momentum and strategic focus and alignment; use engineering team organization processes, like scrums and daily stand-ups, and do not shy away from explaining deep technical requirements.
- Use judgment to make decisions or solve moderately complex tasks or problems within projects, product lines, markets, sales processes, campaigns, or customers. Take a new perspective on existing solutions. Use technical experience and expertise for data analysis to support recommendations. Use multiple internal and limited external sources outside of own function to arrive at decisions.
- Act as a resource for colleagues with less experience. May lead small projects with moderate risks and resource requirements. Explain difficult or sensitive information; work to build consensus. Develop the persuasion skills required to influence others on topics within the field.

Required Qualifications
- Significant experience in Product Management as a Digital Product Manager. Knowledge level is comparable to a Master's degree from an accredited university or college.
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.

Desired Characteristics
- Strong oral and written communication skills.
- Strong interpersonal and leadership skills.
- Demonstrated ability to analyze and resolve problems.
- Demonstrated ability to lead programs and projects.
- Ability to document, plan, market, and execute programs.
- Strong understanding of data infrastructure, data modeling, ETL pipelines, APIs, and cloud technologies (e.g., AWS, Azure).
- Experience with iterative product development and program management techniques including Agile, SAFe, Scrum & DevOps.
- Familiarity with data privacy and security practices (e.g., GDPR, CCPA, HIPAA), and understanding of metadata, lineage, and data quality management.
- Knowledge and experience with electric utility industry practices.

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 45 Lacs

bengaluru

Work from Office

Job Purpose and Impact
This role will be responsible for creating and maintaining the Data Quality Framework and solution design/architecture, and for assisting the global data team and business users/data stewards in implementing a global or enterprise-level data quality solution, helping to improve data quality across the organization.

Key Accountabilities
- Accountable for Data Quality solution design, development, test, and operationalization across Cargill.
- Work closely with the DQ engineering team and provide technical support and mentorship wherever required.
- Understand functional/technical design, define best practices, and develop reusable & scalable DQ solutions.
- Partner with Global & Enterprise data teams and consult on data quality capabilities to define, implement and socialize data definitions, standards and policies.

Qualifications
- Minimum requirement of 10 years of relevant work experience. Typically reflects 12 years or more of relevant experience.
- IDMC - Cloud Data Quality (CDQ), Data Integration (CI), Data Profiling (CDP) - Mandatory
- IDMC - Data Governance & Catalogue (CDGC), Application Integration (CAI) - Mandatory
- Informatica Address Doctor / AD6 / DAAS - Mandatory
- IDMC - Metadata Scanning, CLAIRE - Preferred
- Informatica Data Quality (IDQ) - Preferred
- IDMC - Administration, Operational Dashboard, Connections - Preferred

Posted 1 week ago

Apply

8.0 - 15.0 years

20 - 25 Lacs

bengaluru

Work from Office

Hungry, Humble, Honest, with Heart.

The Opportunity
We are seeking a detail-oriented and analytical IT Audit Manager to support our SOX compliance and internal audit program as part of our India team. This role focuses on evaluating and testing the design and operating effectiveness of internal controls over financial reporting (ICFR) within the IT environment, and on managing IT-related internal audits and projects. The ideal candidate will possess a strong understanding of SOX requirements, IT general controls (ITGCs) and automated controls, and experience in IT auditing methodologies.

About the Team
The Internal Audit team at Nutanix is a dynamic group that manages both the SOX compliance and internal audit programs. This team thrives on collaboration and innovation, fostering a culture of openness and integrity. The mission of the Internal Audit team is to ensure that the Company's processes are effective and efficient, safeguarding assets while contributing to the achievement of Nutanix's strategic objectives through insightful audits and recommendations. You will report to the Senior Manager - Internal Audit, who is known for their supportive and empowering leadership style. They emphasize professional growth and encourage team members to take ownership of their work while providing guidance when needed.

Your Role
- Develop and execute IT audit plans, conduct risk assessments, and perform IT audit testing independently.
- Understand end-to-end business processes, critical IT systems, and data flows that impact financial reporting.
- Coordinate, perform and review testing of ITGCs, key reports, SOC 1 reports, IPEs, and automated controls.
- Support internal audits related to IT operations, cybersecurity, and data governance, and participate in the development of risk-based audit plans.
- Provide training and support to junior audit staff to develop team capabilities.
- Establish strong relationships with stakeholders to facilitate audit processes and gather insights.
- Monitor and assess audit findings, ensuring timely resolution and follow-up actions.
- Achieve operational goals by streamlining audit processes and improving reporting accuracy within the first year.

What You Will Bring
- Strong understanding of internal audit processes and methodologies, especially IT processes and controls around SaaS applications.
- 8 years of experience with Sarbanes-Oxley (SOX) compliance and testing.
- Knowledge of financial reporting and accounting standards.
- Proficiency in risk assessment and management techniques.
- Excellent analytical and problem-solving skills.
- Effective communication and interpersonal skills.
- Bachelor's degree in Information Systems, Accounting, or Computer Science/Engineering.
- Strong organizational skills with attention to detail.

How we work
This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 3 days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.

Nutanix is an equal opportunity employer. Nutanix is an Equal Employment Opportunity and (in the U.S.) an Affirmative Action employer. Qualified applicants are considered for employment opportunities without regard to race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, protected veteran status, disability status or any other category protected by applicable law. We hire and promote individuals solely on the basis of qualifications for the job to be filled. We strive to foster an inclusive working environment that enables all our Nutants to be themselves and to do great work in a safe and welcoming environment, free of unlawful discrimination, intimidation or harassment. As part of this commitment, we will ensure that persons with disabilities are provided reasonable accommodations. If you need a reasonable accommodation, please let us know by contacting [email protected].

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

bengaluru

Work from Office

You Lead the Way. We've Got Your Back.

At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong.

Manager - Finance Data Governance (India)
As part of the Finance Data Governance Organization (FDG) within Corporate Controllership, this role is responsible for overseeing the end-to-end process of financial regulatory data attestation, ensuring the accuracy, completeness, and traceability of data submitted to regulatory bodies. The ideal candidate will have deep knowledge of financial regulations, a strong command of data governance principles, and proven experience implementing attestation processes in complex, regulated financial environments.

Key Responsibilities:
- Lead the implementation and ongoing execution of the regulatory data attestation framework across Finance.
- Support the establishment of standards, controls, and documentation protocols to ensure consistent and auditable sign-off on regulatory data submissions (e.g., FR Y-9C, CCAR, Basel, BCBS 239).
- Develop and maintain attestation workflows, schedules, templates, and dashboards using an AMEX enterprise tool.
- Perform and document data quality checks, transformation validations, and reconciliation activities in support of attestation readiness.
- Facilitate attestation review sessions with data stakeholders and maintain a clear audit trail of sign-offs.
- Generate and present attestation status reports for leadership and regulatory stakeholders.
- Collaborate with internal audit and compliance teams to ensure policy alignment and audit preparedness.
- Support the integration of data attestation into broader data governance, risk, and compliance frameworks.

Qualifications:
- Bachelor's degree in Finance, Accounting, Information Management, or a related field; advanced degree or certifications (e.g., CA, CPA, CISA, CDMP) preferred.
- 7+ years of work experience in data governance, data quality, regulatory reporting, or audit/compliance roles. Experience in Finance, Banking, or similar industries is a strong plus.
- Strong understanding of data flows, data lineage, and control frameworks across a wide range of architectures/platforms.
- Familiarity with attestation, data certification, or data sign-off practices in regulated environments (e.g., OCC, FRB, etc.).
- Strong relationship and communication skills are essential, as the role requires partnering with multiple groups, e.g., business process owners, Technologies, and Enterprise Data Governance.
- Experience supporting regulatory reporting (e.g., FR 2052a, FR Y-14s, FFIEC 031, FR Y-9C, etc.).
- Self-motivated and proactive, with the ability to manage multiple assignments and projects concurrently within tight deadlines.
- Ability to be flexible, prioritize multiple demands, and effectively manage in a matrix organization.
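To make the data-quality and reconciliation duties above concrete, here is a minimal, hedged pandas sketch that compares a source-system extract with a regulatory submission extract and flags missing or mismatched records before attestation sign-off. The file names, key column, and tolerance are hypothetical placeholders, not details from the posting.

```python
# Minimal reconciliation sketch: compare a source extract with a submission extract
# and flag breaks before attestation sign-off. File names, the key column and the
# tolerance are hypothetical placeholders.
import pandas as pd

TOLERANCE = 0.01  # acceptable absolute difference in reported balances (assumption)

def reconcile(source_path: str = "gl_balances.csv",
              submission_path: str = "regulatory_extract.csv") -> pd.DataFrame:
    source = pd.read_csv(source_path)          # expected columns: account_id, balance
    submission = pd.read_csv(submission_path)  # expected columns: account_id, balance

    merged = source.merge(
        submission, on="account_id", how="outer",
        suffixes=("_source", "_submitted"), indicator=True,
    )

    # Records present in only one of the two extracts are automatic breaks.
    missing = merged["_merge"] != "both"
    # Records present in both extracts break if balances differ beyond the tolerance.
    mismatch = (merged["balance_source"] - merged["balance_submitted"]).abs() > TOLERANCE
    breaks = merged[missing | mismatch]

    return breaks[["account_id", "balance_source", "balance_submitted", "_merge"]]

if __name__ == "__main__":
    breaks = reconcile()
    print(f"{len(breaks)} attestation breaks found")
    print(breaks.head())
```

In practice the break report would feed the audit trail and sign-off dashboard rather than be printed, but the merge-and-compare pattern is the core of the check.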

Posted 1 week ago

Apply

5.0 - 10.0 years

3 - 5 Lacs

bengaluru

Work from Office

About The Role
What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank, which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are: software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, which is among the most sought-after domains today, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions including real-time, micro-batch, batch and analytics solutions in a programmatic way, and be futuristic in building systems which can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks including concepts of serverless data solutions, managing the central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers and all analytics use cases.

Data Governance
The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship and the Data Quality platform. If you've got the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies (see the sketch after this section).
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of which is in the data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing and delivering consumer software experience
- Experience partnering with product or program management teams
- 5+ years of experience in managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
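Purely as an illustration of the PySpark ETL skills this team looks for, the sketch below reads raw transaction data from S3, applies a simple cleansing transformation, and writes a partitioned Parquet output suitable for downstream analytics. The bucket names, paths, and column names are hypothetical placeholders, not Kotak systems.

```python
# Illustrative PySpark ETL sketch: raw S3 data -> cleansed, partitioned Parquet.
# Bucket, path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-transactions-etl").getOrCreate()

# Extract: raw CSV landed by an upstream system (assumed layout).
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/transactions/dt=2024-01-31/")
)

# Transform: type-cast, drop obviously bad rows, derive a partition column.
clean = (
    raw
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("txn_ts", F.to_timestamp("txn_ts"))
    .filter(F.col("account_id").isNotNull() & (F.col("amount") > 0))
    .withColumn("txn_date", F.to_date("txn_ts"))
    .dropDuplicates(["txn_id"])
)

# Load: write Parquet partitioned by date for efficient downstream queries.
(
    clean.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-curated-bucket/transactions/")
)

spark.stop()
```

In a production pipeline a job like this would typically be scheduled by Airflow/MWAA and parameterized by run date rather than hard-coding the input path.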

Posted 1 week ago

Apply