
1361 Data Governance Jobs - Page 47

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5 - 8 years

10 - 17 Lacs

Pune

Work from Office


Hiring for MDM + Java (Immediate Joiner)
Client: Multinational IT company (client name will be discussed during the call)
NOTE: Interviews will be scheduled on Tuesday, 20 May 2025 (11 AM - 4 PM); only interested candidates who can attend then should apply.
Experience: 5 to 8 years
Location: Pune, Maharashtra
Mode: Hybrid
Notice Period: Immediate to 10 days
Salary: 10 - 17 LPA
Payroll: Full-time client payroll

Required:
- 5+ years of experience in Informatica MDM and related tools.
- Strong understanding of MDM architecture and data governance.
- Hands-on experience with Java, SQL, and REST APIs.
- Knowledge of ActiveVOS workflow design and troubleshooting.

Job Description: Informatica MDM Developer

Key Skills & Technologies:
- Informatica MDM (Master Data Management) functionality
- BES (Business Entity Services) external calls
- ActiveVOS (AVOS) workflows
- MDM Hub and components, plus database (Oracle/SQL)
- e360 application development
- Provisioning Tool

MDM Implementation & Development:
- Configure and manage MDM Hub components such as Landing, Staging, Base Objects, and Hierarchies.
- Implement Match & Merge rules for data consolidation.
- Develop and maintain e360 user interfaces for MDM.

BES & API Integration:
- Design and implement BES external calls for data validation and retrieval.
- Integrate BES services with external systems via REST/SOAP APIs.

Workflow & Process Automation:
- Develop, customize, and manage ActiveVOS (AVOS) workflows.
- Implement workflow automation for data stewardship and approvals.

Provisioning & Data Governance:
- Use the Provisioning Tool for data modeling and UI configuration.
- Define data quality compliance rules.
- Develop code for data processing and custom APIs.
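For orientation, here is a minimal Python sketch of the kind of BES REST read call the posting alludes to. The host, ORS name, entity name, query parameters, and credentials are all assumptions made up for illustration; a real Informatica MDM Hub exposes its own entity resources according to its configuration.

```python
import requests

# Hypothetical endpoint: Informatica MDM exposes Business Entity Services
# (BES) as REST resources, but the actual URL pattern, ORS name, and entity
# names depend entirely on the Hub configuration. Everything below is a
# placeholder for illustration.
MDM_BASE = "https://mdm.example.com/cmx/cs/example-ORS"  # assumed host and ORS

def read_party(party_id: str) -> dict:
    """Fetch one business entity record via a BES-style REST read (sketch)."""
    resp = requests.get(
        f"{MDM_BASE}/Party/{party_id}",              # 'Party' is an assumed entity
        params={"systemName": "Admin", "depth": 2},  # assumed typical parameters
        auth=("mdm_user", "mdm_password"),           # placeholder credentials
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(read_party("12345"))
```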

Posted 1 month ago

Apply

5 - 7 years

8 - 14 Lacs

Hyderabad

Work from Office


Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
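Since the role centers on RDF/OWL/SPARQL skills, here is a small, self-contained sketch using the rdflib Python library: it builds a toy ontology fragment and queries it with SPARQL. The namespace and class/property names are invented for illustration and are not the company's actual BFO/CCO-aligned model.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS, OWL

# Invented namespace for the sketch; a real model would use the
# organization's ontology IRIs.
EX = Namespace("http://example.com/ontology/")

g = Graph()
g.bind("ex", EX)

# Declare a class and an object property, then assert one instance.
g.add((EX.Supplier, RDF.type, OWL.Class))
g.add((EX.supplies, RDF.type, OWL.ObjectProperty))
g.add((EX.AcmeCorp, RDF.type, EX.Supplier))
g.add((EX.AcmeCorp, RDFS.label, Literal("Acme Corp")))
g.add((EX.AcmeCorp, EX.supplies, EX.WidgetA))

# SPARQL over the in-memory graph: every supplier and what it supplies.
query = """
    SELECT ?supplier ?product WHERE {
        ?supplier a ex:Supplier ;
                  ex:supplies ?product .
    }
"""
for supplier, product in g.query(query, initNs={"ex": EX}):
    print(supplier, "->", product)
```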

Posted 1 month ago

Apply

5 - 7 years

8 - 14 Lacs

Surat

Work from Office


Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
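The posting also lists SHACL; as a companion to the rdflib sketch above, this hedged example validates a toy record against a SHACL shape with the pyshacl library. The shape and data are invented for illustration.

```python
from pyshacl import validate
from rdflib import Graph

# Invented shape: every ex:Supplier must carry at least one string supplierId.
shapes_ttl = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.com/ontology/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:SupplierShape a sh:NodeShape ;
    sh:targetClass ex:Supplier ;
    sh:property [
        sh:path ex:supplierId ;
        sh:datatype xsd:string ;
        sh:minCount 1 ;
    ] .
"""

data_ttl = """
@prefix ex: <http://example.com/ontology/> .
ex:AcmeCorp a ex:Supplier .   # missing ex:supplierId, so validation fails
"""

shapes = Graph().parse(data=shapes_ttl, format="turtle")
data = Graph().parse(data=data_ttl, format="turtle")

conforms, results_graph, results_text = validate(data, shacl_graph=shapes)
print(conforms)       # False: the required supplierId is missing
print(results_text)   # human-readable validation report
```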

Posted 1 month ago

Apply

6 - 9 years

13 - 18 Lacs

Hyderabad

Work from Office


About The Role
- Extensive expertise in Microsoft 365, Entra ID, and the multi-tenant capabilities of these platforms.
- Expertise in Microsoft's latest multi-tenancy feature set, MTO (Multi-Tenant Organization).
- Expertise in setting up and configuring Microsoft 365 collaboration tools, including Microsoft Teams, SharePoint Online, Viva Engage, and Exchange Online, especially in a multi-tenancy setting.
- Experience with Entra ID and M365 security protocols, compliance, and data governance best practices.
- Experience with Microsoft Purview, especially Data Loss Prevention.
- Prior work with Microsoft's engineering teams on multi-tenancy feature development would be a big plus.
- Familiarity with automation tools such as Power Automate to implement any necessary approval flows for SharePoint or Teams, ensuring our group's existing information-sharing policies are followed.
- Familiarity with Entra ID and M365 licensing models and requirements for MTO.

Primary Skills: Microsoft 365, Entra ID, Power Automate, MTO
Secondary Skills: SharePoint, Microsoft Purview

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
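As a rough illustration of the Entra ID / Microsoft Graph surface this role works with, the sketch below reads the tenant's cross-tenant access policy (the policy object behind cross-tenant collaboration) using MSAL and Graph. The app registration values are placeholders, and the required permission (Policy.Read.All) is an assumption to verify against Graph documentation.

```python
import msal
import requests

# Placeholders for an app registration with application permissions
# (assumed: Policy.Read.All) granted admin consent.
TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "your-app-id"
CLIENT_SECRET = "your-secret"

# Client-credentials flow via MSAL.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

# Read the tenant-wide cross-tenant access policy.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/policies/crossTenantAccessPolicy",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```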

Posted 1 month ago

Apply

3 - 8 years

5 - 12 Lacs

Gurugram

Hybrid


Master Data Management (MDM) - Finance & Accounting

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have experts in more than 50 countries and a robust partner ecosystem of established and startup companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at nttdata.com

Job brief
We seek a competent Specialist to perform Master Data Management (MDM) in an ERP system. The process runs on very tight timelines across multiple source and end systems. Our Specialist should exhibit professionalism, dedication and commitment to timely delivery of services.

Job Responsibilities
- End-to-end ownership of master data management.
- Analyze and triage missing master data issues and work with the respective teams to fix them.
- This job involves MDM for: Product Master creation and maintenance; Client Master creation and maintenance; Vendor Master creation and maintenance; Service Master creation and maintenance.
- Data Governance: review each incoming request for duplication and completeness of data.
- Data Quality: review each record for correctness and completeness.

Minimum Experience, Education and Certifications
- M.Com / B.Com / CA / ICWA
- Requires 3-9 years of relevant experience.

Technical Skills (must have)
- Very good knowledge of relevant usage of master data.
- Very good data analysis skills.
- Problem-solving skills; should be a team player.
- Working knowledge of MS Office and databases.
- SAP ERP.

Soft Skills
- Good communication skills (verbal and written).
- Good interpersonal skills and the ability to self-manage.
- Displays good planning and organizing abilities.
- Demonstrates good attention to detail and is deadline-driven.
- Able to cope with stressful situations and deal with individuals at various levels of the organization.
- Takes initiative and has a solutions-oriented approach.
- Maintains a high standard of accuracy and quality.
- Ability to work independently and be a knowledge expert.
- Comfortable working with targets; patience and the ability to manage stress.
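As a toy illustration of the duplication review described above, here is a minimal Python sketch using the standard library's difflib to flag near-duplicate vendor names. A production MDM process would rely on the ERP's own matching engine; the names and threshold below are invented.

```python
import difflib

# Invented reference data standing in for existing vendor master records.
existing_vendors = ["Acme Industries Pvt Ltd", "Global Tech Supplies", "Sunrise Traders"]

def find_possible_duplicates(new_name: str, threshold: float = 0.85) -> list[tuple[str, float]]:
    """Return existing vendor names whose similarity to new_name exceeds the threshold."""
    hits = []
    for name in existing_vendors:
        # Ratio of matching characters after case-folding; crude but illustrative.
        score = difflib.SequenceMatcher(None, new_name.lower(), name.lower()).ratio()
        if score >= threshold:
            hits.append((name, round(score, 2)))
    return hits

# An incoming request that should be flagged against "Acme Industries Pvt Ltd".
print(find_possible_duplicates("ACME Industries Private Ltd"))
```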

Posted 1 month ago

Apply

9 - 14 years

14 - 18 Lacs

Kolkata

Work from Office


Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.

Your role
- 5+ years of experience in creating data strategy frameworks and roadmaps.
- Relevant experience in data exploration and profiling; involvement in data literacy activities for all stakeholders.
- 5+ years in analytics and data maturity evaluation based on a current as-is vs. to-be framework.
- 5+ years of relevant experience in creating functional requirements documents and enterprise to-be data architecture.
- Relevant experience in identifying and prioritizing use cases for the business; identification of important KPIs and opex/capex for CXOs.
- 2+ years of working knowledge in Data Strategy, Data Governance, MDM, etc.
- 4+ years of experience in Data & Analytics operating models, with a vision spanning prescriptive, descriptive, predictive and cognitive analytics.
- Identify, design, and recommend internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Identify data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to create frameworks for digital twins / digital threads.
- Relevant experience in coordinating with cross-functional teams; act as the SPOC for global master data.

Your profile
- 8+ years of experience in a Data Strategy role, with a graduate degree in Computer Science, Informatics, Information Systems, or another quantitative field, plus experience with the following software/tools:
- Understanding of big data tools: Hadoop, Spark, Kafka, etc.
- Understanding of relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB.
- Understanding of data pipeline and workflow management tools: Luigi, Airflow, etc.
- Good-to-have cloud skill sets (Azure / AWS / GCP).
- 5+ years of advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases: Postgres / SQL / Mongo.
- 2+ years of working knowledge in Data Strategy, Data Governance, MDM, etc.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable big data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
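As a small illustration of the pipeline/workflow tooling named above (Airflow), here is a minimal DAG sketch. The DAG id, task names, schedule, and logic are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; real tasks would call extract/transform/load logic.
def extract():
    print("pull source data")

def transform():
    print("clean and conform to the enterprise data model")

def load():
    print("publish to the analytics store")

with DAG(
    dag_id="example_data_strategy_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```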

Posted 1 month ago

Apply

2 - 5 years

1 - 6 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office


Officer/Executive - Master Data Management
Mumbai, India (Hybrid)

We are seeking a detail-oriented and highly skilled ERP Master Data Management (MDM) Officer to join our team. In this role, you will be responsible for managing, maintaining, and optimizing master data within our ERP system (Workday Finance) for multiple entities around the globe. You will work closely with cross-functional teams and colleagues around the world to ensure data integrity, improve processes, and support the implementation of best practices for data governance.

Role & responsibilities:
- Master Data Management: oversee the creation, maintenance, and quality of master data within the ERP system, ensuring consistency and accuracy across business functions.
- Data Governance: enforce data governance policies and procedures to ensure master data meets business rules and compliance standards.
- ERP System Optimization: collaborate with IT and business teams to optimize ERP system configurations and enhance the efficiency of data entry, data processing, and reporting.
- Data Cleansing & Standardization: identify, analyze, and resolve data quality issues, implementing corrective actions to improve data consistency, completeness, and accuracy.
- Cross-Functional Collaboration: work closely with business departments (Finance, Reservations, Controlling, etc.) to understand their data needs and ensure proper data integration and synchronization across functions.
- Data Migration Support: assist in the migration of master data during system upgrades, new module implementations, or ERP system changes, ensuring a smooth transition and accurate data transfer.
- Reporting & Analytics: prepare regular reports and analytics to track master data quality, data governance adherence, and system performance.
- Continuous Improvement: drive initiatives to continuously improve master data management processes, tools, and policies, ensuring best practices are followed throughout the organization.

Preferred candidate profile:
- Minimum of 2 years of work experience, preferably in master data management, data governance, or data management roles, plus other finance and accounting roles, in order to understand the impact of master data, financial terminology, and the basics of payment transactions. Familiarity with Workday is highly preferred.
- Technical skills: strong understanding of data management principles, including data quality, data validation, and data governance; experience with data migration and integration projects within ERP systems; proficiency in MS Excel.
- Soft skills: strong problem-solving skills and attention to detail; excellent communication skills with the ability to work collaboratively across teams; ability to manage multiple tasks and prioritize effectively in a dynamic environment.
- Preferred qualifications: experience with ERP implementations or upgrades; knowledge of industry-specific best practices in master data management; certifications in ERP systems or data management (e.g., SAP Master Data Governance certification, DAMA certification).

What we offer:
- International team with a global office network.
- Modern office and dynamic work environment in Mumbai.
- Great team spirit.
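As a toy illustration of the data quality checks this role performs, here is a short pandas sketch flagging incomplete and duplicated master records. Column names and data are invented.

```python
import pandas as pd

# Invented stand-in for an extract of supplier master records.
records = pd.DataFrame(
    {
        "supplier_id": ["S001", "S002", "S002", "S004"],
        "supplier_name": ["Acme Ltd", "Globex", "Globex", None],
        "country": ["IN", "DE", "DE", "US"],
    }
)

# Completeness: required fields must not be null.
missing = records[records["supplier_name"].isna()]

# Uniqueness: supplier_id must not repeat.
dupes = records[records.duplicated(subset="supplier_id", keep=False)]

print(f"{len(missing)} record(s) missing supplier_name")
print(f"{len(dupes)} record(s) sharing a supplier_id")
```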

Posted 1 month ago

Apply

5 - 7 years

8 - 14 Lacs

Bengaluru

Work from Office


Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.

Posted 1 month ago

Apply

7 - 9 years

7 - 17 Lacs

Hyderabad

Work from Office


Senior Lead Data Management Analyst - Home Lending Data & Analytics Team

About this role:
Wells Fargo is seeking a Senior Lead Data Management Analyst to join our Data Insights and Analytics (DIA) team. This role provides support as an Application Business Owner (ABO) for Home Lending applications and compliance with Enterprise policies related to both data and application management. The DIA team fulfills the Enterprise-defined ABO role. ABO is a required role for all applications registered in the company's Remedy Asset Management system. The Application Business Owner has a variety of roles and responsibilities which are not static: they must adapt to changes in policy and application requirements, but at the heart of the role is a responsibility to drive compliance activities for their applications, including compliance with data management; application lifecycle management; records management; access management; information security; risk & regulatory reporting; business continuity; and RCSA policies. The DIA team also helps connect Home Lending data to the broader Enterprise.

In this role, you will:
- Manage requests and perform highly complex support functions that evidence compliance with 14+ Enterprise policies and procedures.
- Build relationships with business partners at all levels both within and outside of Home Lending.
- Collaborate with functional business partners, leaders, and executive management to provide support for strategic initiatives related to data management and risk mitigation.
- Proactively identify potential risks when implementing change, and develop mitigation strategies and plans.
- Lead cross-organizational initiatives to review, define, enhance, and document our processes, procedures, and controls.
- Manage stakeholder relations with key partner groups and various teams across the organization.
- Establish and execute program change management processes to support controls, implementation, and adoption.
- Develop strategy and resolution for highly complex and unique challenges that require solid analytical skills, extensive knowledge of business execution and the application inventory, and of the business units utilizing that inventory, to deliver longer-term and larger-scale solutions.
- Lead team and stakeholder meetings to facilitate decision-making and drive adoption of and adherence to program requirements.
- Strategically engage with all levels of professionals and managers across multiple lines of business and serve as an expert advisor to leadership.
- Provide strategic solutions to the business.

Required Qualifications:
- 7+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or the equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Proficiency in implementing a unified data strategy that aligns with the overall business objectives of Home Lending; guide the team on leveraging data assets effectively while ensuring adherence to industry standards and best practices.
- Champion a comprehensive data quality framework to ensure high standards of data integrity, consistency, and accuracy across all data systems and processes.
- Serve as the primary liaison between Data Management, BIA, Tech, and Product; work cross-functionally to ensure data strategy and quality initiatives are aligned with broader business goals.
- Continuously evaluate and refine data management processes to enhance operational efficiency; innovate and implement automated solutions enabling scalable and agile data management and governance practices.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to navigate ambiguity and maintain momentum, driving toward results and clarity in uncertain situations.
- Knowledge of and experience with data-related tools: ADMF, Collibra, Ab Initio, JIRA, SHRP.
- Knowledge of and experience with key Home Lending applications: MSP, LO SHAW, CORE, CMIE, etc.
- Working knowledge of the audit life cycle, standards, practices and testing strategies.
- Knowledge of third-party risk management and regulatory compliance; understanding of operational oversight for third-party execution and third-party regulations.
- Knowledge of enterprise risk management framework concepts, including risk identification, risk appetite and strategy, risk-related decisions, processes and controls, risk analytics and governance.
- Familiarity with formal project and change management processes, particularly Agile and similar methodologies.
- Proven and demonstrated leadership skills, including relationship building, partnering and collaboration, with a clear ability to influence, gain buy-in and negotiate with a diverse group of key business partners/stakeholders including senior management.
- Strong analytical skills with keen attention to detail and the ability to draw conclusions and translate findings.
- Ability to grasp complex business issues quickly, recommend solutions, and drive resolutions.
- Facilitation skills, including the ability to facilitate decision-making and broker agreements among diverse, differing, and/or conflicting perspectives and priorities.
- Ability to manage diverse relationships and foster a collaborative team dynamic with Risk Management, Data Governance, Corporate Finance, and other key stakeholders at all levels of the organization.
- Advanced proficiency in Microsoft Office (Word, Excel, Outlook and PowerPoint).
- Proficient with delivering against: RCSA Policy; Data Management Policy and Procedures; COSO Policy; Regulatory Reporting Governance & Oversight Policy; Records and Information Management Policy; Information Security Identity and Access Management Domain Policy.

Job Level: P5.

Posted 1 month ago

Apply

4 - 9 years

20 - 30 Lacs

Chennai

Work from Office


Key Responsibilities
- Develop and implement data governance policies, standards, and processes to ensure data integrity, quality, and security.
- Collaborate with cross-functional teams to embed governance best practices into data pipelines, analytics, and BI reporting.
- Define and monitor data quality metrics and KPIs.
- Set up data catalogs, classification, and metadata management for enhanced discoverability and compliance.
- Partner with IT, Security, and Compliance teams to ensure regulatory and policy adherence (e.g., GDPR, HIPAA).
- Leverage tools and technologies like SQL, Pandas Profiling, and Python to enhance data quality and governance workflows.
- Act as a subject matter expert on data governance strategies and tools.

Skills and Experience
- Bachelor's/Master's degree in Data Science, Information Management, Computer Science, or a related field.
- 8+ years of experience in data governance, data quality management, or BI reporting roles.
- Knowledge of data governance tools such as Collibra, OpenMetadata, DataHub, and Informatica.
- Proficiency in SQL, with hands-on experience in data profiling tools (e.g., Pandas Profiling).
- Strong understanding of data lifecycle management, privacy laws, and compliance frameworks.
- Excellent leadership, communication, and stakeholder management skills.
- Analytical mindset with experience in measuring and reporting on data quality KPIs.
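As a small illustration of the profiling workflow named above, here is a hedged sketch using ydata-profiling, the successor package to Pandas Profiling. The input file name is a placeholder.

```python
import pandas as pd
from ydata_profiling import ProfileReport  # successor to pandas-profiling

# Placeholder extract; any tabular dataset works for a profile run.
df = pd.read_csv("customer_master.csv")

# Generate an HTML profiling report (minimal mode keeps it fast on wide data).
profile = ProfileReport(df, title="Customer Master Data Quality Profile", minimal=True)
profile.to_file("customer_master_profile.html")

# Quick KPI-style completeness check alongside the full report.
null_rate = df.isna().mean().round(3)
print(null_rate.sort_values(ascending=False).head())
```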

Posted 1 month ago

Apply

4 - 9 years

5 - 15 Lacs

Gurugram, Bengaluru

Hybrid


Role & responsibilities:
- Enhance and standardize the Collibra data glossary.
- Link assets, logical models, and processes within Collibra.
- Update and verify data lineage and quality metrics.
- Build reporting and dashboards in Collibra for governance oversight.
- Create SOPs or process documents so everyone can align.

Preferred candidate profile:
- Identifies gaps and recommends improvements without requiring constant oversight.
- Documents processes, decisions, and the work being completed in Collibra.
- Has experience using Collibra.
- Is familiar with data models and can link them to Collibra assets.
- Experience with Confluence; will likely be creating SOPs and process documents in Confluence.

Mandate: minimum 4 years of working experience.
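As a rough sketch of programmatic work against Collibra, the example below queries assets via a REST call. The host, credentials, endpoint path, and parameters are assumptions modeled on Collibra's documented /rest/2.0 Core API style and should be verified against your instance's API documentation before use.

```python
import requests

# Assumed instance URL; replace with your Collibra environment.
COLLIBRA = "https://yourinstance.collibra.com"

session = requests.Session()
session.auth = ("api_user", "api_password")  # placeholder credentials

# Assumed endpoint/parameters in the style of Collibra's Core REST API;
# verify names like nameMatchMode against your version's docs.
resp = session.get(
    f"{COLLIBRA}/rest/2.0/assets",
    params={"name": "Customer", "nameMatchMode": "ANYWHERE", "limit": 10},
    timeout=30,
)
resp.raise_for_status()
for asset in resp.json().get("results", []):
    print(asset["id"], asset["name"])
```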

Posted 1 month ago

Apply

5 - 7 years

10 - 16 Lacs

Pune

Hybrid


Lead Data Engineer

Experience: 7 - 10 years
Salary: up to INR 25 Lacs per annum
Preferred Notice Period: within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Hybrid (Pune)
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: AWS Glue, Databricks, Azure Data Factory, SQL, Python, Data Modelling, ETL
Good-to-have skills: Big Data Pipelines, Data Warehousing

Forbes Advisor (one of Uplers' clients) is looking for a Lead Data Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview
Position: Lead Data Engineer (Databricks)
Location: Pune, Ahmedabad
Required Experience: 7 to 10 years
Preferred: immediate joiners

Job Overview: We are looking for an accomplished Lead Data Engineer with expertise in Databricks to join our dynamic team. This role is crucial for enhancing our data engineering capabilities, and it offers the chance to work with advanced technologies, including Generative AI.

Key Responsibilities:
- Lead the design, development, and optimization of data solutions using Databricks, ensuring they are scalable, efficient, and secure.
- Collaborate with cross-functional teams to gather and analyse data requirements, translating them into robust data architectures and solutions.
- Develop and maintain ETL pipelines, leveraging Databricks and integrating with Azure Data Factory as needed.
- Implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation.
- Ensure data quality, governance, and security practices are adhered to, maintaining the integrity and reliability of data solutions.
- Provide technical leadership and mentorship to junior engineers, fostering an environment of learning and growth.
- Stay updated on the latest trends and advancements in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7 to 10 years of experience in data engineering, with a focus on Databricks.
- Proven expertise in building and optimizing data solutions using Databricks and integrating with Azure Data Factory / AWS Glue.
- Proficiency in SQL and programming languages such as Python or Scala.
- Strong understanding of data modelling, ETL processes, and Data Warehouse / Data Lakehouse concepts.
- Familiarity with cloud platforms, particularly Azure, and containerization technologies such as Docker.
- Excellent analytical, problem-solving, and communication skills.
- Demonstrated leadership ability with experience mentoring and guiding junior team members.

Preferred Skills:
- Experience with Generative AI technologies and their applications.
- Familiarity with other cloud platforms, such as AWS or GCP.
- Knowledge of data governance frameworks and tools.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meeting the client for the interview!

About Our Client: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement, navigating industries with data, cloud, and AI services and solutions. We dedicate our resources to increasing efficiency and gaining greater competitive advantage by leveraging various next-generation technologies.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this one on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
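For orientation, here is a minimal PySpark sketch of the kind of ETL step this role owns. Paths and column names are placeholders, and writing Delta assumes Delta Lake is available on the cluster (as it is on Databricks, where the SparkSession also already exists).

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks, `spark` is provided; building one here keeps the sketch
# self-contained for local experimentation.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Placeholder source path and schema-by-header read.
raw = (
    spark.read.option("header", True)
    .csv("/mnt/raw/orders.csv")
)

# Basic cleansing: dedupe on the business key, type the timestamp,
# and keep only positive amounts.
clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount").cast("double") > 0)
)

# Write as Delta for downstream consumption (requires Delta Lake on the cluster).
clean.write.format("delta").mode("overwrite").save("/mnt/curated/orders")
```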

Posted 1 month ago

Apply

2 - 6 years

8 - 13 Lacs

Noida

Work from Office


Data Analyst III

Who We Are
Brightly, a Siemens company, is the global leader in intelligent asset management solutions. Brightly enables organizations to transform the performance of their assets with a sophisticated cloud-based platform that leverages several years of data to deliver predictive insights that help users through the key phases of the entire asset lifecycle. More than 12,000 clients of every size worldwide depend on Brightly's complete suite of intuitive software, including CMMS, EAM, Strategic Asset Management, Sustainability and Community Engagement. Paired with award-winning training, support and consulting services, Brightly helps light the way to a bright future with smarter assets and sustainable operations.

About The Job
The Business Intelligence (BI) Analyst and Report Development professional for Brightly is a lead specialist in our Analytics and BI Services team, responsible for building, testing, and maintaining software product embedded reports, charts and dashboards in Power BI and/or Qlik. This position will also partner with and guide other product report writers and end users in the development of their own reports. By providing best-in-class enterprise reporting, the report writer directly contributes to Brightly's objective to differentiate with data.

What You'll Be Doing
- Address reporting needs of applications by modernizing and building new embedded reports using Power BI or, in some cases, Qlik Cloud.
- Develop appropriate semantic models and business views, generate calculated fields based on application-specific business logic, and implement row-level security (RLS) in application reports and dashboards.
- Support the end-user community in the use of business intelligence tools and the creation of ad-hoc reports.
- Maintain ongoing technical documentation for Brightly BI Services sustainability and scale, including data sources, logic, processes, and limitations.
- Work closely with multiple stakeholders such as Product Management, Analytics, Design, and Data Cloud teams.
- Follow and influence reporting and data quality change control processes for proper configuration and application change management that will impact reports.

What You Need
- A Bachelor's degree in Business, Programming, Business Intelligence, Computer Science or a related field.
- A minimum of 6 years of experience developing reports in Power BI (some may be with similar tools) and familiarity with embedding reports in applications.
- Proficiency in SQL, with experience in querying and joining tabular data structures, database management, and creating new variables required for reports.
- Expertise in building intuitive, interactive dashboards and pixel-perfect reports / Power BI paginated reporting.
- Advanced knowledge of Power BI Desktop reporting, including all sub-components such as Power Query, semantic data modelling, DAX and visualizations.
- Strong experience with and knowledge of Power BI Services (e.g., Gateway, B2B applications, Workspaces).
- Willingness to learn general international data security issues and follow data governance practices.
- Ability to communicate and collaborate in a remote team setting, including reading, writing, and speaking English.
- Ability to manage multiple priorities and adjust quickly to changing requirements and priorities.
- Performs other related duties as assigned.

The Brightly Culture
Service. Ingenuity. Integrity. Together. These values are core to who we are and help us make the best decisions, manage change, and provide the foundations for our future. These guiding principles help us innovate, flourish and make a real impact in the businesses and communities we help to thrive. We are committed to the great experiences that nurture our employees and the people we serve while protecting the environments in which we live. Together we are Brightly.

Posted 1 month ago

Apply

4 - 9 years

25 - 30 Lacs

Hyderabad

Work from Office


Overview
We are seeking an Associate Manager - Offshore Program & Delivery Management to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. The role requires expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to enhance operational excellence.
- Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations.
- Support the standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency.
- Manage and support DataOps programs, ensuring alignment with business objectives, data governance standards, and the enterprise data strategy.
- Assist in real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance.
- Contribute to the development and enforcement of governance models and operational frameworks to streamline service delivery and execution roadmaps.
- Assist in proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate.

Responsibilities
- Support DataOps and SRE operations, assisting in offshore delivery of DataOps, BIOps, Data IntegrationOps, and related initiatives.
- Assist in implementing governance frameworks, tracking KPIs, and ensuring adherence to operational SLAs.
- Contribute to process standardization and automation efforts, improving service efficiency and scalability.
- Collaborate with onshore teams and business stakeholders, ensuring alignment of offshore activities with business needs.
- Monitor and optimize resource utilization, leveraging automation and analytics to improve productivity.
- Support continuous improvement efforts, identifying operational risks and ensuring compliance with security and governance policies.
- Assist in managing day-to-day DataOps activities, including incident resolution, SLA adherence, and stakeholder engagement.
- Participate in Agile work intake and management processes, contributing to strategic execution within data platform teams.
- Provide operational support for cloud infrastructure and data services, ensuring high availability and performance.
- Document and enhance operational policies and crisis management functions, supporting rapid incident response.
- Promote a customer-centric approach, ensuring high service quality and proactive issue resolution.
- Assist in team development efforts, fostering a collaborative and agile work environment.
- Adapt to changing priorities, supporting teams in maintaining focus on key deliverables.

Qualifications
- 6+ years of technology experience in a global organization, preferably in the CPG industry.
- 4+ years of experience in Data & Analytics, with a foundational understanding of data engineering, data management, and operations.
- 3+ years of cross-functional IT experience, working with diverse teams and stakeholders.
- 1-2 years of leadership or coordination experience, supporting team operations and service delivery.
- Strong communication and collaboration skills, with the ability to convey technical concepts to non-technical audiences.
- Customer-focused mindset, ensuring high-quality service and responsiveness to business needs.
- Experience supporting technical operations for enterprise data platforms, preferably in a Microsoft Azure environment.
- Basic understanding of Site Reliability Engineering (SRE) practices, including incident response, monitoring, and automation.
- Ability to drive operational stability, supporting proactive issue resolution and performance optimization.
- Strong analytical and problem-solving skills, with a continuous improvement mindset.
- Experience working in large-scale, data-driven environments, ensuring smooth operations of business-critical solutions.
- Ability to support governance and compliance initiatives, ensuring adherence to data standards and best practices.
- Familiarity with data acquisition, cataloging, and data management tools.
- Strong organizational skills, with the ability to manage multiple priorities effectively.
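As a toy illustration of the "automated alerting and self-healing" theme above, here is a minimal Python retry-and-remediate loop. The health check, restart action, and alert hook are placeholders for real platform calls (e.g., an orchestrator or monitoring REST API).

```python
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dataops-monitor")

def check_pipeline() -> bool:
    """Placeholder health probe; a real check would query the orchestrator."""
    return False

def restart_pipeline() -> None:
    """Placeholder remediation; a real version would call the platform API."""
    log.info("restart issued")

def monitor(max_retries: int = 3, backoff_s: float = 5.0) -> None:
    """Retry a remediation a few times, then escalate to a human."""
    for attempt in range(1, max_retries + 1):
        if check_pipeline():
            log.info("pipeline healthy")
            return
        log.warning("unhealthy (attempt %d/%d); attempting self-heal", attempt, max_retries)
        restart_pipeline()
        time.sleep(backoff_s * attempt)  # linear backoff between retries
    log.error("self-heal failed; escalating (placeholder for a paging/alert hook)")

monitor()
```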

Posted 1 month ago

Apply

5 - 8 years

20 - 25 Lacs

Hyderabad

Work from Office


Job Description
Blend is hiring a Senior Data Scientist (Generative AI) to spearhead the development of advanced AI-powered classification and matching systems on Databricks. You will contribute to flagship programs like the Diageo AI POC by building RAG pipelines, deploying agentic AI workflows, and scaling LLM-based solutions for high-precision entity matching and MDM modernization.

Key Responsibilities
- Design and implement end-to-end AI pipelines for product classification, fuzzy matching, and deduplication using LLMs, RAG, and Databricks-native workflows.
- Develop scalable, reproducible AI solutions within Databricks notebooks and job clusters, leveraging Delta Lake, MLflow, and Unity Catalog.
- Engineer Retrieval-Augmented Generation (RAG) workflows using vector search and integrate them with Python-based matching logic.
- Build agent-based automation pipelines (rule-driven + GenAI agents) for anomaly detection, compliance validation, and harmonization logic.
- Implement explainability, audit trails, and governance-first AI workflows aligned with enterprise-grade MDM needs.
- Collaborate with data engineers, BI teams, and product owners to integrate GenAI outputs into downstream systems.
- Contribute to modular system design and documentation for long-term scalability and maintainability.

Qualifications
- Bachelor's/Master's in Computer Science, Artificial Intelligence, or a related field.
- 5+ years of overall Data Science experience with 2+ years in Generative AI / LLM-based applications.
- Deep experience with Databricks.
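As a toy illustration of the retrieval step in such a RAG-style matching pipeline, the sketch below ranks master records by cosine similarity of embeddings. The 4-dimensional vectors are made-up stand-ins; a real pipeline would use an embedding model and a vector index (e.g., Databricks vector search).

```python
import numpy as np

# Invented master records and placeholder "embeddings" standing in for
# model-generated vectors.
master_records = ["Johnnie Walker Black Label 750ml", "Smirnoff Red 1L"]
master_vecs = np.array([[0.9, 0.1, 0.0, 0.2],
                        [0.1, 0.8, 0.3, 0.0]])

query = "J. Walker Black 75cl"
query_vec = np.array([0.85, 0.15, 0.05, 0.25])  # placeholder embedding

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank master records by similarity and take the best candidate match.
scores = [cosine(query_vec, v) for v in master_vecs]
best = int(np.argmax(scores))
print(f"best match for {query!r}: {master_records[best]} (score={scores[best]:.3f})")
```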

Posted 1 month ago

Apply

12 - 15 years

50 - 55 Lacs

Bengaluru

Work from Office


SAP MDG Lead (ideally 10+ years of experience), responsible for delivering high-quality MDG solutions to customers at every stage, from problem definition through diagnosis to solution design, development and deployment.
- Review proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues.
- Provide expert advice on master data governance topics, with in-depth techno-functional knowledge of end-to-end SAP MDG implementations.
- Experienced in leading multiple MDG delivery assignments involving full life cycle implementations (solution blueprinting, fit-gap analysis, product configurations, data migration, cutover).
- Hands-on SAP MDG knowledge of configurations and customizations for Data Modeling, UI Modeling, Process Modeling, Workflows, Data Replication Framework, Key Mapping, rules and derivations, and BRF+.
- Must have a good understanding of the SAP MDG technical framework: BADI, BAPI/RFC/FM, Workflows, BRF+, Enterprise Services, IDoc, Floorplan Manager, WebDynPro, Fiori, and the MDG API framework.
- Strong client engagement and coordination skills, with good knowledge of SAP cross-functional master data (Material, Customer, Supplier, FI objects) and data quality tools.

Mandatory skills: SAP MDG (hands-on).

Posted 1 month ago

Apply

3 - 6 years

6 - 10 Lacs

Pune

Work from Office


Allata is a fast-growing technology strategy and data development consulting firm delivering scalable solutions to our enterprise clients. Our mission is to inspire our clients to achieve their most strategic goals through uncompromised delivery, active listening, and personal accountability. We are a group who thrive in fast-paced environments, working on complex problems, continually learning, and working alongside colleagues to be better together.

We are seeking a skilled Data Engineer with strong expertise in Databricks and Delta Live Tables to guide and drive the evolution of our clients' data ecosystems. The ideal candidate shows strong technical leadership and owns hands-on execution, leading the design, migration, and implementation of robust data solutions while mentoring team members and collaborating with stakeholders to achieve enterprise-wide analytics goals.

Key Responsibilities
- Collaborate in defining the overall architecture of the solution, with experience in designing, building, and maintaining reusable data products using Databricks, Delta Live Tables (DLT), PySpark, and SQL.
- Migrate existing data pipelines to modern frameworks and ensure scalability and efficiency.
- Develop the data infrastructure, pipeline architecture, and integration solutions while actively contributing to hands-on implementation.
- Build and maintain scalable, efficient data processing pipelines and solutions for data-driven applications.
- Monitor and ensure adherence to data security, privacy regulations, and compliance standards.
- Troubleshoot and resolve complex data-related challenges and incidents in a timely manner.
- Stay at the forefront of emerging trends and technologies in data engineering and advocate for their integration when relevant.

Required Skills & Qualifications
- Proven expertise in Databricks, Delta Live Tables, SQL, and PySpark for processing and managing large data volumes.
- Strong experience in designing and implementing dimensional models and medallion architecture.
- Strong experience in designing and migrating existing Databricks workspaces and models to Unity Catalog-enabled workspaces.
- Strong experience creating and managing group Access Control Lists (ACLs) and compute and governance policies in Databricks Unity Catalog.
- Hands-on experience with modern data pipeline tools (e.g., AWS Glue, Azure Data Factory) and cloud platforms (e.g., Databricks).
- Knowledge of cloud data lakes (e.g., Databricks Delta Lake, Azure Storage, and/or AWS S3).
- Demonstrated experience applying DevOps principles, using version control and CI/CD for IaC and code-base deployments (e.g., Azure DevOps, Git), to data engineering projects.
- Strong experience with batch and streaming data processing techniques and file compaction strategies.

Nice-to-Have Skills
- Familiarity with architectural best practices for building data lakes.
- Hands-on experience with additional Azure services, including message queues, Service Bus, cloud storage, virtual cloud, serverless compute, Cloud SQL, and OOP languages and frameworks.
- Experience with BI tools (e.g., Power BI, Tableau) and deploying data models.
- Experience with Databricks Unity Catalog, i.e., configuring and managing data governance and access controls in a Delta Lake environment.
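Since the posting centers on Delta Live Tables, here is a hedged sketch of a two-table DLT pipeline with a quality expectation. It only runs inside a Databricks DLT pipeline (where the `dlt` module and `spark` session are provided); the source path and rule are placeholders.

```python
# Runs only inside a Databricks Delta Live Tables pipeline, not standalone.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from cloud storage (bronze).")
def orders_bronze():
    # Placeholder landing path; `spark` is provided by the DLT runtime.
    return spark.read.format("json").load("/mnt/landing/orders/")

@dlt.table(comment="Cleaned orders (silver) with a basic quality gate.")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # rows failing this are dropped
def orders_silver():
    return (
        dlt.read("orders_bronze")
        .dropDuplicates(["order_id"])
        .withColumn("ingested_at", F.current_timestamp())
    )
```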

Posted 1 month ago

Apply

3 - 15 years

5 - 17 Lacs

Bengaluru

Work from Office


We help the world run better.

This position is part of the SAP Returnship Program, seeking applicants who have been on a career break for a year or more and want to return to the workplace.

What you'll do
As a Data Management Consultant at the Success Delivery Center, your role will be to support customers in their digital transformation journey by implementing Data Management solutions covering Data Migration and Master Data Governance. As a techno-functional consultant, you will be part of project teams delivering SAP implementations for customers. You need to be hands-on and well versed in the solutions, with good communication skills to participate in business discussions. A functional understanding of Data Management and prior development experience will be an added advantage. While there may be travel based on customer needs, the focus will be on delivering remotely and offshore. You will own or acquire relevant SAP Business AI skills to position and deliver SAP's AI offerings to our customers, and enhance the adoption and consumption of various SAP AI offerings in customer use cases.

What you bring
- 3-15 years of stable professional experience in SAP with data management, and good development experience in ABAP.
- Experience in one or more of the following areas: SAP MDG, Data Migration, S/4HANA, SAP Data Integration.
- Experience in one or more of the following MDG implementation / data migration / data integration solutions: SAP Data Services, Migration Cockpit, SDI, SLT, Datasphere integration.
- Good functional knowledge of SAP modules such as MM, SD, PP, and FICO.
- Good written and verbal communication, and the ability to interact with diverse groups of people.
- Proven track record of performance, with a stable stint at the previous company.

Meet your team
The Data Management solution area within BTP Delivery @ Scale is a strong 100-plus-member team delivering engagements across a wide range of Data Management topics: Data Migration, Data Integration, Data Engineering, Data Governance and Data Quality.

Location: Bangalore / Gurgaon

Posted 1 month ago

Apply

8 - 13 years

25 - 30 Lacs

Ahmedabad

Work from Office


Title: Senior Data Platform Engineer
Location: fully remote
Working hours: presence from 9 pm to 12 am is a must; the rest of the hours are flexible during the daytime.

About the Role:
We are seeking a hands-on Data Platform Engineer to lead the design, development, and maintenance of our scalable data infrastructure. You'll build data pipelines, manage our data lake, and enable reliable analytics and reporting across the business.

Key Responsibilities:
- Design and develop robust data pipelines using Python, Spark, and big data tools.
- Own and manage our AWS-based data lake (S3, Glue, Athena, Lambda, EMR).
- Build and orchestrate ETL workflows with Airflow.
- Develop and manage Kafka-based data streaming platforms.
- Ensure data reliability, scalability, and performance.
- Collaborate with analysts, scientists, and stakeholders for data-driven decisions.
- Promote best practices in data governance and quality.
- Enable BI integration with tools like Tableau.
- Mentor junior team members.

Required Skills:
- 8+ years in data engineering or platform roles.
- Strong Python/Java and advanced SQL skills.
- Expertise in Airflow, Spark/PySpark, and the AWS data stack.
- Experience with large-scale ETL and real-time streaming (Kafka).
- Solid grasp of data warehousing, governance, and big data systems.

Nice to Have:
- Tableau or BI tools experience.
- Familiarity with open-source data platforms (Druid, Iceberg, Nessie).
- Knowledge of data observability, cataloging, and quality frameworks.
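As a small illustration of the Kafka streaming work described above, here is a minimal producer sketch using the kafka-python client. The broker address, topic name, and event fields are placeholders.

```python
import json
from kafka import KafkaProducer  # kafka-python package

# Placeholder broker; a real deployment would list the cluster's brokers.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),  # JSON-encode events
)

# Invented event payload and topic name for the sketch.
event = {"order_id": "A-1001", "amount": 49.5, "status": "CREATED"}
producer.send("orders-events", value=event)
producer.flush()  # block until the buffered record is actually delivered
```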

Posted 1 month ago

Apply

3 - 5 years

20 - 25 Lacs

Hyderabad

Work from Office


Blend is hiring a Senior Data Scientist (Generative AI) to spearhead the development of advanced AI-powered classification and matching systems on Databricks. You will contribute to flagship programs like the Diageo AI POC by building RAG pipelines, deploying agentic AI workflows, and scaling LLM-based solutions for high-precision entity matching and MDM modernization.

Key Responsibilities
- Design and implement end-to-end AI pipelines for product classification, fuzzy matching, and deduplication using LLMs, RAG, and Databricks-native workflows.
- Develop scalable, reproducible AI solutions within Databricks notebooks and job clusters, leveraging Delta Lake, MLflow, and Unity Catalog.
- Engineer Retrieval-Augmented Generation (RAG) workflows using vector search and integrate them with Python-based matching logic.
- Build agent-based automation pipelines (rule-driven + GenAI agents) for anomaly detection, compliance validation, and harmonization logic.
- Implement explainability, audit trails, and governance-first AI workflows aligned with enterprise-grade MDM needs.
- Collaborate with data engineers, BI teams, and product owners to integrate GenAI outputs into downstream systems.
- Contribute to modular system design and documentation for long-term scalability and maintainability.

Qualifications
- Bachelor's/Master's in Computer Science, Artificial Intelligence, or a related field.
- 5+ years of overall Data Science experience with 2+ years in Generative AI / LLM-based applications.
- Deep experience with Databricks.
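The posting names MLflow alongside Delta Lake and Unity Catalog; as a toy illustration, this sketch logs a matching run's parameters and metrics with MLflow. Names and values are invented, and on Databricks the tracking URI is assumed to be preconfigured.

```python
import mlflow

# Log an evaluation run for an entity-matching pipeline. All parameter and
# metric names/values below are placeholders for illustration.
with mlflow.start_run(run_name="entity_matching_eval"):
    mlflow.log_param("embedding_model", "placeholder-embedder")
    mlflow.log_param("similarity_threshold", 0.82)
    mlflow.log_metric("match_precision", 0.94)
    mlflow.log_metric("match_recall", 0.88)

print("run logged; inspect it in the MLflow UI")
```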

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Bengaluru

Work from Office


About this opportunity:
This position plays a crucial role in the development of Python-based solutions, their deployment within a Kubernetes-based environment, and ensuring smooth data flow for our machine learning and data science initiatives. The ideal candidate will possess a strong foundation in Python programming; hands-on experience with ElasticSearch, Logstash, and Kibana (ELK); a solid grasp of fundamental Spark concepts; and familiarity with visualization tools such as Grafana and Kibana. Furthermore, a background in MLOps and expertise in both machine learning model development and deployment will be highly advantageous.

What you will do:
- Python Development: write clean, efficient, and maintainable Python code to support data engineering tasks, including data collection, transformation, and integration with machine learning models.
- Data Pipeline Development: design, develop, and maintain robust data pipelines that efficiently gather, process, and transform data from various sources into a format suitable for machine learning and data science tasks, using the ELK stack, Python, and other leading technologies.
- Spark: apply basic Spark concepts for distributed data processing when necessary, optimizing data workflows for performance and scalability.
- ELK Integration: utilize ElasticSearch, Logstash, and Kibana (ELK) for data management, data indexing, and real-time data visualization. Knowledge of OpenSearch and the related stack would be beneficial.
- Grafana and Kibana: create and manage dashboards and visualizations using Grafana and Kibana to provide real-time insights into data and system performance.
- Kubernetes Deployment: deploy data engineering solutions and machine learning models to a Kubernetes-based environment, ensuring security, scalability, reliability, and high availability.

What you will bring:
- Machine Learning Model Development: collaborate with data scientists to develop and implement machine learning models, ensuring they meet performance and accuracy requirements.
- Model Deployment and Monitoring: deploy machine learning models and implement monitoring solutions to track model performance, drift, and health.
- Data Quality and Governance: implement data quality checks and data governance practices to ensure data accuracy, consistency, and compliance with data privacy regulations.
- MLOps (added advantage): contribute to the implementation of MLOps practices, including model deployment, monitoring, and automation of machine learning workflows.
- Documentation: maintain clear and comprehensive documentation for data engineering processes, ELK configurations, machine learning models, visualizations, and deployments.

Primary country and city: India (IN) || Bangalore
Req ID: 766745
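As a small illustration of the ELK work described above, here is a hedged sketch that indexes and queries a document with the official Elasticsearch Python client (v8-style keyword API; v7 clients use body= instead). Host, index, and fields are placeholders.

```python
from elasticsearch import Elasticsearch

# Placeholder local cluster; production would use real hosts and auth.
es = Elasticsearch("http://localhost:9200")

# Index one invented metrics document.
doc = {"service": "ml-pipeline", "status": "OK", "latency_ms": 42}
es.index(index="pipeline-metrics", document=doc)

# Refresh so the document is immediately searchable, then query it back.
es.indices.refresh(index="pipeline-metrics")
hits = es.search(index="pipeline-metrics", query={"term": {"status": "OK"}})
print(hits["hits"]["total"])
```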

Posted 1 month ago

Apply

5 - 10 years

7 - 11 Lacs

Bengaluru

Work from Office


We are looking for a skilled professional with 5 to 10 years of relevant experience to join our team as an EY Financial Accounting Advisory Services (FAAS) Manager. The ideal candidate will have a strong focus on Enterprise Performance Management (EPM) and be based in Gurugram or Bengaluru.
### Roles and Responsibility
Lead engagement scoping and plan Discovery Workshops with clients to assess the current state and perform a comprehensive needs analysis.
Identify and document current FP&A / Consolidation & Reporting processes, KPIs, risk metrics, pain points, and inefficiencies.
Engage with stakeholders to understand business requirements and objectives.
Conduct gap analysis to determine improvement or automation opportunities.
Lead the detailed Process Analysis stage, identifying redundant or manual steps that can be optimized or automated.
Coordinate with multiple teams for detailed solutioning and delivery, including redesigning processes to eliminate inefficiencies and incorporate best practices.
Evaluate FP&A / Consolidation & Reporting software and automation tools against project requirements, based on factors such as integration capabilities, scalability, user-friendliness, and cost.
Assist in designing processes on the chosen FP&A / Consolidation & Reporting tools to fit the future-state operating model; demonstrate expertise in EPM solution design across different applications and create business rules for consolidation, planning, and reporting.
Advise clients on enhancing their Planning, Budgeting, Forecasting, Financial Close, Consolidation & Reporting, and Disclosure Management processes using different EPM tools.
Conduct comprehensive cost-benefit analyses, costing assessments, and benchmarking exercises to measure the performance of EPM applications.
Develop KPIs, perform value driver analysis, and implement driver-based planning models (a brief illustrative sketch follows the posting).
Lead responses to Requests for Quotation (RFQs), prepare proposals, and actively participate in client discussions.
Develop detailed documentation and deliver training programs and workshops to enhance end-user understanding and proficiency in EPM solutions during handover.
### Job Requirements
Chartered Accountant (CA), CPA (US), ACCA (UK), or MBA with 5-10 years of relevant experience.
Strong understanding of Financial Planning and Analysis (FP&A) and/or Consolidation & Reporting processes, methodologies, and best practices.
Proven experience in leading and managing transformation initiatives, including driver-based forecasting, process automation, and strategic operating model design.
Hands-on expertise in tasks such as budgeting and forecasting, financial modelling and scenario analysis, variance analysis, and performance management and KPIs.
Proficiency in leading FP&A / Consolidation & Reporting tools (e.g., Tagetik, Lucanet, Anaplan, Adaptive Insights, Oracle Hyperion, IBM Planning Analytics), including an understanding of their features, capabilities, and limitations.
Familiarity with data management and integration processes and tools, such as extraction, transformation, and loading (ETL), data governance and data quality management, and integration with ERP systems (e.g., SAP, Oracle, Microsoft Dynamics).
Familiarity with BI and analytics processes and tools (e.g., Power BI, Tableau, Qlik), data visualization and dashboarding, and advanced analytics and data science concepts.
Experience in current-state analysis (as-is), future-state analysis (to-be), process optimization, data optimization, data analytics, and data management.
Hands-on experience preferred in compiling monthly financials (P&L, B/S, C/F) and analyzing trends in P&L items to identify issues, risks, or opportunities for the financial plan/forecast.
Project management experience, including the ability to lead and manage multiple projects simultaneously.
Strong executive presence and the ability to interface with all levels of management (EY and clients).
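For illustration only (not part of the posting): a minimal sketch of the arithmetic behind driver-based planning, rolling a base quantity and price forward with assumed growth drivers. All figures are hypothetical; EPM tools such as those listed above encode this logic as business rules rather than scripts.

```python
# Sketch: driver-based revenue forecast (units x price, per-quarter drivers).
base_units, base_price = 10_000, 12.5
unit_growth, price_growth = 0.03, 0.02  # assumed quarterly growth drivers

units, price = base_units, base_price
for quarter in range(1, 5):
    units *= 1 + unit_growth    # volume driver
    price *= 1 + price_growth   # price driver
    print(f"Q{quarter}: revenue = {units * price:,.2f}")
```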

Posted 1 month ago

Apply

5 - 10 years

25 - 30 Lacs

Gurugram

Work from Office


We are looking for a highly skilled and experienced Associate Director to join our team in Mumbai. The ideal candidate will have 5-10 years of experience in risk management, preferably in law firms or legal/ethical conflict checking.
### Roles and Responsibility
Identify and manage ethical/legal conflicts of interest at EY, providing appropriate conflict safeguards.
Act as a quality controller for conflict checks within the Conflicts Management Law CoE.
Identify training needs and facilitate required trainings for team members.
Mentor and coach team members to enhance their skills and knowledge.
Assist in resolving technical queries and escalations with senior stakeholders.
Manage operational KPIs through daily/monthly reporting.
Handle operational, financial, and strategic responsibilities and projects, including data governance, reporting, invoicing, financial planning, KPI benchmarking, headcount planning, and process efficiency.
Develop and implement an early-warning system for process KPIs and benchmarks to ensure zero-surprise delivery.
Partner in strategy formation and implementation, independently leading conversations with senior stakeholders.
Prepare SLAs and agree them with stakeholders; manage performance and appraisals for the entire team.
Collaborate cross-functionally with other departments and teams to ensure a cohesive approach.
### Job Requirements
Graduate in any discipline; law graduates are preferred.
Relevant experience of 5+ years, preferably in international or domestic law firms, legal/ethical conflict checking, or related fields.
Proficiency in MS Office, especially Excel and PowerPoint.
Strong analytical skills and a logical mindset for making informed decisions.
Excellent written and verbal communication skills, with the ability to work effectively under pressure and respond to time-sensitive projects.
Demonstrated problem-solving skills, including creativity and innovative thinking, with a drive to improve processes and work products.
Proven expertise in leading 10-15 member teams, with an excellent team attitude and strong interpersonal and leadership skills.

Posted 1 month ago

Apply

3 - 8 years

7 - 11 Lacs

Chennai

Work from Office


We are looking for a skilled Intermediate Quest One Identity Manager Developer with 3 to 8 years of experience to join our team. The ideal candidate will have expertise in custom connector development, advanced workflow configurations, and optimizing synchronization processes for large-scale identity management.
### Roles and Responsibility
Design and implement custom workflows using Designer and Object Browser for complex provisioning tasks.
Develop and maintain custom connectors for integrating with external systems using the Synchronization Editor and APIs.
Write advanced SQL stored procedures, triggers, and custom queries for data reconciliation and manipulation within One Identity's database.
Configure and optimize Job Service and DBQueue to handle high-volume job processing and resolve performance bottlenecks.
Develop complex VBScript and PowerShell scripts to implement business logic.
Implement and configure role mining and role lifecycle management processes, ensuring role compliance and SoD policy enforcement (a brief, tool-agnostic sketch follows the posting).
Extend the functionality of the Web Portal by customizing UI forms, adding new fields, and configuring specific approval workflows for access requests.
Perform advanced troubleshooting using Job Queue Info, analyzing detailed logs and debugging synchronization and provisioning failures.
Implement and maintain the attestation process, ensuring compliance through periodic certification of user roles and entitlements.
Lead efforts to implement custom reporting using SSRS or the One Identity Reporting Module to deliver access governance insights.
Integrate One Identity Manager with cloud services (e.g., Azure AD, AWS IAM) and on-premises applications using custom-developed connectors.
### Job Requirements
In-depth knowledge of Quest One Identity Manager architecture, including Application Server, Job Server, and Data Governance Edition.
Advanced SQL skills for writing stored procedures, views, and triggers.
Proficiency in VBScript and PowerShell, and knowledge of the One Identity Manager API.
Strong experience with the Synchronization Editor for developing custom connectors.
Deep understanding of Active Directory, LDAP, HR systems, Azure, and other integrated systems.
Familiarity with SoD policies, role mining, and advanced RBAC configuration.
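For illustration only (not part of the posting): a minimal, tool-agnostic sketch of the segregation-of-duties (SoD) check at the heart of role compliance: flag any user who holds every role in a conflicting set. Policy names and role sets are hypothetical; a real implementation would query the identity store (for example via the One Identity Manager API) rather than in-memory dicts.

```python
# Sketch: flag users whose role assignments violate an SoD policy.
SOD_POLICIES = [
    {"name": "Payments", "conflict": {"create_vendor", "approve_payment"}},
]

user_roles = {
    "alice": {"create_vendor", "approve_payment"},  # violates Payments policy
    "bob": {"approve_payment"},
}

def sod_violations(assignments: dict[str, set[str]]) -> list[tuple[str, str]]:
    """Return (user, policy) pairs where a user holds all conflicting roles."""
    hits = []
    for user, roles in assignments.items():
        for policy in SOD_POLICIES:
            if policy["conflict"] <= roles:  # subset test: all conflicting roles held
                hits.append((user, policy["name"]))
    return hits

print(sod_violations(user_roles))  # [('alice', 'Payments')]
```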

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Kochi

Work from Office


We are looking for a skilled professional with 3 to 8 years of experience to join our team as an EY Data Engineer in Bengaluru. The ideal candidate will have a strong background in data engineering, analytics, and reporting.
### Roles and Responsibility
Collaborate with cross-functional teams to design and implement data solutions.
Develop and maintain large-scale data pipelines using tools like Azure Data Factory and Azure Synapse.
Design and implement data models and architectures to support business intelligence and analytics.
Work with stakeholders to understand business requirements and develop solutions that meet their needs.
Ensure data quality and integrity by implementing data validation and testing procedures (a brief illustrative sketch follows the posting).
Optimize data storage and retrieval processes for improved performance and efficiency.
### Job Requirements
Strong knowledge of data modeling, architecture, and visualization techniques.
Experience with big data technologies such as Hadoop, Spark, and NoSQL databases.
Proficiency in programming languages like Python, Java, and SQL.
Excellent problem-solving skills and attention to detail.
Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
Strong understanding of data governance principles and practices.
A B.Tech/B.E. degree is required; a higher professional or master's qualification is preferred.
Active membership in related professional bodies or industry groups is preferred.
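For illustration only (not part of the posting): a minimal PySpark sketch, assuming a running Spark session, of the kind of data-validation gate a pipeline might apply before publishing a batch. Table contents, column names, and thresholds are hypothetical.

```python
# Sketch: simple data-quality checks before publishing a batch.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-demo").getOrCreate()
df = spark.createDataFrame(
    [("c1", 120.0), ("c2", None), ("c3", -5.0)],
    ["customer_id", "balance"],
)

# Count rule violations: null balances and negative balances.
null_count = df.filter(F.col("balance").isNull()).count()
negative_count = df.filter(F.col("balance") < 0).count()

report = {"nulls": null_count, "negatives": negative_count}
if any(report.values()):
    print("DQ check failed:", report)  # in a real job: raise / quarantine the batch
else:
    print("DQ check passed")
```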

Posted 1 month ago

Apply