4.0 - 6.0 years
10 - 20 Lacs
pune, greater noida
Work from Office
Coforge Ltd is hiring Data Governance Engineers. Are you passionate about data integrity, governance, and compliance? Join Coforge Ltd. as a Data Governance Engineer and help shape the future of data management across enterprise systems.

About the Role: This role involves working on data governance frameworks and ensuring compliance with data policies.

Experience: 4 to 6 years
Job Locations: Pune & Greater Noida only
For queries, reach out via WhatsApp: 9667427662
Send your CV to: Gaurav.2.Kumar@coforge.com

Responsibilities:
- Implement data governance frameworks and workflows
- Monitor data quality and ensure policy compliance
- Collaborate with data stewards and data architects
- Develop tools for metadata management and data lineage tracking
- Support audit processes and regulatory reporting

Required Skills:
- Experience with tools like Collibra, Informatica Axon, Alation, Talend
- Knowledge of data cataloging, classification, and MDM
- Familiarity with GDPR, HIPAA, CCPA, SOX
- Proficiency in SQL, Python, or Shell scripting
- Exposure to cloud platforms: AWS Glue, Azure Purview, Google Data Catalog
- Strong communication and stakeholder engagement skills
- Experience in data governance maturity assessments
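For illustration, a minimal sketch of the kind of scripted data-quality check this role describes, using pandas; the table and column names are invented and this is not a Coforge tool or process.

```python
import pandas as pd

# Hypothetical extract from a governed table; in practice this would
# come from a warehouse query or a governance tool export.
customers = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
})

def quality_report(df: pd.DataFrame, key: str) -> dict:
    """Compute simple data-quality metrics: completeness and key uniqueness."""
    return {
        "row_count": len(df),
        "null_rate": df.isna().mean().round(2).to_dict(),   # completeness per column
        "duplicate_keys": int(df[key].duplicated().sum()),  # uniqueness of the key
    }

print(quality_report(customers, key="customer_id"))
```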
Posted 1 day ago
7.0 - 12.0 years
14 - 15 Lacs
bengaluru
Hybrid
About the Role
We are seeking a Collibra Data Governance Specialist to play a pivotal role in enabling, documenting, and supporting enterprise-wide data governance adoption. This individual will focus on training, metadata management, governance enablement, and Collibra platform administration, ensuring that data assets are well-documented, trusted, and aligned with business objectives. This is an exciting opportunity for someone passionate about driving data governance maturity and empowering business and technical users with reliable, well-governed data.

Key Responsibilities

Training & Enablement
- Develop and deliver training sessions, workshops, quick reference guides, and knowledge-sharing materials to enable Stewards, Owners, and Analysts.
- Drive adoption of governance processes by making training accessible, actionable, and engaging.
- Serve as a go-to resource for governance-related questions across business and IT teams.

Process & Procedure Documentation
- Author and maintain Standard Operating Procedures (SOPs), process flows, and governance documentation.
- Ensure consistent adoption of data governance practices across domains.
- Provide guidelines and user guides to help stakeholders navigate Collibra and governance workflows effectively.

Metadata Management
- Collaborate with Data Stewards and Steward Leads to capture, document, and curate metadata.
- Maintain high-quality metadata aligned with business definitions, KPIs, and technical assets.
- Ensure metadata accurately reflects lineage, classifications, and relationships.

Collibra Administration & Cataloging
- Leverage Collibra templates to create, update, and maintain catalog entries.
- Establish and maintain relationships across assets (business terms, reports, KPIs, tables, columns).
- Support metadata certification and report governance workflows within Collibra.

Governance Enablement
- Partner with Steward Leads to enforce governance policies across domains.
- Support initiatives for data lineage, KPI/report certification, and asset relationship mapping.
- Promote governance adoption through practical enablement and communication.

Presentation & Communication
- Prepare and deliver dashboards, governance updates, and presentations for councils, working groups, and executive stakeholders.
- Clearly communicate complex governance concepts to technical and non-technical audiences.
- Represent the governance team in stakeholder meetings and cross-functional forums.

Qualifications
- Strong knowledge of data governance concepts: stewardship, cataloging, lineage, classification, and certification.
- Hands-on experience with the Collibra Data Intelligence Platform (cataloging, metadata curation, templates).
- Proven ability to create training materials and conduct training/presentations across diverse audiences.
- Excellent documentation skills for SOPs, guidelines, and process frameworks.
- Ability to collaborate effectively with Steward Leads, Data Owners, and cross-functional stakeholders.
- Strong organizational skills and attention to detail, with the ability to manage multiple priorities.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Functional Consultant in the Oracle Cloud EDM module, your role involves the following key responsibilities:
- Creating applications by selecting the appropriate adapter from the list of available EDM adapters
- Developing data chain objects such as Node Types, Hierarchy Sets, Node Sets, and Viewpoints
- Generating Maintenance Views
- Implementing Node Type converters and Compares
- Writing queries for data analysis
- Demonstrating the ability to create complex derived properties
- Managing subscriptions for efficient metadata management
- Defining business constraints and validations
- Configuring Mapping Viewpoints
- Establishing an efficient Governance Framework
- Performing bulk metadata updates using custom scripts
- Configuring Users, User Groups, access provisioning, and the Security framework
- Integrating EDM with various target systems
- Managing environment synchronization, code migration, and release management

In addition to the above responsibilities, you should be able to:
- Understand the impact of changes made to metadata and conduct impact analysis
- Take necessary steps to enhance performance
- Conduct integration using EPM Automate

The technical requirements for this role include proficiency in:
- Oracle EDM (Enterprise Data Management)
- Understanding of overall EPM processes

Please note that the preferred skill set for this position includes expertise in Oracle Cloud's EPM - Enterprise Data Management technology.
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
noida, uttar pradesh
On-site
You will be empowered to shape your career at Capgemini in the way you desire, supported and inspired by a collaborative community of colleagues worldwide, and have the opportunity to reimagine what is achievable. Join Capgemini to assist the world's leading organizations in unlocking technology's value and constructing a more sustainable, inclusive world.

**Role Overview:**
As a Procurement Customer Support Specialist, you will play a hands-on role in supporting Coupa system users across the organization. Your responsibilities will include assisting in onboarding new Coupa users, updating information in existing user profiles, and addressing Coupa access-related support tickets. You will also be tasked with detailing Coupa workflows to internal stakeholders and determining appropriate system access levels based on individual user roles and responsibilities. A comprehensive understanding of Coupa workflows is essential to ensure accurate access provisioning and effective user support.

**Key Responsibilities:**
- Provision and manage Coupa user access, which includes setup, role assignments, catalog access, and approver changes.
- Identify the appropriate level of system access based on individual user roles and responsibilities.
- Create, update, and deactivate user profiles using Excel-based data entry and exports.
- Extract and analyze data from Coupa to generate reports and insights for management.
- Maintain metadata for contracts, purchase requisitions, and purchase orders in Coupa.
- Willingness to work in shifts as part of a new client engagement.
- Immediate joiners or candidates available within a maximum of 30 days.
- Must be available to work from the office all 5 days in line with client expectations.

Capgemini is a global business and technology transformation partner, committed to accelerating organizations' dual transition to a digital and sustainable world, creating tangible impact for enterprises and society. With a diverse team of 340,000 members in over 50 countries, Capgemini leverages its over 55-year heritage to unlock technology's value for clients, addressing all aspects of their business needs. The company delivers end-to-end services and solutions, incorporating strengths from strategy and design to engineering, supported by market-leading capabilities in AI, generative AI, cloud, and data, alongside deep industry expertise and a strong partner ecosystem.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
As a Data Science Programs Lead Associate Director at EY, you will lead the Data Engineering and Data Science pillars within the Global Analytics team in the Global Finance function. Your role will involve scoping, exploring, and delivering advanced data science and machine learning projects to drive growth and create efficiencies for the firm. You will work closely with the Global Analytics team lead and other pillar leads to ensure that the best data- and analytics-driven insights are used in strategic and operational decision-making across EY.

Your key responsibilities will include:
- Supporting and collaborating with the team to deliver advanced data science and ML projects
- Using agile best practices to manage team priorities and workflow across multiple projects
- Ensuring high standards are maintained in your team's work, including validation, testing, and release management
- Being a thought leader and driving innovation across the team
- Coaching and developing the team to ensure they have the right skills to succeed
- Communicating developments to stakeholders in a clear and relevant manner

To qualify for this role, you must have:
- An excellent track record in leading teams to develop data science and machine learning solutions
- The ability to build trust with key stakeholders and explain analysis in a visual story
- Experience in proactive innovation and creating new solutions to meet business requirements
- Strong experience in creating and managing automated ETL processes for machine learning pipelines
- Practical experience in performing exploratory analytics and creating data science pipelines using Python and SQL

Ideally, you will also have experience with graph databases, metadata management, LLMs using vectorized and structured data, and Power BI.

At EY, you will have the opportunity to work in an inclusive environment that values flexible working arrangements. You will be rewarded with a competitive remuneration package and comprehensive Total Rewards, including support for career development and flexible working. EY offers support, coaching, and feedback from engaging colleagues, opportunities to develop new skills, and the freedom to handle your role in a way that suits you.

EY is committed to building a better working world by creating long-term value for clients, people, and society. Through data and technology, EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across various sectors.
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Analyst in the Solution Design team at Barclays, your role involves supporting the organization in defining and designing technology and business solutions to meet organizational goals. This includes requirements gathering, data analysis, data architecture, system integration, and delivering scalable, high-quality designs aligned with both business and technical needs.

Key Responsibilities:
- Deliver large-scale change in complex environments, acting as a thought leader in requirements documentation and workshop facilitation to gather, clarify, and communicate business needs effectively.
- Utilize strong data analysis and data modeling skills to perform data validations and anomaly detection, and make sense of large volumes of data to support decision-making.
- Demonstrate advanced SQL proficiency for querying, joining, and transforming data to extract actionable insights, along with experience in data visualization tools such as Tableau, Qlik, and Business Objects.
- Act as an effective communicator, translating complex technical concepts into clear, accessible language for diverse audiences, and liaising between business stakeholders and technical teams to achieve a mutual understanding of data interpretations, requirements definition, and solution designs.
- Apply experience in banking and financial services, particularly in wholesale credit risk, and implement data governance standards including metadata management, lineage, and stewardship.

Qualifications Required:
- Experience in Python data analysis and associated visualization tools.
- Familiarity with external data vendors for sourcing and integrating company financials and third-party datasets.
- Experience with wholesale credit risk internal ratings-based (IRB) models and regulatory frameworks.

In this role, based in Chennai/Pune, you will be responsible for implementing data quality processes and procedures to ensure reliable and trustworthy data. Your tasks will include investigating and analyzing data issues related to quality, lineage, controls, and authoritative source identification; executing data cleansing and transformation tasks; designing and building data pipelines; and applying advanced analytical techniques like machine learning and AI to solve complex business problems. Additionally, you will document data quality findings and recommendations for improvement.

As a Vice President, you are expected to contribute to strategy, drive requirements, and make recommendations for change. You will manage resources, budgets, and policies, deliver continuous improvements, and escalate breaches of policies and procedures. If you have leadership responsibilities, you will demonstrate leadership behaviours focused on creating an environment for colleagues to thrive and deliver to an excellent standard.

All colleagues at Barclays are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behavior.
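As a hedged illustration of the data validation and anomaly detection work this kind of role describes, a minimal z-score outlier check in Python with pandas; the column name, figures, and threshold are invented, not Barclays data or methodology.

```python
import pandas as pd

# Invented exposure figures; in practice these would be queried via SQL.
exposures = pd.DataFrame({"exposure_musd": [1.2, 0.9, 1.1, 14.5, 1.0, 1.3]})

def flag_outliers(s: pd.Series, z_threshold: float = 3.0) -> pd.Series:
    """Flag values whose absolute z-score exceeds the threshold."""
    z = (s - s.mean()) / s.std(ddof=0)
    return z.abs() > z_threshold

exposures["is_anomaly"] = flag_outliers(exposures["exposure_musd"], z_threshold=2.0)
print(exposures)  # the 14.5 row is flagged as a potential anomaly
```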
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview:
ZS is seeking a motivated and detail-oriented Information Protection Analyst to join the Data Governance team. In this role, you will be responsible for supporting the implementation and ongoing management of the data governance framework. Your primary focus will be on operationalizing governance policies, maintaining data quality, supporting data classification and discovery efforts, and ensuring that data is well-managed, secure, and compliant. As a Data Governance Analyst, you will play a crucial role in data stewardship, monitoring governance processes, and assisting with tool administration to promote trusted and governed data across the organization.

Key Responsibilities:
- Support the implementation and enforcement of data governance policies, standards, and procedures.
- Assist in data discovery, classification, and cataloging efforts using data governance and DSPM tools.
- Monitor and report on data quality metrics, compliance adherence, and data stewardship activities.
- Maintain and update data inventories, metadata repositories, and lineage documentation.
- Collaborate with data owners, stewards, and IT teams to address data governance issues and remediation.
- Participate in data risk assessments, control testing, and regulatory compliance audits.
- Manage governance tool configurations, user access, and workflow automation.
- Provide training and support to data steward teams and end-users on governance principles and tool usage.
- Prepare regular data governance status reports and dashboards for leadership.

Qualifications Required:
- Bachelor's degree in Data Management, Computer Science, Information Systems, Business, or a related field.
- 2+ years of experience in data governance, data quality, data management, or related operational roles.
- Familiarity with data governance frameworks, data cataloging, or metadata management concepts.
- Hands-on experience with data governance, catalog, or data quality tools (e.g., Collibra, Informatica, Alation, Talend, BigID).
- Understanding of data privacy regulations (such as GDPR, CCPA, HIPAA) and data security basics.
- Strong analytical skills with attention to detail and problem-solving capabilities.
- Excellent communication skills with the ability to work collaboratively across teams.
- Familiarity with cloud platforms (AWS, Azure, GCP) and data architecture is a plus.

Additional Company Details:
ZS offers a comprehensive total rewards package that includes health and well-being, financial planning, annual leave, personal growth, and professional development. The company's skills development programs, career progression options, internal mobility paths, and collaborative culture empower individuals to thrive within the organization and as part of a global team. ZS promotes a flexible and connected way of working, allowing employees to combine work from home and on-site presence at clients/ZS offices for the majority of the week. The company values face-to-face connections for fostering innovation and maintaining its culture. If you are eager to grow, contribute, and bring your unique self to the work environment at ZS, they encourage you to apply. ZS is an equal opportunity employer committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Collibra Specialist at NTT DATA, your main responsibility will be to produce data mapping documents, import various glossaries and CDEs into Collibra, establish lineage from Glossary to CDM to LDM, and configure lineage visualizations, glossary workflows, and governance processes in Collibra.

Key Responsibilities:
- Produce data mapping documents, including Glossary, CDM, and LDM
- Import the Business Glossary, sub-domain glossaries, and CDEs into Collibra
- Import mapping documents into Collibra and establish lineage across Glossary, CDM, and LDM
- Configure lineage visualizations, glossary workflows, and governance processes in Collibra

Qualifications Required:
- Minimum 5-7 years of experience in data governance/metadata management
- At least 3 years of hands-on experience with Collibra implementation (glossary, lineage, workflows, metadata ingestion)
- Proficiency in metadata ingestion and mapping automation
- Ability to script/transform mapping templates into Collibra-ingestable formats
- Knowledge of ERWin/Foundry integration with Collibra
- Strong analytical and problem-solving skills to support lineage accuracy

Please note that you will be required to be available up to 1:30am IST for shift timings.

NTT DATA is a $30 billion trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Collibra Specialist, you will be part of a diverse team of experts across more than 50 countries, working towards helping clients innovate, optimize, and transform for long-term success. NTT DATA is committed to investing in research and development to support organizations and society in confidently moving into the digital future. Visit us at us.nttdata.com.
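As a hedged sketch of what "transforming mapping templates into Collibra-ingestable formats" can look like in practice, a short Python script that reshapes a hypothetical Excel mapping document into a flat CSV for bulk import; the input file, column names, and output layout are assumptions, not Collibra's actual import schema.

```python
import pandas as pd

# Hypothetical mapping document (requires openpyxl for .xlsx):
# columns assumed to be business_term, schema, table, column.
mapping = pd.read_excel("glossary_to_ldm_mapping.xlsx")

# Reshape into one row per glossary-to-column relation for bulk import.
# "Name", "Domain", and the relation label are assumed template fields.
out = pd.DataFrame({
    "Name": mapping["business_term"],
    "Domain": "Business Glossary",
    "Relation: is represented by": (
        mapping["schema"] + "." + mapping["table"] + "." + mapping["column"]
    ),
})
out.to_csv("collibra_import.csv", index=False)
print(f"Wrote {len(out)} glossary-to-column mappings")
```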
Posted 2 days ago
5.0 - 10.0 years
9 - 13 Lacs
hyderabad
Work from Office
Overview
As an Analyst, Data Modeler, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analysing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse to satisfy project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in-transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- Bachelor's degree required in Computer Science, Data Management/Analytics/Science, Information Systems, Software Engineering, or a related technology discipline.
- 5+ years of overall technology experience that includes at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience in building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
- Excellent verbal and written communication and collaboration skills.
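As a hedged illustration of the profiling step mentioned above, a minimal pandas-based column profile; real projects would typically use a dedicated tool such as Deequ or Great Expectations, and the sample data here is invented.

```python
import pandas as pd

# Invented sample; in practice this would be an extract from a source system.
orders = pd.DataFrame({
    "order_id": [101, 102, 103, 103],
    "region": ["NA", "EU", None, "EU"],
    "amount": [250.0, 99.5, 180.0, 180.0],
})

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column profile: dtype, null count, and distinct count."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "nulls": df.isna().sum(),
        "distinct": df.nunique(),
    })

print(profile(orders))
```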
Posted 2 days ago
7.0 - 12.0 years
16 - 20 Lacs
bengaluru
Work from Office
Notice Period: 30-60 days

Project Overview:
We are seeking a seasoned AI Architect with strong experience in Generative AI and Large Language Models (LLMs), including OpenAI, Claude, and Gemini, to lead the design, orchestration, and deployment of intelligent solutions across complex use cases. You will architect conversational systems, feedback loops, and LLM pipelines with robust data governance, leveraging the Databricks platform and Unity Catalog for enterprise-scale scalability, lineage, and compliance.

Role Scope / Deliverables:
- Architect end-to-end LLM solutions for chatbot applications, semantic search, summarization, and domain-specific assistants.
- Design modular, scalable LLM workflows including prompt orchestration, RAG (retrieval-augmented generation), vector store integration, and real-time inference pipelines.
- Leverage Databricks Unity Catalog for:
  - Centralized governance of AI training and inference datasets
  - Managing metadata, lineage, access controls, and audit trails
  - Cataloging feature tables, vector embeddings, and model artifacts
- Collaborate with data engineers and platform teams to ingest, transform, and catalog datasets used for fine-tuning and prompt optimization.
- Integrate feedback loop systems (e.g., user input, signal-driven reinforcement, RLHF) to continuously refine LLM performance.
- Optimize model performance, latency, and cost using a combination of fine-tuning, prompt engineering, model selection, and token usage management.
- Oversee secure deployment of models in production, including access control, auditability, and compliance alignment via Unity Catalog.
- Guide teams on data quality, discoverability, and responsible AI practices in LLM usage.

Key Skills:
- 7+ years in AI/ML solution architecture, with 2+ years focused on LLMs and Generative AI.
- Strong experience working with OpenAI (GPT-4/o), Claude, and Gemini, and integrating LLM APIs into enterprise systems.
- Proficiency in Databricks, including Unity Catalog, Delta Lake, MLflow, and cluster orchestration.
- Deep understanding of data governance, metadata management, and data lineage in large-scale environments.
- Hands-on experience with chatbot frameworks, LLM orchestration tools (LangChain, LlamaIndex), and vector databases (e.g., FAISS, Weaviate, Pinecone).
- Strong Python development skills, including notebooks, REST APIs, and LLM orchestration pipelines.
- Ability to map business problems to AI solutions, with strong architectural thinking and stakeholder communication.
- Familiarity with feedback loops and continuous learning patterns (e.g., RLHF, user scoring, prompt iteration).
- Experience deploying models in cloud-native and hybrid environments (AWS, Azure, or GCP).

Preferred Qualifications:
- Experience fine-tuning or optimizing open-source LLMs (e.g., LLaMA, Mistral) with tools like LoRA/QLoRA.
- Knowledge of compliance requirements (HIPAA, GDPR, SOC 2) in AI systems.
- Prior work building secure, governed LLM applications in highly regulated industries.
- Background in data cataloging, enterprise metadata management, or ML model registries.
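As a hedged sketch of the RAG retrieval step named above, a minimal FAISS nearest-neighbour lookup in Python; the embedding function is a stand-in (a real pipeline would call an embedding model), and the documents are invented.

```python
import numpy as np
import faiss  # pip install faiss-cpu

DIM = 64
docs = ["refund policy", "shipping times", "warranty terms"]

def embed(texts: list[str]) -> np.ndarray:
    """Stand-in embedding: pseudo-random vectors seeded from the text
    (stable within a run). A real pipeline would call an embedding model."""
    rngs = [np.random.default_rng(abs(hash(t)) % (2**32)) for t in texts]
    vecs = np.stack([r.standard_normal(DIM) for r in rngs]).astype("float32")
    faiss.normalize_L2(vecs)  # normalize so inner product = cosine similarity
    return vecs

index = faiss.IndexFlatIP(DIM)
index.add(embed(docs))

query = embed(["how long does delivery take"])
scores, ids = index.search(query, k=2)  # top-2 nearest documents
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.2f}  {docs[i]}")
```

The retrieved passages would then be placed into the LLM prompt as context, which is the core of the RAG pattern this posting refers to.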
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As an Oracle Content Management Administrator at AVASO Technology, you will play a crucial role in managing and supporting Oracle Content Management environments. Your responsibilities will include leading implementation, configuration, and support for Oracle Content Management or similar ECM platforms. You will define content models, metadata structures, and taxonomies as per business needs; manage user roles, groups, permissions, and access control policies; and design and implement content migration strategies. Additionally, you will integrate ECM with other systems using REST APIs or custom scripts, ensure platform compliance with organizational standards, troubleshoot and resolve issues related to content publishing and system performance, and provide guidance and training to end-users and junior administrators.

Key Responsibilities:
- Lead implementation, configuration, and support for Oracle Content Management (OCM) or similar ECM platforms.
- Define content models, metadata structures, and taxonomies as per business needs.
- Manage user roles, groups, permissions, and access control policies.
- Design and implement content migration strategies (dev/UAT/production).
- Integrate ECM with other systems using REST APIs or custom scripts.
- Ensure ECM platform compliance with organizational standards (GDPR, ISO, etc.).
- Troubleshoot and resolve issues related to content publishing, access, and system performance.
- Provide guidance and training to end-users and junior administrators.
- Collaborate with development teams for headless CMS setups and custom UI components if required.
- Manage document security (roles/permissions).
- Support metadata updates.
- Assist with content check-in/check-out issues.
- Provide integration support for applications and users.
- Manage incidents/service requests through ITSM tools (e.g., ServiceNow).
- Coordinate with Middleware/OHS/Content Management teams regarding vulnerability and patch updates from Oracle.
- Perform administrative activities like patching, updating, and purging (Linux & Windows OS).

Required Skills & Qualifications:
- Strong experience with Oracle Content Management (OCM).
- Expertise in content modeling, metadata, workflows, and digital asset management.
- Knowledge of REST API integration.
- Understanding of compliance standards.
- Familiarity with Oracle Cloud Infrastructure (OCI) services (preferred).
- Good problem-solving and communication skills.

Mandatory Skills:
- Oracle Universal Document Management (UDM)
- Oracle Web Content Management (WCM)
- Oracle Universal Record Management (URM)
- Oracle Universal Content Management (UCM)

Why AVASO Technology:
Join a dynamic and innovative team with a global presence. You will have opportunities for career growth and continuous learning, along with a competitive salary and benefits package. You will work with cutting-edge technologies to shape the future of IT solutions.

Ready to take your career to the next level? Apply now by clicking the "Easy Apply" button or send your resume to isha.pathak@avasotech.com. (Please do mention the location you're applying for.)
Posted 3 days ago
10.0 - 14.0 years
12 - 22 Lacs
jaipur
Work from Office
Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

Location: Jaipur
Experience in government projects is mandatory.

Minimum Experience and Qualifications:
1. BE/MCA/ME degree in Computer Science, Information Technology, or a related field.
2. Proven minimum 10 years of work experience in MDM, data quality, and data governance, with a deep understanding of best practices and industry standards.
3. Strong knowledge of data governance frameworks, concepts, and principles.
4. Experience with MDM solutions and data quality tools such as Informatica PowerCenter/Informatica Data Quality (IDQ) or similar platforms.
5. Proficiency in data analysis, data profiling, and data cleansing techniques.
6. Excellent communication skills and the ability to collaborate with cross-functional teams.
7. Strong problem-solving skills and the ability to identify, analyze, and resolve complex data issues.
8. Familiarity with regulatory compliance standards (e.g., GDPR, HIPAA, etc.) related to data governance and quality.

Roles and Responsibilities:
1. Develop, implement, and manage Master Data Management (MDM) strategies and initiatives to ensure consistent and accurate data across the organization.
2. Establish and enforce data governance policies, procedures, and standards to maintain data integrity and quality.
3. Collaborate with stakeholders to define data quality standards, guidelines, and metrics, and implement data quality improvements.
4. Design and implement data quality assessment and profiling processes, resolving data issues and ensuring compliance with regulatory requirements.
5. Lead efforts to identify, analyze, and resolve data quality issues within various data sources and systems.
6. Monitor and audit data quality, conducting regular quality checks and assessments to ensure adherence to standards.
7. Collaborate with cross-functional teams to align MDM, data quality, and governance initiatives with business objectives.
8. Provide guidance and support to internal teams in understanding and adhering to data governance and quality standards.
9. Evaluate, select, and implement data quality tools and technologies to support data governance and MDM initiatives.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone's welcome, entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.
Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
Posted 3 days ago
5.0 - 10.0 years
25 - 27 Lacs
hyderabad, pune, bengaluru
Hybrid
Role: Data Catalog and Governance - Collibra Specialist
Experience: 5+ years
Location: All EXL locations
Key Skills: Collibra, Metadata Management, Data Governance, SQL

Job Qualifications

Required Skills & Experience:
- Hands-on experience with the Collibra Data Intelligence Platform, including:
  - Metadata ingestion
  - Data lineage stitching
  - Workflow configuration and customization
- Strong understanding of metadata management and data governance principles
- Experience working with the following data sources/tools:
  - Teradata (BTEQ, MLOAD)
  - Tableau
  - QlikView
  - IBM DataStage
  - Informatica
- Ability to interpret and map technical metadata from ETL tools, BI platforms, and databases into Collibra
- Familiarity with data lineage concepts, including horizontal lineage across systems
- Proficiency in SQL and scripting for metadata extraction and transformation
- Excellent communication skills to collaborate with data stewards, engineers, and business stakeholders

Preferred Qualifications:
- Experience with Collibra APIs or connectors
- Knowledge of data governance frameworks (e.g., DAMA-DMBOK)
- Prior experience in a regulated industry (e.g., finance, healthcare)
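As a hedged sketch of the horizontal-lineage idea mentioned above, a minimal Python example that models cross-system lineage as a directed graph and walks it upstream; the asset names are invented and this is not Collibra's lineage API.

```python
import networkx as nx  # pip install networkx

# Invented assets: source table -> ETL job -> warehouse table -> BI report.
lineage = nx.DiGraph()
lineage.add_edge("teradata.sales_raw", "datastage.job_load_sales")
lineage.add_edge("datastage.job_load_sales", "edw.fact_sales")
lineage.add_edge("edw.fact_sales", "tableau.sales_dashboard")

# Horizontal lineage for the report: every upstream asset across systems.
upstream = nx.ancestors(lineage, "tableau.sales_dashboard")
print(sorted(upstream))
# ['datastage.job_load_sales', 'edw.fact_sales', 'teradata.sales_raw']
```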
Posted 3 days ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Governance Architect at Straive, you will play a crucial role in defining and leading enterprise-wide data governance strategies, architecture, and implementation. Your expertise in tools like Informatica EDC/AXON, Collibra, Alation, MHUB, and other leading data governance platforms will be essential in ensuring data quality, consistency, and accessibility across various enterprise platforms and business units.

**Key Responsibilities:**
- Design, develop, and maintain enterprise-wide data governance architecture frameworks and metadata models.
- Establish data governance strategies, policies, standards, and procedures for compliance processes.
- Conduct maturity assessments and lead change management efforts.
- Evaluate and recommend data governance frameworks and tools to meet enterprise business needs.
- Design and implement architectural patterns for data catalog, data quality, metadata management, data lineage, data security, and master data management across various data platforms.
- Create and manage data dictionaries, metadata repositories, and data catalogs.
- Architect technical and business metadata workflows and govern glossary approvals and workflows.
- Validate end-to-end lineage across multiple sources and targets.
- Design and enforce rules for classification, access, retention, and data sharing.
- Analyze and define the enterprise business KPIs and validate data governance requirements.
- Collaborate with Data Stewards to define technical specifications for data quality rules, validation checks, and KPI reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Ensure data compliance with relevant regulations like GDPR, HIPAA, CCPA, SOX, etc.
- Demonstrate excellent communication skills and the ability to mentor and inspire teams.

**Qualifications Required:**
- Bachelor's/master's degree in information systems, computer science, or a related technical field.
- Strong knowledge of data governance and architecture techniques and methodologies, with experience in data governance initiatives.
- Minimum of 7 years of experience in data governance architecture and implementation across business enterprises.
- Hands-on experience in designing and implementing architectural patterns for data quality, metadata management, data lineage, data security, and master data management.
- Proficiency in Collibra workflows, APIs, metadata integration, and policy automation.
- Experience with ETL/ELT pipelines, data lineage capture, and data integration tools.
- Familiarity with data modeling (conceptual, logical, physical).
- Proficiency in SQL and Python/Java for integration and automation.
- Experience with back-end scripting, APIs, and working with cloud platforms (AWS, Azure, or GCP).
- Knowledge of big data technologies (Hadoop/Spark, etc.) and data visualization and BI tools is a plus.
- Strong analytical and problem-solving skills.

Join Straive, a market-leading Content and Data Technology company, where you can leverage your expertise to shape data governance strategies and drive enterprise-wide data governance initiatives.
Posted 4 days ago
3.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
Role Overview:
As an IDMC Architect at our company, you will play a crucial role in leading the design, implementation, and optimization of data integration and management solutions using Informatica's Intelligent Data Management Cloud (IDMC). Your responsibilities will be pivotal in driving our enterprise-wide data strategy and ensuring scalability, performance, and governance across cloud and hybrid environments.

Key Responsibilities:
- Design end-to-end data integration architectures leveraging IDMC capabilities such as Data Integration, Data Quality, Data Governance, and API Management.
- Define and implement best practices for IDMC deployment, scalability, and performance tuning across multi-cloud environments.
- Collaborate closely with business analysts, data engineers, and enterprise architects to translate business requirements into technical solutions.
- Ensure compliance with data governance, privacy, and security standards across all IDMC implementations.
- Mentor development teams, review code and configurations, and guide troubleshooting efforts.
- Continuously evaluate new IDMC features and recommend enhancements to improve data workflows and reduce latency.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 8+ years of experience in data architecture, with at least 3 years in IDMC or Informatica Cloud.
- Strong expertise in cloud platforms (AWS, Azure, GCP) and hybrid data ecosystems.
- Proficiency in REST APIs, SQL, ETL/ELT pipelines, and metadata management.
- Experience with Informatica Axon, EDC, and Data Quality tools is a plus.
- Excellent communication and documentation skills.

Company Details:
Our company offers competitive compensation and benefits, opportunities for professional growth and certification, and fosters a collaborative and innovation-driven culture.
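As a hedged illustration of the REST-API skill set listed above, a minimal Python call to a metadata endpoint; the URL, token, query parameter, and response shape are hypothetical placeholders, not Informatica's actual IDMC API.

```python
import requests

BASE_URL = "https://example-idmc-host/api/v2"   # hypothetical endpoint
TOKEN = "REPLACE_ME"                            # hypothetical session token

def list_assets(asset_type: str) -> list[dict]:
    """Fetch catalog assets of a given type from a hypothetical metadata API."""
    resp = requests.get(
        f"{BASE_URL}/catalog/assets",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"type": asset_type},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["items"]  # assumed response envelope

for asset in list_assets("table"):
    print(asset.get("name"))
```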
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview:
At 66degrees, you will be responsible for owning the end-to-end design of modern data platforms on Microsoft Azure. You will provide architectural leadership, guide the data engineering team in building secure and scalable data platforms, and turn raw data into analytics-ready assets. Additionally, you will act as a liaison between business and technology stakeholders to define data strategy, standards, and governance while optimizing cost, performance, and compliance across the Azure ecosystem.

Key Responsibilities:
- Design and document data architectures on Azure Synapse Analytics, Data Lake Storage Gen2, Microsoft Fabric, and Cosmos DB
- Lead migration of on-premises workloads to Azure with appropriate IaaS, PaaS, or SaaS solutions, and ensure right-sizing for cost and performance
- Guide development of data pipelines using Azure Data Factory, Synapse Pipelines, and dbt, and ensure orchestration, monitoring, and CI/CD via Azure DevOps
- Model conceptual, logical, and physical data structures; enforce naming standards, data lineage, and master data management practices
- Implement robust security, data privacy, and regulatory controls such as GDPR or HIPAA
- Define data governance policies, metadata management, and catalog strategies using Microsoft Purview or equivalent tools
- Provide technical leadership to data engineers, analysts, and BI developers; lead code/design review meetings and mentor on Azure best practices
- Collaborate with enterprise architects, product owners, and business SMEs to translate analytical use cases into scalable cloud data designs and a feature roadmap
- Establish patterns to monitor platform health and automate cost optimization and capacity planning via Azure features

Qualifications Required:
- Proven experience in designing and implementing data architectures on Microsoft Azure
- Strong expertise in Azure services such as Synapse Analytics, Data Lake Storage Gen2, Data Factory, and Azure DevOps
- Knowledge of security best practices, data privacy regulations, and data governance principles
- Experience in leading migration projects to Azure and optimizing cost and performance
- Excellent communication skills and the ability to collaborate with cross-functional teams

Please note that this job is a permanent position with benefits including a flexible schedule and health insurance. The work location is remote.
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a highly skilled NetSuite Subject Matter Expert at GlobalLogic, your primary role will involve providing advanced integration guidance, environment management, and issue resolution for technical teams. Leveraging your deep expertise in NetSuite ERP and its APIs, you will ensure robust platform performance, faster root cause analysis, and an improved customer experience.

**Key Responsibilities:**
- Configure, maintain, and optimize NetSuite sandbox and test environments.
- Troubleshoot, replicate, and resolve customer-reported issues related to connectors and APIs.
- Manage token-based authentication (TBA), including roles and permission schemes.
- Validate connector/API compatibility across various NetSuite versions and WSDL updates.
- Partner with engineering and support teams to reduce ticket recurrence and improve platform stability.
- Develop technical playbooks, RCA documentation, and support knowledge base content.
- Advise on NetSuite integration best practices and ensure alignment with enterprise needs.

**Qualifications Required:**
- 5+ years of practical experience working with NetSuite ERP and CRM platforms.
- Proficient in SuiteTalk, SuiteScript, and NetSuite connector technologies.
- Solid expertise in API integration, metadata management, and debugging tools.
- Demonstrated ability to manage token-based authentication and security roles within NetSuite.
- Previous experience supporting enterprise or SaaS-based customers is highly desirable.
- Strong communicator with a proven track record of effective cross-functional collaboration.
- Must possess a graduation degree in any discipline (provisional certificate & consolidated marks memo required).
- Must be willing to work from the office and in rotational shifts (5 working days & 2 week-offs).
- Must be ready to join immediately.

At GlobalLogic, you will experience a culture of caring that prioritizes people first, fostering an inclusive environment of acceptance and belonging. You will have opportunities for continuous learning and development, engaging in interesting and meaningful work that makes an impact. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a healthy work-life balance. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with clients to transform businesses through intelligent products, platforms, and services. As part of the team, you will have the privilege of working on cutting-edge projects and solutions that shape the world today.
Posted 4 days ago
14.0 - 18.0 years
0 Lacs
pune, maharashtra
On-site
As an Applications Development Intermediate Programmer Analyst at our company, your role will involve participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your contribution to applications systems analysis and programming activities will be crucial.

**Key Responsibilities:**
- Develop and maintain applications for complicated enterprise data lineage
- Optimize industry-based tools to simplify enterprise-level data complexity via data lineage
- Debug and resolve graph-related issues
- Collaborate on designing and implementing new features to simplify complex problems
- Conduct code reviews for quality assurance
- Write and maintain documentation for functionalities and APIs
- Integrate and validate third-party libraries and tools
- Manage source code using version control systems
- Implement algorithms for code generation and optimization
- Perform code refactoring for better maintainability and efficiency
- Stay updated with advancements in data lineage technology
- Profile and benchmark compiler performance on various platforms
- Develop automated testing and verification of the code base and functionality
- Provide technical support to teams using technical expertise
- Analyze performance metrics to identify areas for improvement
- Participate in design and architecture discussions
- Use static and dynamic analysis tools to improve code quality
- Collaborate with cross-functional teams
- Research new techniques and methodologies
- Contribute to and engage with open-source compiler projects

**Qualifications Required:**
- At least 14 years of Ab Initio Metadata Hub application development experience
- Strong understanding of data lineage, metadata management, reference data development, and data analytics
- Good knowledge of relational databases like Oracle, SQL/PLSQL
- Strong knowledge in one or more of the following areas: data lineage, application development, Python or Java coding
- Hands-on experience with a coding language and prior tool-based configuration experience
- Full Software Development Kit (SDK) development cycle experience
- Pragmatic problem-solving and the ability to work independently or as part of a team
- Proficiency in Ab Initio Metadata Hub or Python programming
- Proficiency with one or more of the following programming languages: Java, APIs, Python
- A passion for development, a strong work ethic, and continuous learning
- Experience with code optimization techniques for different hardware architectures

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi.
Posted 4 days ago
15.0 - 20.0 years
10 - 14 Lacs
pune
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum Experience Required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Strong understanding of data integration and data quality management.
- Experience with data modeling and metadata management.
- Ability to troubleshoot and resolve technical issues related to MDM.
- Familiarity with ETL processes and data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica MDM.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 4 days ago
1.0 - 5.0 years
0 Lacs
delhi
On-site
Apply Digital is a global experience transformation partner, driving AI-powered change and measurable impact across complex, multi-brand ecosystems. With expertise spanning the customer experience lifecycle from strategy and design to engineering and beyond, Apply Digital enables clients to modernize organizations and maximize value for businesses and customers. The 750+ team members have successfully transformed global companies like Kraft Heinz, NFL, Moderna, Lululemon, Dropbox, Atlassian, A+E Networks, and The Very Group. Founded in 2016 in Vancouver, Canada, Apply Digital has expanded to ten cities across North America, South America, the UK, Europe, and India.

The company operates with a "One Team" approach, utilizing a pod structure comprising senior leadership, subject matter experts, and cross-functional skill sets within a common tech and delivery framework. Applying well-oiled scrum and sprint cadences, teams release often and conduct retrospectives to ensure progress towards desired outcomes. Apply Digital fosters a safe, empowered, respectful, and fun community worldwide, embodying SHAPE values: smart, humble, active, positive, and excellent. The company offers remote work options and prefers candidates based in or near the Delhi/NCR region of India, working in hours overlapping with the Eastern Standard Time zone (EST).

As a Content Author at Apply Digital, you will create, structure, and manage digital content within CMS platforms like Contentful, ensuring accuracy, organization, and optimization for digital experiences. Collaborating with marketing, design, and development teams, you will review peer work, troubleshoot formatting issues, and refine workflows for consistency. Staying updated on content management best practices, accessibility standards, and digital publishing trends is crucial. The ideal candidate possesses strong English proficiency, 1-3 years of relevant work experience, a background in Humanities or Media with excellent writing skills, attention to detail, and problem-solving abilities. Proficiency in Contentful, Contentstack, or similar platforms is required, along with the ability to work with remote teams and clients across North America and Latin America.

Apply Digital offers a hybrid-friendly work environment with remote options, private healthcare coverage, contributions to a Provident Fund, a gratuity bonus, a flexible PTO policy, engaging projects with international brands, and a commitment to building an inclusive and safe workplace. Learning opportunities include training budgets, tech certifications, custom learning plans, workshops, mentorship, and peer support. Apply Digital values equal opportunity and nurtures an inclusive workplace celebrating individual differences. For more details, visit Apply Digital's Diversity, Equity, and Inclusion (DEI) page or email careers@applydigital.com for special needs or accommodations during the recruitment process.
Posted 5 days ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As a Data Ops Capability Deployment Analyst at Citi, you will be a seasoned professional contributing to the development of new solutions and techniques for the Enterprise Data function. Your role involves performing data analytics and analysis across various asset classes, as well as building data science capabilities within the team. You will collaborate closely with the wider Enterprise Data team to deliver on business priorities.

Working within the B & I Data Capabilities team, you will be responsible for managing the Data Quality/Metrics/Controls program and implementing improved data governance and management practices. This program focuses on enhancing Citi's approach to data risk and meeting regulatory commitments in this area.

Key Responsibilities:
- Hands-on experience with data engineering and a strong understanding of distributed data platforms and cloud services.
- Knowledge of data architecture and integration with enterprise applications.
- Research and assess new data technologies and self-service data platforms.
- Collaborate with the Enterprise Architecture team on refining the overall data strategy.
- Address performance bottlenecks, design batch orchestrations, and deliver reporting capabilities.
- Perform complex data analytics on large datasets, including data cleansing, transformation, joins, and aggregation.
- Build analytics dashboards and data science capabilities for Enterprise Data platforms.
- Communicate findings and propose solutions to stakeholders.
- Translate business requirements into technical design documents.
- Collaborate with cross-functional teams for testing and implementation.
- Understanding of banking industry requirements.
- Other duties and functions as assigned.

Skills & Qualifications:
- 10+ years of development experience in Financial Services or Finance IT.
- Experience with data quality/data tracing/data lineage/metadata management tools.
- Hands-on experience with ETL using PySpark, data ingestion, Spark optimization, and batch orchestration.
- Proficiency in Hive, HDFS, Airflow, and job scheduling.
- Strong programming skills in Python with data manipulation and analysis libraries.
- Proficient in writing complex SQL/stored procedures.
- Experience with DevOps tools: Jenkins/Lightspeed, Git, Copilot.
- Knowledge of BI visualization tools such as Tableau and Power BI.
- Implementation experience with data lake/data warehouse solutions for enterprise use cases.
- Exposure to analytical tools and AI/ML is desired.

Education:
- Bachelor's/University degree, or master's degree in Information Systems, Business Analysis, or Computer Science.

In this role, you will be part of the Data Governance job family, focusing on Data Governance Foundation. This is a full-time position at Citi, where you will utilize skills like Data Management, Internal Controls, Risk Management, and more to drive compliance and achieve business objectives.

If you require a reasonable accommodation due to a disability to utilize search tools or apply for a career opportunity at Citi, please review the Accessibility at Citi guidelines. Additionally, you can refer to Citi's EEO Policy Statement and the Know Your Rights poster for more information.
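As a hedged sketch of the PySpark cleansing-and-aggregation work described above, a minimal job that reads a dataset, filters it, and aggregates; the file path and column names are invented, not Citi systems.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-metrics-sketch").getOrCreate()

# Invented input path and schema; a real job would read governed source data.
trades = spark.read.parquet("/data/trades")

metrics = (
    trades
    .filter(F.col("notional").isNotNull())  # basic cleansing step
    .groupBy("asset_class")
    .agg(
        F.count("*").alias("row_count"),
        F.sum("notional").alias("total_notional"),
    )
)
metrics.show()
```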
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As the lead for SAP Datasphere implementations, you will oversee data modeling, ETL, integration, and performance optimization. Your role will involve developing and maintaining SAC stories, dashboards, and planning models with advanced formulas and allocations. You will also handle What-If planning and scenario modeling, predictive forecasting and planning cycles, and the migration and integration of SAP BPC models into Datasphere and SAC.
In addition, you will support monthly financial close, budgeting, and forecasting activities. You will be the point of contact for production support, troubleshooting, and deployments related to SAC and Datasphere artifacts. It will be your responsibility to ensure data governance, security, and metadata management are maintained at all times.
Collaboration with IT and business teams for end-to-end SAP and non-SAP integration is a crucial part of this role, and you will work across India and US time zones to provide global support. Knowledge of AWS Glue, Redshift, and cloud-based ETL/data warehousing is essential; exposure to Tableau or other BI tools is a plus.
Ideal candidates will have experience in predictive analytics or advanced planning in SAC. An understanding of finance and operational planning processes, such as Sales, Capex, Headcount, Revenue, Balance Sheet, and Cash Flow, is highly beneficial for this position.
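SAC's predictive scenarios handle forecasting natively, but as a rough illustration of the idea behind a planning forecast cycle, here is a deliberately naive pandas sketch that projects a trailing three-month average forward. The revenue series and horizon are invented.

```python
import pandas as pd

# Invented monthly revenue actuals -- a stand-in for a planning model's data.
actuals = pd.Series(
    [100, 104, 110, 108, 115, 121, 119, 126, 130, 128, 135, 141],
    index=pd.period_range("2024-01", periods=12, freq="M"),
)

# Naive forecast: carry the trailing three-month average forward.
# SAC's built-in predictive scenarios are far richer; this only shows the idea.
last_avg = actuals.rolling(3).mean().iloc[-1]
horizon = pd.period_range("2025-01", periods=3, freq="M")
forecast = pd.Series([last_avg] * 3, index=horizon)

print(pd.concat([actuals.tail(3), forecast]))
```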
Posted 5 days ago
14.0 - 18.0 years
0 Lacs
pune, maharashtra
On-site
The Applications Development Intermediate Programmer Analyst position is an intermediate-level role in which you will take part in establishing and implementing new or updated application systems and programs in collaboration with the Technology team. Your main goal in this position will be to contribute to applications systems analysis and programming activities.
You should have at least 14 years of experience in Ab Initio Metadata Hub application development. A strong understanding of data lineage, metadata management, reference data development, and data analytics is essential. You should also possess good knowledge of relational databases such as Oracle, with strong SQL and PL/SQL skills. Proficiency in data lineage and application development, along with coding experience in languages like Python or Java, is required. Hands-on experience with a coding language and tool-based configuration is necessary, along with full Software Development Kit (SDK) development cycle experience. You should have pragmatic problem-solving skills and the ability to work both independently and as part of a team. Proficiency in Ab Initio Metadata Hub (mHub) or Python is preferred, along with experience in Java, APIs, or Python programming. A passion for development, a strong work ethic, and a continuous learning mindset are important attributes for this role. Knowledge of code optimization techniques for different hardware architectures is also beneficial.
Preferred qualifications include a Bachelor's degree in Computer Science or a related field, plus experience with relational databases (SQL, PL/SQL, Oracle), code development, metadata management, reference data, and lineage tools. Experience developing data lineage using tools or custom code, as well as expertise in data management and coding languages, is desirable.
As an Applications Development Intermediate Programmer Analyst, your responsibilities will include developing and maintaining applications for complex enterprise data lineage, optimizing industry-based tools to simplify enterprise-level data complexity through data lineage, debugging and resolving graph-related issues, collaborating on the design and implementation of new features, conducting code reviews, writing and maintaining documentation, integrating and validating third-party libraries and tools, managing source code with version control systems, and implementing algorithms for code generation and optimization.
You will also need to stay updated on advancements in data lineage technology, profile and benchmark compiler performance on various platforms, develop automated testing and verification of the code base and functionality, provide technical support to teams, analyze performance metrics to identify areas for improvement, participate in design and architecture discussions, use static and dynamic analysis tools to enhance code quality, collaborate with cross-functional teams, research new techniques and methodologies, and contribute to open-source compiler projects.
Please note that this job description offers a high-level overview of the tasks typically performed in this role; additional job-related duties may be assigned as necessary.
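To make the data lineage focus of this role concrete, here is a minimal sketch of lineage represented as a directed graph with an upstream trace — the kind of relationship a metadata hub materializes, though real tools such as Ab Initio Metadata Hub model far richer detail (columns, transformations, jobs, owners). All dataset names are invented.

```python
from collections import defaultdict

# Lineage as a directed graph: an edge (source, target) means "source feeds target".
edges = [
    ("src.trades_raw", "stg.trades_clean"),
    ("stg.trades_clean", "dwh.fact_trades"),
    ("src.asset_ref", "dwh.fact_trades"),
    ("dwh.fact_trades", "rpt.daily_pnl"),
]

upstream = defaultdict(set)
for src, tgt in edges:
    upstream[tgt].add(src)

def trace_upstream(node: str, seen: set | None = None) -> set:
    """Return every dataset that directly or indirectly feeds `node`."""
    seen = set() if seen is None else seen
    for parent in upstream.get(node, ()):
        if parent not in seen:
            seen.add(parent)
            trace_upstream(parent, seen)
    return seen

print(trace_upstream("rpt.daily_pnl"))
# -> {'dwh.fact_trades', 'stg.trades_clean', 'src.trades_raw', 'src.asset_ref'}
```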
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Modeller, you will play a crucial role in designing, governing, and optimizing the core data models within our Data & Analytics platform. Your responsibilities will include leading the development of scalable, modular, and business-aligned models to support analytics and reporting across multiple regions and clients. Collaborating closely with data engineering, BI, and business stakeholders, you will ensure business logic is accurately embedded in the models, maintain semantic consistency, and support high-performance, secure, and compliant data structures. Your expertise will be instrumental in translating complex business requirements into robust technical models that enable efficient decision-making and insight generation.
Working in the B2B software and services sector, you will contribute to delivering mission-critical solutions for large-scale clients globally, in line with our commitment to innovation and excellence. Additionally, you will support initiatives led by the GSTF, demonstrating our company's values of environmental and social responsibility. You will contribute by identifying and proposing local sustainable practices aligned with our Sustainability Charter and participate in challenges to improve sustainable behaviors through our sustainability app.
Key Responsibilities:
- Design, implement, and maintain scalable and modular data models for Snowflake, incorporating region- and country-specific extensions without impacting the global core.
- Define, document, and approve changes to the core enterprise data model, embedding business logic into model structures.
- Lead data modelling workshops with stakeholders to gather requirements and ensure alignment between business, engineering, and BI teams.
- Collaborate with developers, provide technical guidance, and review outputs related to data modelling tasks.
- Optimize models for performance, data quality, and governance compliance.
- Work with BI teams to ensure semantic consistency and enable self-service analytics.
- Ensure adherence to data security, RBAC, and compliance best practices.
- Use DevOps tools like Git/Bitbucket for version control of data models and related artifacts.
- Maintain documentation, metadata, and data lineage for all models.
- Preferred: use tools like Matillion or equivalent ETL/ELT tools for model integration workflows.
- Fulfill any additional duties as reasonably requested by your direct line leader.
Required Skills:
- Proven expertise in designing enterprise-level data models for cloud data platforms, preferably Snowflake.
- Strong understanding of data warehouse design patterns, including dimensional, Data Vault, and other modeling approaches.
- Ability to embed business logic into models and translate functional requirements into technical architecture.
- Experience managing and approving changes to the core data model, ensuring scalability, semantic consistency, and reusability.
- Proficiency in SQL, with experience in Snowflake-specific features.
- Familiarity with ELT/ETL tools such as Matillion, dbt, Talend, or Azure Data Factory.
- Experience with DevOps practices, including version control of modeling artifacts.
- Knowledge of metadata management, data lineage, and data cataloging tools.
- Strong understanding of data privacy, governance, and RBAC best practices.
- Excellent communication and stakeholder engagement skills.
- Positive attitude with a focus on delivering excellence.
- Strong attention to detail and exceptional relationship management skills.
- Open-minded, consultative approach and the ability to give and receive constructive feedback.
- Creative problem-solving skills and the ability to work effectively in a team environment.
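As a small illustration of the dimensional modeling work this role centers on, the sketch below derives a customer dimension with generated surrogate keys from an invented staging extract and rebuilds the fact table against it, in pandas. In practice this would be SQL or ELT tooling against Snowflake; the tables and columns here are hypothetical.

```python
import pandas as pd

# Invented staging data -- stands in for a Snowflake staging table.
stg_orders = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "customer_name": ["Acme", "Globex", "Acme"],
    "country": ["DE", "US", "DE"],
    "amount": [250.0, 400.0, 125.0],
})

# Dimension: one row per distinct customer, with a generated surrogate key.
dim_customer = (stg_orders[["customer_name", "country"]]
                .drop_duplicates()
                .reset_index(drop=True))
dim_customer["customer_sk"] = dim_customer.index + 1

# Fact: measures plus a foreign key to the dimension, not raw attributes.
fact_orders = (stg_orders
               .merge(dim_customer, on=["customer_name", "country"])
               [["order_id", "customer_sk", "amount"]])

print(dim_customer)
print(fact_orders)
```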
Posted 5 days ago
7.0 - 10.0 years
10 - 20 Lacs
hyderabad, gurugram, bengaluru
Work from Office
>> About KPMG in India
KPMG in India, a professional services firm, is the Indian member firm affiliated with KPMG International and was established in September 1993. Our professionals leverage the global network of firms, providing detailed knowledge of local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada.
KPMG in India offers services to national and international clients across sectors. We strive to provide rapid, performance-based, industry-focused, and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
KPMG Advisory professionals provide advice and assistance to enable companies, intermediaries, and public sector bodies to mitigate risk, improve performance, and create value. KPMG firms provide a wide range of Risk Consulting, Management Consulting and Transactions & Restructuring services that help clients respond to immediate needs and put strategies in place for the longer term. With increasing regulatory requirements, the need for greater transparency in operations, and disclosure norms, stakeholders require assurance beyond the traditional critique of numbers; assurance is increasingly required on industry issues, business risks, and key business processes.
The Governance, Risk & Compliance Services practice assists companies and public sector bodies to mitigate risk, improve performance, and create value. We assist our clients in effectively managing business and process risks by providing a full spectrum of corporate governance, risk management, and compliance services, tailored to meet clients' individual needs and to support management in meeting the challenges and opportunities of today's complex business environment. Our professionals provide the experience to help companies stay on track and deal with risks that could unhinge their business survival. Our services enable clients to effectively coordinate their key growth, quality, and operational challenges; working in partnership with us, clients have the benefit of KPMG's experienced, objective, and industry-grounded viewpoints.
Data Governance Consultant / AM / Manager
Role & Responsibility:
- Lead the design and implementation of the Data Governance framework.
- Lead maturity assessments for data management/data governance capabilities, identify gaps and recommendations, and build implementation roadmaps.
- Support data management initiatives, including setting up and monitoring data governance programs and coordinating with different teams/business units.
- Strong understanding of DG org structure and roles & responsibilities / RACI (Data Steward, Data Custodian, Data Owner, Producers and Consumers, etc.).
- Lead the building of stewardship and interaction models and ensure clear accountability for data ownership and stewardship.
- Strong understanding of the data landscape (data flows, data lineage, CDEs, data domains, etc.).
- Experience managing data lineage tools and visualizations to track data movement and transformations across systems.
- Lead the development of data quality rules, identify data quality issues, and work with stakeholders on remediation planning.
- Strong skills in data quality management and hands-on experience with data quality tools and techniques.
- Strong understanding of metadata management and MDM/RDM concepts and processes (creation, curation, update, change management, etc.).
- Strong understanding of Data Catalog, Business Glossary, and/or Data Dictionary concepts (data definitions and associated best practices).
- Assist in facilitating training sessions and workshops to improve data governance awareness and practices across the organization.
- Monitor compliance with relevant data regulations (e.g., NDMO, GDPR, CCPA) and internal data policies.
- Knowledge of industry-leading DG tools (Collibra, Alation, etc.) and DQ tools (SAP IS, Informatica/IDQ, etc.); awareness of MDM tools is preferred.
- Good understanding of at least one regulatory data management framework, such as the National Data Management Office (NDMO) standards, the Abu Dhabi Government Data Management Standard, or India's DPDP Act 2023.
- Strong understanding of data risk taxonomy (types and sub-types).
Qualifications:
- BE/BTech/MCA/MBA/Master's degree in information management or a related field.
- Good knowledge of industry standards and regulations, including NDMO, SAMA, BCBS 239, GDPR, and CCPA.
- Extensive experience with the Informatica tool stack.
- Data management certifications (CDMP, DAMA, DCAM, etc.).
>> Selection Process
Candidates should expect 2-3 rounds of personal or telephonic interviews to assess fitment and communication skills.
>> Compensation
Compensation is competitive with industry standards. Details of the compensation breakup will be shared with short-listed candidates only.
>> Work Timing
Monday to Friday
>> People Benefits
Continuous learning program
Driving a culture of recognition through ENCORE, our quarterly rewards and recognition program
Comprehensive medical insurance coverage for staff and family
Expansive general and accidental coverage for staff
Executive health check-up (Manager & above, and for staff above the age of 30)
Les Concierge desks
Internal & global mobility
Various other people-friendly initiatives
Strong commitment to our values, including CSR initiatives
The opportunity is now! If you are interested, please share your resume at sonalidas4@kpmg.com.
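To illustrate the data quality rule development this role calls for, here is a minimal rule-based check in pandas. The customer extract and the three rules are invented for the example and stand in for what tools like Informatica/IDQ or SAP IS do at scale.

```python
import pandas as pd

# Invented customer extract -- the point is the shape of rule-based DQ checks.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@y.com", "not-an-email"],
})

# Each rule maps a plain-language name to a boolean pass/fail result.
rules = {
    "customer_id is unique": df["customer_id"].is_unique,
    "email is populated": df["email"].notna().all(),
    "email looks valid": df["email"].str.contains("@", na=False).all(),
}

for rule, passed in rules.items():
    print(f"{'PASS' if passed else 'FAIL'}: {rule}")
```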
Posted 5 days ago