4955 Data Governance Jobs - Page 5

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 8.0 years

3 - 7 Lacs

Pune

Work from Office

Lead CMDB Developer for ServiceNow Project
Experience: 7+ years of relevant CMDB experience. Location: PAN India. Note: US shift only (until 3.30 am). Rate (including markup): 160K/month.

About the Role:
We are seeking a highly skilled Lead CMDB Developer to join our dynamic ServiceNow project team. This role is critical in ensuring the accurate and efficient management of the client's Configuration Management Database (CMDB) within the ServiceNow platform. The ideal candidate will lead the design, development, and implementation of CMDB solutions while working closely with various stakeholders to ensure alignment with business objectives and ITIL best practices.

Responsibilities:
- Lead the design, development, and maintenance of the CMDB within the ServiceNow platform.
- Collaborate with IT and business stakeholders to gather requirements and ensure CMDB solutions meet business needs.
- Implement and enforce CMDB data standards, policies, and procedures to ensure data integrity and accuracy.
- Develop and maintain CMDB documentation, including data models, workflows, and processes.
- Provide technical leadership and mentorship to the CMDB development team.
- Perform regular audits and data quality checks to identify and resolve discrepancies.
- Integrate the CMDB with other IT Service Management (ITSM) processes and tools.
- Stay up to date with ServiceNow CMDB best practices and industry trends.
- Conduct training sessions for end users and stakeholders on CMDB functionalities and best practices.

Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree is a plus.
- Relevant certifications in ServiceNow, ITIL, or other ITSM frameworks are highly desirable.
- Certified ServiceNow Configuration Management Database (CMDB) Administrator certification is required.

Experience:
- Minimum of 5 years of experience in CMDB development and administration within the ServiceNow platform.
- Proven experience leading CMDB projects and teams.
- In-depth knowledge of ITIL processes and best practices.
- Experience integrating the CMDB with other ITSM processes and tools.
- Strong understanding of database design, data modeling, and data governance principles.

Skills:
- Proficient in ServiceNow CMDB administration and development.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work collaboratively with cross-functional teams.
- Detail-oriented with a focus on data accuracy and integrity.
- Ability to manage multiple priorities and deliver results in a fast-paced environment.
- Proficient in scripting languages (e.g., JavaScript) and database query languages (e.g., SQL).
- Strong project management and leadership skills.

Mandatory Skills: ServiceNow DevOps Deployment. Experience: 5-8 years.

Posted 3 days ago

8.0 - 10.0 years

8 - 14 Lacs

Hyderabad

Work from Office

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- 12+ years of experience in data architecture and engineering
- Proven expertise with the Snowflake data platform
- Strong understanding of ETL/ELT processes and data integration
- Experience with data modelling and data warehousing concepts
- Familiarity with performance tuning and optimization techniques
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills

Responsibilities:
- Design and develop robust data architecture solutions using Snowflake
- Collaborate with data engineers and analysts to optimize data pipelines
- Implement data migration strategies and best practices
- Enhance performance tuning and query optimization for Snowflake environments
- Ensure data governance, security, and compliance within Snowflake
- Conduct code reviews and ensure adherence to standards
- Provide technical guidance and mentorship to junior team members
- Work with stakeholders to understand data requirements and deliver solutions

Mandatory Skills: Data Engineering Full Stack. Experience: 8-10 Years.

Posted 3 days ago

10.0 - 14.0 years

12 - 22 Lacs

Jaipur

Work from Office

Your potential, unleashed. India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

Location: Jaipur. Experience in Government projects is mandatory.

Minimum Experience and Qualifications:
1. BE/MCA/ME degree in Computer Science, Information Technology, or a related field.
2. Proven minimum 10 years of work experience in MDM, data quality, and data governance, with a deep understanding of best practices and industry standards.
3. Strong knowledge of data governance frameworks, concepts, and principles.
4. Experience with MDM solutions and data quality tools such as Informatica PowerCenter/Informatica Data Quality (IDQ) or similar platforms.
5. Proficiency in data analysis, data profiling, and data cleansing techniques.
6. Excellent communication skills and the ability to collaborate with cross-functional teams.
7. Strong problem-solving skills and the ability to identify, analyze, and resolve complex data issues.
8. Familiarity with regulatory compliance standards (e.g., GDPR, HIPAA) related to data governance and quality.

Roles and Responsibilities:
1. Develop, implement, and manage Master Data Management (MDM) strategies and initiatives to ensure consistent and accurate data across the organization.
2. Establish and enforce data governance policies, procedures, and standards to maintain data integrity and quality.
3. Collaborate with stakeholders to define data quality standards, guidelines, and metrics, and implement data quality improvements.
4. Design and implement data quality assessment and profiling processes, resolving data issues and ensuring compliance with regulatory requirements.
5. Lead efforts to identify, analyze, and resolve data quality issues within various data sources and systems.
6. Monitor and audit data quality, conducting regular quality checks and assessments to ensure adherence to standards.
7. Collaborate with cross-functional teams to align MDM, data quality, and governance initiatives with business objectives.
8. Provide guidance and support to internal teams in understanding and adhering to data governance and quality standards.
9. Evaluate, select, and implement data quality tools and technologies to support data governance and MDM initiatives.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone's welcome - entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.
Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Posted 3 days ago

1.0 - 6.0 years

1 - 2 Lacs

Hyderabad

Work from Office

SUMMARY
We are looking for a Senior PySpark Developer with 3 to 6 years of experience in building and optimizing data pipelines using PySpark on Databricks, within AWS cloud environments. This role focuses on the modernization of legacy domains, involving integration with systems like Kafka and collaboration across cross-functional teams.

Key Responsibilities:
- Develop and optimize scalable PySpark applications on Databricks.
- Work with AWS services (S3, EMR, Lambda, Glue) for cloud-native data processing.
- Integrate streaming and batch data sources, especially using Kafka.
- Tune Spark jobs for performance, memory, and compute efficiency.
- Collaborate with DevOps, product, and analytics teams on delivery and deployment.
- Ensure data governance, lineage, and quality compliance across all pipelines.

Required Skills:
- 3-6 years of hands-on development in PySpark.
- Experience with Databricks and performance tuning using the Spark UI.
- Strong understanding of AWS services, Kafka, and distributed data processing.
- Proficient in partitioning, caching, join optimization, and resource configuration.
- Familiarity with data formats like Parquet, Avro, and ORC.
- Exposure to orchestration tools (Airflow, Databricks Workflows).
- Scala experience is a strong plus.

Posted 3 days ago

7.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

About the Job:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.

Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.

Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record with various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.

Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.

Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.

Posted 3 days ago

5.0 - 7.0 years

10 - 14 Lacs

Mumbai

Work from Office

Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary:
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:
Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.
Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.
Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.
Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.
Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:
Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.
Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
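Since this role centers on triples and graph queries (RDF, SPARQL), here is a minimal, dependency-free sketch of the idea behind triple-pattern matching. The entities and predicates are invented for illustration; real work would use an RDF store queried with SPARQL:

```python
# Minimal in-memory triple store illustrating the pattern-matching idea
# behind SPARQL-style queries. Triples are (subject, predicate, object)
# tuples; None acts as a wildcard variable in a query pattern.
# All names here are hypothetical examples.

triples = {
    ("Pump_42", "instanceOf", "CentrifugalPump"),
    ("CentrifugalPump", "subclassOf", "Pump"),
    ("Pump_42", "locatedIn", "Plant_A"),
    ("Plant_A", "instanceOf", "Facility"),
}

def match(pattern, store=triples):
    """Return all triples matching a (s, p, o) pattern; None matches anything."""
    s, p, o = pattern
    return [
        t for t in store
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]

# "What is Pump_42?" -- roughly: SELECT ?o WHERE { :Pump_42 :instanceOf ?o }
print(match(("Pump_42", "instanceOf", None)))
# Everything located in Plant_A:
print(match((None, "locatedIn", "Plant_A")))
```

A production knowledge graph adds indexing, inference over `subclassOf` hierarchies, and a full query language on top of exactly this kind of pattern matching.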

Posted 3 days ago

8.0 - 10.0 years

8 - 12 Lacs

Mumbai

Work from Office

Job Title: Data Governance Purview Analyst

Required expertise:
- Expert knowledge of Microsoft Purview for data governance and cataloging.
- Expert knowledge of Azure Data Factory, Azure Synapse, and Power BI for data integration and visualization.
- Expert knowledge of PowerShell for automation and scripting tasks.
- Expert knowledge of data quality and compliance tools (e.g., Microsoft Compliance Manager).
- Working knowledge of Azure DevOps or other project management tools for tracking implementation progress.

Roles and Responsibilities:
- Using Purview, perform data discovery, identifying and cataloging data assets.
- Implement and configure data classification rules to automatically classify sensitive data, ensuring compliance with regulations.
- Ensure the automatic tagging and categorization of data to create a comprehensive inventory.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience with Azure Purview or similar data governance tools.
- Strong understanding of data governance principles, data management, and regulatory requirements.
- Proficiency in data cataloging, metadata management, and data lineage documentation.
- Excellent analytical and problem-solving skills.
- Effective communication skills for collaboration with cross-functional teams.

Preferred Qualifications:
- Certification in Azure Purview or other data governance tools.
- Prior experience in implementing data governance frameworks for enterprise clients.
- Familiarity with cloud platforms like Azure or AWS.
- Knowledge of data privacy regulations such as GDPR or CCPA.
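Purview ships its own built-in classifiers for this, but the underlying idea of rule-based sensitive-data classification can be sketched in a few lines of Python. The labels and regex patterns below are illustrative assumptions, not Purview's actual rules:

```python
import re

# Toy classification rules: label -> regex for a sensitive-data pattern.
# Illustrative only; tools like Purview use far richer built-in classifiers.
RULES = {
    "Email Address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "Credit Card":   re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(value):
    """Return the label of every rule the value matches."""
    return [label for label, rx in RULES.items() if rx.search(value)]

print(classify("contact: jane.doe@example.com"))    # ['Email Address']
print(classify("card 4111 1111 1111 1111 on file")) # ['Credit Card']
print(classify("nothing sensitive here"))           # []
```

In a real deployment such rules run during scanning, and matched assets are tagged in the catalog so retention and access policies can key off the classification.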

Posted 3 days ago

6.0 - 8.0 years

8 - 13 Lacs

Mumbai

Work from Office

About The Opportunity:
Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India.

Role & Responsibilities:
As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include:
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams including data engineers, analysts, and business stakeholders to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications:
Must-Have:
- Experience: 6+ years of experience in data modeling and related disciplines.
- Data Modeling: Extensive expertise in conceptual, logical, and physical models, including Star/Snowflake schema design, normalization, and denormalization techniques.
- Snowflake: Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and Secure & Materialized Views.
- SQL & Scripting: Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills:
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)
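The "Advanced SQL (CTEs, Window Functions)" requirement can be illustrated with a small self-contained example. SQLite is used here purely as a stand-in for Snowflake, and the `orders` table is hypothetical:

```python
import sqlite3

# CTE + window function demo: highest-value order per customer.
# SQLite stands in for Snowflake; the schema is a made-up example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('alice', 120), ('alice', 80), ('bob', 200), ('bob', 50), ('bob', 90);
""")

rows = conn.execute("""
    WITH ranked AS (                               -- CTE
        SELECT customer,
               amount,
               ROW_NUMBER() OVER (                 -- window function
                   PARTITION BY customer
                   ORDER BY amount DESC
               ) AS rn
        FROM orders
    )
    SELECT customer, amount FROM ranked WHERE rn = 1
    ORDER BY customer
""").fetchall()

print(rows)  # [('alice', 120), ('bob', 200)]
```

The same CTE-plus-`ROW_NUMBER()` pattern runs unchanged on Snowflake and is a common building block for deduplication and "latest record per key" logic.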

Posted 3 days ago

4.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

Key Responsibilities:
- Minimum 3 years of experience in Snowflake.
- Design, develop, and maintain integration workflows using Informatica CAI/CDI within IDMC.
- Configure and manage Informatica MDM for data consolidation, cleansing, and governance.
- Translate business and technical requirements into robust, scalable integration solutions.
- Collaborate with analysts, architects, and business stakeholders to ensure delivery meets expectations.
- Monitor and troubleshoot integration processes and performance.
- Support deployment activities and CI/CD processes.
- Maintain documentation of integration designs and data flows.

Essential Skills & Experience:
- 4+ years of experience in Informatica development (Cloud and/or On-Prem).
- Strong experience with CAI (Cloud Application Integration) and CDI (Cloud Data Integration).
- Experience configuring and supporting Informatica MDM, including data models, match/merge rules, and hierarchies.
- Solid understanding of REST/SOAP APIs, event-driven architecture, and message queues.
- Hands-on experience with the IDMC platform and cloud-native integration patterns.
- Proficient in SQL and data manipulation techniques.
- Experience in data governance, data quality, and master data best practices.
- Experience with CI/CD pipelines for Informatica using Git, Jenkins, or similar tools.
- Knowledge of cloud platforms (Azure, AWS, or GCP).
- Exposure to data warehousing, data lakes, or real-time integration.
- Familiarity with Agile/Scrum delivery methods.

Posted 3 days ago

8.0 - 13.0 years

2 - 2 Lacs

Hyderabad

Work from Office

SUMMARY

Collibra Implementation & Configuration:
- Lead the implementation, configuration, and customization of Collibra modules and features, including Data Catalog, Data Governance, Data Lineage, and Data Quality.
- Configure and maintain Collibra workflows, business rules, and custom attributes to support data governance processes.
- Develop and maintain integrations with other enterprise systems (e.g., data warehouses, BI tools, ETL tools) to enable seamless data governance.
- Stay up to date on Collibra releases and updates, implementing upgrades and patches as needed.

Data Governance & Collaboration:
- Collaborate with data owners, stewards, and other stakeholders to define and document data assets, policies, standards, and business glossaries within Collibra.
- Train and support users on the effective use of Collibra, providing guidance and troubleshooting assistance.
- Facilitate data governance working groups and workshops to promote data literacy and adoption of the platform.
- Monitor and report on the effectiveness of data governance processes and the utilization of Collibra.

Data Lineage & Metadata Management:
- Implement and maintain automated data lineage processes to track data movement and transformation.
- Manage and maintain metadata within Collibra, ensuring accuracy, completeness, and consistency.
- Develop and maintain data quality rules and monitoring processes within Collibra.

Technical Expertise & Problem Solving:
- Troubleshoot and resolve technical issues related to Collibra, including performance issues and integration errors.
- Develop and maintain custom solutions using the Collibra API, SDK, and other related technologies.
- Write technical documentation, including configuration guides, user manuals, and API documentation.
- Identify opportunities to improve the efficiency and effectiveness of the Collibra platform and data governance processes.

Required Skills

Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

Experience:
- Minimum 3+ years of hands-on experience with Collibra Data Governance Center (or a similar data governance platform).
- Proven experience in implementing, configuring, and customizing Collibra modules and features.
- Experience with data governance principles, data cataloging, data lineage, and data quality management.
- Experience with data modeling, database concepts, and ETL processes.
- Experience with API integrations and scripting languages (e.g., Python, Groovy).

Skills & Abilities:
- Strong understanding of data governance principles and best practices.
- Deep understanding of Collibra architecture, functionality, and administration.
- Proficient in configuring and customizing Collibra workflows, business rules, and custom attributes.
- Experience with data integration and API development.
- Excellent communication, interpersonal, and presentation skills.
- Ability to work independently and as part of a team.
- Strong problem-solving and analytical skills.
- Ability to translate business requirements into technical solutions.
- Excellent documentation skills.

Preferred Qualifications:
- Collibra Certification (e.g., Collibra Data Governance Center Administrator, Collibra Data Citizen).
- Experience with Agile methodologies.
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Experience with data quality tools and techniques.

Equal Opportunities:
We are an equal opportunities employer and do not discriminate on the grounds of gender, sexual orientation, marital or civil partner status, pregnancy or maternity, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability, or age.

Posted 3 days ago

5.0 - 10.0 years

25 - 27 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role: Data Catalog and Governance - Collibra Specialist
Experience: 5+ years
Location: All EXL locations
Key Skills: Collibra, Metadata Management, Data Governance, SQL

Required Skills & Experience:
- Hands-on experience with the Collibra Data Intelligence Platform, including:
  - Metadata ingestion
  - Data lineage stitching
  - Workflow configuration and customization
- Strong understanding of metadata management and data governance principles
- Experience working with the following data sources/tools: Teradata (BTEQ, MLOAD), Tableau, QlikView, IBM DataStage, Informatica
- Ability to interpret and map technical metadata from ETL tools, BI platforms, and databases into Collibra
- Familiarity with data lineage concepts, including horizontal lineage across systems
- Proficiency in SQL and scripting for metadata extraction and transformation
- Excellent communication skills to collaborate with data stewards, engineers, and business stakeholders

Preferred Qualifications:
- Experience with Collibra APIs or connectors
- Knowledge of data governance frameworks (e.g., DAMA DMBOK)
- Prior experience in a regulated industry (e.g., finance, healthcare)
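As a rough illustration of the metadata-extraction side of lineage stitching (Collibra itself ingests this kind of information via connectors and APIs; the regex below is a deliberate simplification over a made-up query):

```python
import re

# Naive source-table extraction from a SQL statement: the kind of technical
# metadata a lineage tool stitches into end-to-end lineage. Real parsers
# handle subqueries, aliases, CTEs, and dialect quirks that this toy ignores.
def source_tables(sql):
    """Return the sorted, de-duplicated tables read via FROM/JOIN."""
    return sorted(set(re.findall(r"\b(?:FROM|JOIN)\s+([\w.]+)", sql, re.IGNORECASE)))

# Hypothetical ETL statement loading a mart table from two upstream sources.
etl_sql = """
    INSERT INTO mart.daily_sales
    SELECT s.day, s.amount, c.region
    FROM staging.sales s
    JOIN ref.customers c ON c.id = s.customer_id
"""

print(source_tables(etl_sql))  # ['ref.customers', 'staging.sales']
```

Pairing these source tables with the `INSERT INTO` target yields one edge of a horizontal lineage graph across systems.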

Posted 3 days ago

10.0 - 14.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Every career journey is personal. That's why we empower you with the tools and support to create your own success story. Be challenged. Be heard. Be valued. Be you ... be here.

Job Summary:
The Data Management Senior Manager will oversee the organization's data management initiatives across one or more aspects of the data lifecycle, to ensure alignment with policies, standards, and data risk controls. This role requires expertise in data classification, metadata management, data quality, data retention, and data sharing to strengthen policy-based frameworks and improve enterprise data usability. This role will contribute to the development and implementation of best practices for mastering data domains, regulatory compliance, and data stewardship, collaborating with business and technology teams to drive operational excellence. This role is ideal for a data professional who is passionate about enabling trusted, well-managed data ecosystems and ensuring that data is properly governed, classified, and securely managed.

Essential Job Functions:
- Partner with data communities and collaborate cross-functionally with Data Governance, IT, and other functions, both onshore and offshore, to ensure integration of data quality/management processes. Collaborate with cross-functional teams to address data-related issues and provide innovative solutions. Network with senior internal and external personnel in own area of expertise. (20%)
- Influence and guide across one or more data disciplines through clear communication, decision-making, conflict resolution, and inspiring others to achieve their best. Provide guidance, priorities, and personal development for the team. (20%)
- Serve as an escalation point to assist in resolving problems of diverse scope where analysis of data requires evaluation of identifiable factors. Identify, design, develop, and implement data-centric solutions aligned to the goals of the Data and Analytics department. Demonstrate good judgment in selecting methods and techniques for obtaining solutions. (20%)
- Create and maintain documentation, define and/or implement procedures, ensure compliance with policies and standards, and report on established metrics and KPIs. (20%)
- Manage the implementation of data lifecycle management strategies, roadmaps, and policies to ensure data integrity, security, and accessibility throughout the data lifecycle. Establish evidence from industry research of best practices to expand the policies, standards, and procedures. (20%)

Minimum Qualifications:
- Bachelor's degree in computer science, engineering, information technology, or a related STEM field of study.
- Certified Data Management Professional (CDMP from DAMA).
- 10+ years of experience working in data engineering, application development, or data architecture.
- 5+ years of experience leading a team.

Preferred Qualifications:
- Master's degree in business, information technology, computer science, or a related STEM field of study.
- Experience implementing data management capabilities in a financial services firm.

Skills: Microsoft Office, Cloud Technology, Records Management
Reports To: Senior Manager and above
Direct Reports: 6-10
Work Environment: Normal office environment, hybrid.

Other Duties: This job description is illustrative of the types of duties typically performed by this job. It is not intended to be an exhaustive listing of each and every essential function of the job. Because job content may change from time to time, the Company reserves the right to add and/or delete essential functions from this job at any time.

Posted 3 days ago

Apply

5.0 - 10.0 years

8 - 14 Lacs

pune

Work from Office

Position: Team Lead - MDM Location: Pune Kharadi Company: Global MNC Shift time: 4 pm to 1 am (Hybrid) & pick-up & drop facility Responsibilities for master data management Assist with data mappings, data modeling, data profiling, query design, data flow design, data strategy and data governance between multiple databases across multiple platforms Analyze main business processes and requirements and translate them into IT solutions Analyze data quality Overall project support Develop a master data management strategy for the site Identify, support or manage process improvement initiatives Audit data and facilitate resolution Own department Master Data documentation Provide SAP system training to new and existing staff Own non-conformances related to support system issues Qualifications for master data management Develop, administer and maintain the Master Data Management environment (UOs Master Data Repository) by providing technical solutions in support of business objectives and ongoing operations Provide technical expertise with the Master Data Repository Comprehensive understanding of Material, Customer and Vendor domains, business rules and workflows within an MDM application Experience designing and evolving MDM architecture and solutions for large enterprises You should love working with multiple technologies and be a technology enthusiast Interested candidates can share their resume at dhanashree.chitre@weareams.com

Posted 3 days ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

pune

Work from Office

Position: Team Lead - MDM Location: Pune Kharadi Company: Global MNC Shift time: 4 pm to 1 am (Hybrid) & pick-up & drop facility Responsibilities for master data management Assist with data mappings, data modeling, data profiling, query design, data flow design, data strategy and data governance between multiple databases across multiple platforms Analyze main business processes and requirements and translate them into IT solutions Analyze data quality Overall project support Develop a master data management strategy for the site Identify, support or manage process improvement initiatives Audit data and facilitate resolution Own department Master Data documentation Provide SAP system training to new and existing staff Own non-conformances related to support system issues Qualifications for master data management Develop, administer and maintain the Master Data Management environment (UOs Master Data Repository) by providing technical solutions in support of business objectives and ongoing operations Provide technical expertise with the Master Data Repository Comprehensive understanding of Material, Customer and Vendor domains, business rules and workflows within an MDM application Experience designing and evolving MDM architecture and solutions for large enterprises You should love working with multiple technologies and be a technology enthusiast Interested candidates can share their resume at dhanashree.chitre@weareams.com

Posted 3 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

mumbai

Remote

Employment Type : Contract (Remote). Experience Required : 7+ Years. Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. 
- Exposure to Agile methodologies and working in distributed teams. - Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
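The SQL skills this listing calls out (window functions, complex joins) can be exercised without a warehouse. Below is a small sketch using Python's built-in sqlite3 module; the table and column names are invented for illustration. It keeps only the most recently loaded row per key with `ROW_NUMBER()`, a common pattern for deduplicating warehouse staging tables:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE raw_orders (order_id INT, customer TEXT, loaded_at TEXT);
    INSERT INTO raw_orders VALUES
        (1, 'acme',   '2025-01-01'),
        (1, 'acme',   '2025-02-01'),  -- later reload of the same order
        (2, 'globex', '2025-01-15');
""")
# Deduplicate: rank rows per order_id by load time, keep the newest.
rows = con.execute("""
    WITH ranked AS (
        SELECT order_id, customer, loaded_at,
               ROW_NUMBER() OVER (PARTITION BY order_id
                                  ORDER BY loaded_at DESC) AS rn
        FROM raw_orders
    )
    SELECT order_id, loaded_at FROM ranked WHERE rn = 1 ORDER BY order_id
""").fetchall()
print(rows)  # → [(1, '2025-02-01'), (2, '2025-01-15')]
```

The same `QUALIFY`/`ROW_NUMBER` idea appears in Snowflake and dbt models; sqlite3 is used here only to keep the example self-contained.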

Posted 3 days ago

Apply

6.0 - 11.0 years

25 - 35 Lacs

bengaluru

Work from Office

Experience: 6+ years Mandatory Skills: Data management, Microsoft Purview, Data governance Location: Bangalore About the Role This is a Data Quality Lead position, taking ownership of data quality initiatives across the organization. This role is essential in ensuring enterprise data is accurate, consistent, and reliable, supporting data-driven decision-making and operational excellence. The successful candidate will assess the current state of data quality, define and execute improvement goals, and lead the implementation of governance and monitoring frameworks. You will collaborate across business and IT teams to embed quality controls, optimize data transformation, and enable scalable data processes aligned with business objectives. What You Will Do Assess current data quality conditions and define measurable improvement targets across business domains. Lead Data Quality Management and Transformation efforts including standardization, governance, and automation of critical data elements. Establish and manage frameworks for data cleansing, profiling, and rule-based validation. Define and track data quality metrics using dashboards, scorecards, and audit processes to benchmark and improve performance. Ensure the availability and effective use of Data Quality tools and platforms such as SAP Data Services, SAP Information Steward, and Microsoft Purview to monitor and enhance data quality. Recommend and implement strategies to remediate data quality issues, ensuring solutions align with business goals and compliance requirements. Collaborate with business leadership and key stakeholders to communicate the value of data quality improvements in business terms. Drive a data-quality-first mindset by leading cross-functional collaboration between business units, IT teams, and data governance stakeholders. Escalate unresolved or cross-functional data quality issues to appropriate business or governance leaders. 
Participate in the design and deployment of systems and data integration processes to ensure quality standards and controls are embedded early. Lead and support data quality improvement projects using structured methodologies, including resource planning and business process redesign. Promote success stories and positive outcomes to build momentum and engagement across the organization. What You Need to Be Successful Bachelor's degree in Business, Finance, Information Technology, or a related field, or equivalent work experience. Minimum of 5 years of experience in data governance, data quality, or master data management. Proven ability to design and implement enterprise-wide data quality frameworks and scalable processes. Strong relationship-building and communication skills with the ability to work across a matrixed environment. Experience managing complex projects with structured methodologies and tools for tracking and reporting progress.
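The profiling and scorecard metrics this role tracks boil down to simple ratios over records. A minimal illustrative sketch (the field names, sample data, and validation rule are assumptions, not tied to SAP or Purview) computing completeness and validity rates:

```python
import re

records = [
    {"id": 1, "email": "a@example.com", "country": "IN"},
    {"id": 2, "email": "",              "country": "IN"},
    {"id": 3, "email": "not-an-email",  "country": None},
]

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

def validity(records, field, pattern):
    """Share of non-empty values matching a validation rule."""
    values = [r[field] for r in records if r.get(field)]
    ok = sum(1 for v in values if re.fullmatch(pattern, v))
    return ok / len(values) if values else 1.0

print(round(completeness(records, "country"), 2))  # → 0.67
print(round(validity(records, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"), 2))  # → 0.5
```

Commercial tools apply the same idea at scale: rule definitions plus pass-rate percentages that feed dashboards and scorecards.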

Posted 3 days ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

pune, chennai, bengaluru

Work from Office

Your Role Review technical design and development of master data related interfaces, enhancements, load programs, reports and workflows in MDG & ECC systems. Provide strategy for integration between different SAP (ECC, MDG) and non-SAP systems to ensure the integrity of master data. Define technical strategy and best practices related to master data governance for a global ERP deployment program. Your Profile Should have experience in SAP MDG, BRF+, FPM, ABAP OO, ABAP Data Dictionary, Data Modelling, or experience in SAP MDG Functional modules - MM/SD/FI Should have completed a minimum of 2 end-to-end MDG implementation/support projects, or have a sound understanding of the MDG functional domain Good interpersonal skills with customer-facing experience

Posted 3 days ago

Apply

7.0 - 10.0 years

10 - 14 Lacs

bengaluru

Work from Office

About the Job : We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply. Key Responsibilities : - Dataiku Leadership : Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions. - Data Pipeline Development : Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources. - Data Modeling & Architecture : Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance. - ETL/ELT Expertise : Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility. - Gen AI Integration : Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features. 
- Programming & Scripting : Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions. - Cloud Platform Deployment : Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency. - Data Quality & Governance : Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices. - Collaboration & Mentorship : Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members. - Performance Optimization : Continuously monitor and optimize the performance of data pipelines and data systems. Required Skills & Experience : - Proficiency in Dataiku : Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications. - Expertise in Data Modeling : Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures. - ETL/ELT Processes & Tools : Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.). - Familiarity with LLM Mesh : Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration. - Programming Languages : Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions. 
- Cloud Platforms : Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.). - Gen AI Concepts : Basic understanding of Generative AI concepts and their potential applications in data engineering. - Problem-Solving : Excellent analytical and problem-solving skills with a keen eye for detail. - Communication : Strong communication and interpersonal skills to collaborate effectively with cross-functional teams. Bonus Points (Nice to Have) : - Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake). - Familiarity with data governance and data security best practices. - Experience with MLOps principles and tools. - Contributions to open-source projects related to data engineering or AI. Education : Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
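The ETL/ELT responsibilities above all follow the same extract-transform-load shape, whether built in Dataiku, Airflow, or plain code. A toy sketch in stdlib Python (the source rows and cleaning rules are invented for illustration and are not Dataiku's API):

```python
def extract():
    # Stand-in for reading from a source system, file, or API.
    return [{"name": " alice ", "spend": "120"}, {"name": "bob", "spend": "80"}]

def transform(rows):
    # Normalize strings and cast types; in a real pipeline this stage also
    # hosts data-quality rules and enrichment.
    return [{"name": r["name"].strip().title(), "spend": int(r["spend"])}
            for r in rows]

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0])  # → 2 {'name': 'Alice', 'spend': 120}
```

Orchestrators add scheduling, retries, and lineage around these three stages, but the contract between them stays the same.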

Posted 3 days ago

Apply

2.0 - 7.0 years

10 - 20 Lacs

hyderabad, gurugram, bengaluru

Work from Office

Data Governance Collibra/Alation/Informatica/Purview (any 3) Strong understanding of DG frameworks such as DCAM, DAMA DMBOK Certifications in DG (e.g., CDMP, DCAM, Collibra Ranger, IDGC) Hands-on implementation experience using tools like Collibra, Alation, Informatica, Purview etc. Experience with ERP systems. Strong knowledge of data quality management processes and concepts including matching, merging, and creation of golden records for master data entities.

Posted 3 days ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

gurugram

Work from Office

Role Responsibilities : - Develop and design comprehensive Power BI reports and dashboards. - Collaborate with stakeholders to understand reporting needs and translate them into functional requirements. - Create visually appealing interfaces using Figma for enhanced user experience. - Utilize SQL for data extraction and manipulation to support reporting requirements. - Implement DAX measures to ensure accurate data calculations. - Conduct data analysis to derive actionable insights and facilitate decision-making. - Perform user acceptance testing (UAT) to validate report performance and functionality. - Provide training and support for end-users on dashboards and reporting tools. - Monitor and enhance the performance of existing reports on an ongoing basis. - Work closely with cross-functional teams to align project objectives with business goals. - Maintain comprehensive documentation for all reporting activities and processes. - Stay updated on industry trends and best practices related to data visualization and analytics. - Ensure compliance with data governance and security standards. - Participate in regular team meetings to discuss project progress and share insights. - Assist in the development of training materials for internal stakeholders. Qualifications - Minimum 8 years of experience in Power BI and Figma. - Strong proficiency in SQL and database management. - Extensive knowledge of data visualization best practices. - Expertise in DAX for creating advanced calculations. - Proven experience in designing user interfaces with Figma. - Excellent analytical and problem-solving skills. - Ability to communicate complex data insights to non-technical stakeholders. - Strong attention to detail and commitment to quality. - Experience with business analytics and reporting tools. - Familiarity with data governance and compliance regulations. - Ability to work independently and as part of a team in a remote setting. 
- Strong time management skills and ability to prioritize tasks. - Ability to adapt to fast-paced working environments. - Strong interpersonal skills and stakeholder engagement capability. - Relevant certifications in Power BI or data analytics are a plus.
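A DAX measure is, conceptually, an aggregation evaluated under the current filter context. This rough Python analogue (sample data and function names are invented; this is not DAX syntax) mirrors what a filtered SUM measure such as `CALCULATE(SUM(Sales[Amount]), <filters>)` computes:

```python
sales = [
    {"region": "North", "year": 2024, "amount": 100},
    {"region": "North", "year": 2025, "amount": 150},
    {"region": "South", "year": 2025, "amount": 90},
]

def measure_total_sales(rows, **filters):
    """Sum 'amount' over the rows matching the active filter context."""
    return sum(r["amount"] for r in rows
               if all(r[k] == v for k, v in filters.items()))

print(measure_total_sales(sales))                             # → 340
print(measure_total_sales(sales, year=2025))                  # → 240
print(measure_total_sales(sales, region="North", year=2025))  # → 150
```

In Power BI the filter context comes from slicers and visual coordinates rather than keyword arguments, but thinking of a measure as "aggregate under filters" is what makes DAX behavior predictable.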

Posted 3 days ago

Apply

6.0 - 8.0 years

8 - 13 Lacs

hyderabad

Work from Office

About The Opportunity : Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India. Role & Responsibilities : As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include : - Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements. - Collaborating with cross-functional teams including data engineers, analysts, and business stakeholders to align data architecture with evolving business needs. - Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity. - Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities. - Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling. - Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence. Skills & Qualifications : Must-Have : - Experience : 6+ years of experience in data modeling and related disciplines. - Data Modeling : Extensive expertise in conceptual, logical, and physical models, including Star/Snowflake schema design, Normalization, and Denormalization techniques. - Snowflake : Proven experience with Snowflake schema design, Performance tuning, Time Travel, Streams & Tasks, and Secure & Materialized Views. 
- SQL & Scripting : Advanced proficiency in SQL (including CTEs and Window Functions), with a strong focus on automation and optimization. Key Skills : - Data Modeling (conceptual, logical, physical) - Schema Design (Star Schema, Snowflake Schema) - Optimization & Performance Tuning - Snowflake (Schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views) - Advanced SQL (CTEs, Window Functions) - Normalization & Denormalization - Automation - NoSQL (preferred, but not mandatory)
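A star schema, as named in the skills above, keeps measures in a fact table keyed to descriptive dimension tables, so reports aggregate facts grouped by dimension attributes. A minimal self-contained sketch using Python's sqlite3 (table and column names invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Dimension: one row per product, descriptive attributes only.
    CREATE TABLE dim_product (product_id INT PRIMARY KEY, category TEXT);
    -- Fact: one row per sale, foreign key plus additive measures.
    CREATE TABLE fact_sales (product_id INT, qty INT, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales VALUES (1, 2, 30.0), (1, 1, 15.0), (2, 1, 60.0);
""")
# Typical star-schema query: join fact to dimension, group by attribute.
rows = con.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # → [('Books', 45.0), ('Games', 60.0)]
```

A snowflake schema differs only in that the dimensions themselves are normalized into further tables; the fact-to-dimension join pattern is unchanged.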

Posted 3 days ago

Apply

8.0 - 10.0 years

8 - 12 Lacs

hyderabad

Work from Office

Job Title : Data Governance Purview Analyst. Expert knowledge of Microsoft Purview for data governance and cataloging. Expert knowledge of Azure Data Factory, Azure Synapse, and Power BI for data integration and visualization. Expert knowledge of PowerShell for automation and scripting tasks. Expert knowledge of data quality and compliance tools (e.g., Microsoft Compliance Manager). Knowledge of Azure DevOps or other project management tools for tracking implementation progress. Roles And Responsibility : - Using Purview, perform data discovery, identifying and cataloging data assets. - Implement and configure data classification rules to automatically classify sensitive data, ensuring compliance with regulations. - Ensure the automatic tagging and categorization of data to create a comprehensive inventory. Required Skills And Qualifications : - Bachelor's degree in Computer Science, Information Technology, or a related field. - Experience with Azure Purview or similar data governance tools. - Strong understanding of data governance principles, data management, and regulatory requirements. - Proficiency in data cataloging, metadata management, and data lineage documentation. - Excellent analytical and problem-solving skills. - Effective communication skills for collaboration with cross-functional teams. Preferred Qualifications : - Certification in Azure Purview or other data governance tools. - Prior experience in implementing data governance frameworks for enterprise clients. - Familiarity with cloud platforms like Azure or AWS. - Knowledge of data privacy regulations such as GDPR or CCPA.
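Rule-based classification of the kind described above (scan sampled values, tag columns that look sensitive) can be prototyped with regular expressions. This is a hypothetical sketch, not Purview's API; the rule set, threshold, and sample data are all invented:

```python
import re

# Illustrative classification rules; real platforms ship far richer patterns.
RULES = {
    "email":    re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "pan_card": re.compile(r"[A-Z]{5}[0-9]{4}[A-Z]"),
}

def classify_column(values, min_hit_rate=0.5):
    """Tag a column with every rule matching at least half of its samples."""
    tags = []
    for label, pattern in RULES.items():
        hits = sum(1 for v in values if pattern.fullmatch(str(v)))
        if values and hits / len(values) >= min_hit_rate:
            tags.append(label)
    return tags

print(classify_column(["a@x.com", "b@y.org", "oops"]))  # → ['email']
print(classify_column(["ABCDE1234F", "PQRSX5678Z"]))    # → ['pan_card']
```

The hit-rate threshold matters: classifying on a single match would tag columns on stray values, while requiring 100% would miss columns with a few dirty rows.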

Posted 3 days ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

hyderabad

Work from Office

Key Responsibilities : - Minimum 3 years of experience in Snowflake. - Design, develop, and maintain integration workflows using Informatica CAI/CDI within IDMC. - Configure and manage Informatica MDM for data consolidation, cleansing, and governance. - Translate business and technical requirements into robust, scalable integration solutions. - Collaborate with analysts, architects, and business stakeholders to ensure delivery meets expectations. - Monitor and troubleshoot integration processes and performance. - Support deployment activities and CI/CD processes. - Maintain documentation of integration designs and data flows. Essential Skills & Experience : - 4+ years of experience in Informatica development (Cloud and/or On-Prem). - Strong experience with CAI (Cloud Application Integration) and CDI (Cloud Data Integration). - Experience configuring and supporting Informatica MDM including data models, match/merge rules, and hierarchies. - Solid understanding of REST/SOAP APIs, event-driven architecture, and message queues. - Hands-on experience with the IDMC platform and cloud-native integration patterns. - Proficient in SQL and data manipulation techniques. - Experience in data governance, data quality, and master data best practices. - Experience with CI/CD pipelines for Informatica using Git, Jenkins, or similar tools. - Knowledge of cloud platforms (Azure, AWS, or GCP). - Exposure to data warehousing, data lakes, or real-time integration. - Familiarity with Agile/Scrum delivery methods.
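Match/merge rules like those mentioned above reduce to two steps: group candidate records on a normalized match key, then apply a survivorship rule to pick the winning value per attribute. A simplified sketch (matching on normalized email and surviving the most recently updated record are illustrative choices, not Informatica MDM's actual rule engine):

```python
from collections import defaultdict

records = [
    {"email": "A@X.com ", "name": "A. Smith",    "updated": "2024-01-01"},
    {"email": "a@x.com",  "name": "Alice Smith", "updated": "2025-03-01"},
    {"email": "b@y.com",  "name": "Bob",         "updated": "2024-06-01"},
]

def golden_records(records):
    """Group on a normalized match key, keep the most recent values."""
    groups = defaultdict(list)
    for r in records:
        groups[r["email"].strip().lower()].append(r)
    merged = []
    for key, grp in sorted(groups.items()):
        best = max(grp, key=lambda r: r["updated"])  # survivorship rule
        merged.append({"email": key, "name": best["name"]})
    return merged

print(golden_records(records))
# → [{'email': 'a@x.com', 'name': 'Alice Smith'}, {'email': 'b@y.com', 'name': 'Bob'}]
```

Production MDM adds fuzzy matching, per-attribute trust scores, and stewardship queues for ambiguous matches, but the group-then-survive structure is the core.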

Posted 3 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

noida

Work from Office

Employment Type : Contract (Remote). Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.

Posted 3 days ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

noida

Work from Office

Role Responsibilities : - Develop and design comprehensive Power BI reports and dashboards. - Collaborate with stakeholders to understand reporting needs and translate them into functional requirements. - Create visually appealing interfaces using Figma for enhanced user experience. - Utilize SQL for data extraction and manipulation to support reporting requirements. - Implement DAX measures to ensure accurate data calculations. - Conduct data analysis to derive actionable insights and facilitate decision-making. - Perform user acceptance testing (UAT) to validate report performance and functionality. - Provide training and support for end-users on dashboards and reporting tools. - Monitor and enhance the performance of existing reports on an ongoing basis. - Work closely with cross-functional teams to align project objectives with business goals. - Maintain comprehensive documentation for all reporting activities and processes. - Stay updated on industry trends and best practices related to data visualization and analytics. - Ensure compliance with data governance and security standards. - Participate in regular team meetings to discuss project progress and share insights. - Assist in the development of training materials for internal stakeholders. Qualifications - Minimum 8 years of experience in Power BI and Figma. - Strong proficiency in SQL and database management. - Extensive knowledge of data visualization best practices. - Expertise in DAX for creating advanced calculations. - Proven experience in designing user interfaces with Figma. - Excellent analytical and problem-solving skills. - Ability to communicate complex data insights to non-technical stakeholders. - Strong attention to detail and commitment to quality. - Experience with business analytics and reporting tools. - Familiarity with data governance and compliance regulations. - Ability to work independently and as part of a team in a remote setting. 
- Strong time management skills and ability to prioritize tasks. - Ability to adapt to fast-paced working environments. - Strong interpersonal skills and stakeholder engagement capability. - Relevant certifications in Power BI or data analytics are a plus.

Posted 3 days ago

Apply