
455 Metadata Management Jobs - Page 18

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

9.0 - 14.0 years

11 - 16 Lacs

Bengaluru

Work from Office

About the Role:
We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities:
- Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting.
- Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink (see the sketch after this listing).
- Lead improvements in the data processing and serving layers, leveraging Databricks Spark, Trino, and Superset; maintain a good understanding of open table formats such as Delta and Iceberg.
- Scale data quality frameworks to ensure data accuracy and reliability.
- Build data lineage tracking solutions for governance, access control, and compliance.
- Collaborate with engineering, analytics, and business teams to identify opportunities and build or enhance self-serve data platforms.
- Improve system stability, monitoring, and observability to ensure high availability of the platform.
- Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack.
- Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
- 9+ years of experience building large-scale data platforms.
- Expertise in big data architectures using Databricks, Trino, and Debezium.
- Strong experience with streaming platforms, including Confluent Kafka.
- Experience in data ingestion, storage, processing, and serving in a cloud-based environment.
- Hands-on experience implementing data quality checks using Great Expectations.
- Deep understanding of data lineage, metadata management, and governance practices.
- Strong knowledge of query optimization, cost efficiency, and scaling architectures.
- Familiarity with OSS contributions and industry trends in data engineering.

Soft Skills:
- Strong analytical and problem-solving skills with a pragmatic approach to technical challenges.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to lead large-scale projects in a fast-paced, dynamic environment.
- Passion for continuous learning, open-source collaboration, and building best-in-class data products.
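
As an illustration of the streaming ingestion stack this posting names (Debezium CDC events flowing through Kafka into Databricks Spark and Delta), here is a minimal sketch. It is not part of the posting: the topic name, broker address, and storage paths are hypothetical, and it assumes a Databricks-style runtime where the Kafka source and Delta Lake are available.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdc-ingest").getOrCreate()

# Read raw CDC events (Debezium-style payloads) from a Kafka topic.
# Broker address and topic name are hypothetical placeholders.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker-1:9092")
       .option("subscribe", "orders.cdc")
       .option("startingOffsets", "latest")
       .load())

# Keep key/value as strings; downstream jobs parse the Debezium envelope.
events = raw.selectExpr("CAST(key AS STRING) AS key",
                        "CAST(value AS STRING) AS value",
                        "timestamp")

# Land events in a bronze Delta table; the checkpoint gives restartable,
# exactly-once stream progress tracking.
(events.writeStream
 .format("delta")
 .option("checkpointLocation", "/mnt/checkpoints/orders_cdc")
 .outputMode("append")
 .start("/mnt/bronze/orders_cdc"))
```

A production pipeline would additionally parse the CDC envelope and merge changes into cleansed tables downstream.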

Posted Date not available

Apply

7.0 - 11.0 years

35 - 55 Lacs

Chennai

Work from Office

Duties and Responsibilities:
- Collaborate with business, technology, and strategic stakeholders to establish a data architecture aligned to the overarching business data strategy and program goals.
- Design conceptual, logical, and physical data models to support transactional and analytics applications.
- Lead the architectural design and implementation of scalable, secure, and high-performance data platforms using AWS services (see the sketch after this listing).
- Collaborate closely with the data engineering team to understand their requirements and provide architectural guidance and solutions.
- Develop and maintain architectural standards, best practices, and guidelines for data infrastructure.
- Ensure the integration of various data sources and systems into a cohesive data platform.
- Optimize data workflows and pipelines for performance, reliability, and scalability.
- Provide technical leadership and mentorship to the data engineering team.
- Stay current with the latest trends and technologies in cloud computing, data engineering, and AWS and Azure services.
- Conduct regular reviews and audits of the data platform to ensure compliance with security and governance policies.
- Work with stakeholders to define and implement data governance and data management strategies.

Skills:
- Experience in data modeling, master data management, metadata management, data engineering, and data quality metrics.
- Experience with AWS services, including AWS Glue, Amazon Redshift, Amazon S3, AWS Lambda, and Amazon RDS.
- Strong understanding of cloud architecture principles and best practices.
- Proficiency in programming languages such as Python, Java, or Scala.
- Deep knowledge of data modeling, ETL processes, and data warehousing concepts.
- Experience with CI/CD pipelines and version control systems like Git.
- Excellent problem-solving skills and attention to detail.
- Strong communication and leadership skills.
- Ability to work collaboratively with cross-functional teams and stakeholders.
- Knowledge of security and compliance standards related to cloud data platforms.
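
The posting centers on AWS Glue, Redshift, S3, and Lambda. The following minimal Glue job skeleton (not from the posting; the catalog database, table, and bucket names are hypothetical) shows the shape of a catalog-to-S3 ETL step on that stack; it only runs inside a Glue job environment.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve the job name and build contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (names are hypothetical).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"
)

# Write the frame back to S3 as Parquet for downstream consumers.
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()
```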

Posted Date not available

Apply

3.0 - 6.0 years

10 - 14 Lacs

Mumbai

Work from Office

Responsibilities:
- Collaborate with stakeholders to gather requirements and design data quality, data governance, and master data management solutions using Ataccama.
- Design and implement data matching and deduplication strategies using Ataccama Data Matching.
- Develop and maintain data quality rules, data profiling, and data cleansing processes within Ataccama Data Quality Center.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- BE / B.Tech in any stream, or M.Sc. (Computer Science/IT) / M.C.A., with a minimum of 3-5 years of experience.
- Experience in the optimization of Ataccama data management solutions.
- Develop and maintain data quality rules, data profiling, and data cleansing processes within Ataccama Data Quality Center.
- Design and implement data matching and deduplication strategies using Ataccama Data Matching.

Preferred technical and professional experience:
- Utilize Ataccama Data Catalog for metadata management, data lineage tracking, and data discovery.
- Provide expertise in integrating Ataccama with other data management tools and platforms within the organization's ecosystem.
- Collaborate with stakeholders to gather requirements and design data quality, data governance, and master data management solutions using Ataccama.

Posted Date not available

Apply

5.0 - 25.0 years

7 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Roles and Responsibilities:
- Design, develop, and maintain metadata management solutions to ensure data quality and governance across the organization.
- Collaborate with cross-functional teams to identify business requirements and implement metadata standards that meet those needs.
- Develop and execute testing plans to validate metadata accuracy, completeness, and consistency.
- Provide training and support to end users on metadata tools and applications.

Job Requirements:
- 5-25 years of experience in metadata management or a related field (data governance, data quality).
- Strong understanding of data governance principles, including policies, procedures, and best practices.
- Proficiency in developing metadata models, schemas, and frameworks for various industries/domains.

Posted Date not available

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM.
- Strong understanding of data integration and data quality processes.
- Experience with data modeling and metadata management.
- Familiarity with ETL processes and data warehousing concepts.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted Date not available

Apply

8.0 - 13.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Skills:
- Extensive experience with Google data products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.).
- Expertise in Cloud Data Fusion, BigQuery, and Dataproc (see the sketch after this listing).
- Experience with MDM, metadata management, data quality, and data lineage tools.
- End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
- Experience with SQL and NoSQL modern data stores.
- End-to-end solution design skills: prototyping, usability testing, and data visualization literacy.
- Excellent knowledge of the software development life cycle.
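
Since the role pairs BigQuery expertise with data quality, here is a minimal sketch of a data-profiling query run through the official google-cloud-bigquery client. The project, dataset, table, and column names are hypothetical and not from the posting.

```python
from google.cloud import bigquery

# Uses application-default credentials; see Google Cloud auth docs.
client = bigquery.Client()

# A simple profile: row count, null count, and cardinality of a key column.
# COUNTIF is standard BigQuery SQL; all names below are placeholders.
profile_sql = """
SELECT
  COUNT(*)                      AS row_count,
  COUNTIF(customer_id IS NULL)  AS null_customer_ids,
  COUNT(DISTINCT customer_id)   AS distinct_customer_ids
FROM `example_project.sales.orders`
"""

for row in client.query(profile_sql).result():
    print(dict(row))
```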

Posted Date not available

Apply

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Development of workflows and connectors for the Collibra platform; administration and configuration of the Collibra platform.

Duties:
- Collibra DGC administration and configuration.
- Collibra Connect administration and configuration.
- Development of Collibra workflows and MuleSoft connectors.
- Ingesting metadata from external sources into Collibra.
- Installation, upgrading, and administration of Collibra components.
- Setup, support, deployment, and migration of Collibra components.
- Implement application changes: review and deploy code packages, perform post-implementation verifications.
- Participate in group meetings (including with business partners) for problem solving, decision making, and implementation planning.

Senior Collibra Developer - Mandatory Skills:
- Collibra Connect
- Collibra DGC
- Java
- Advanced hands-on working knowledge of Unix/Linux
- Advanced hands-on experience with UNIX scripting
- SQL Server
- Groovy

Posted Date not available

Apply

7.0 - 12.0 years

11 - 21 Lacs

Pune

Work from Office

Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes.

Medtronic is hiring a Senior Data Governance Engineer. As a senior engineer, you will be a key member of our Data Governance team, defining scope, assessing current data governance capabilities, and building roadmaps to mature those capabilities. This role offers a dynamic opportunity to join Medtronic's Diabetes business. Medtronic has announced its intention to separate the Diabetes division to promote future growth and innovation within the business and reallocate investments and resources across Medtronic, subject to applicable information and consultation requirements. While you will start your employment with Medtronic, upon establishment of SpinCo or the transition of the Diabetes business to another company, your employment may transfer to either SpinCo or the other company, at Medtronic's discretion and subject to any applicable information and consultation requirements in your jurisdiction.

Responsibilities may include the following, and other duties may be assigned:
- Data governance strategy development: responsible for the strategic development, architectural definition, and enterprise-wide data governance and data management initiatives supporting the delivery of data as a service.
- Provide data governance and data management advisory expertise.
- Responsible for the identification and evaluation of metadata platform alternatives, the development of the logical metadata framework architecture, and the definition and implementation of metadata maintenance processes.
- Define data governance operating models, roles, and responsibilities in collaboration with Medtronic requirements.
- Responsible for the assessment and selection of the data management tools landscape, including data profiling, data quality, metadata, MDM, rules engines, and pattern matching.
- Refine and develop the data-centric approach and methodologies, which may include data strategy, data governance, data lineage, analytics, business intelligence, data architecture, data quality, master data management, and data integration and delivery.
- Assess current-state capabilities, identify gaps versus leading practices, and recommend future-state data requirements.
- Identify opportunities for new workflows, third-party glossary/metadata tools, database architectures, and third-party pre-existing data models.
- Work closely with business and technology stakeholders to understand and document data requirements and lineage, partnering with IT data integration and data warehouse teams to ensure requirements are effectively executed.
- Help define data modeling naming standards, abbreviations, guidelines, and best practices.
- Enhance or design data model review processes based on business requirements.

Required Knowledge and Experience:
- At least 5 years of experience developing and structuring an enterprise-wide data governance organization and business process (operating models, roles, partner organizations, responsibilities).
- Hands-on experience, on both the business and IT sides, implementing or supporting MDM and/or data warehouse and reporting IT solutions.
- Strong business knowledge of the investment management industry and common data management operations.
- Broad understanding of the role of data management within global markets organizations, information flow, and data governance issues.
- Domain expertise in specific areas of data management, such as data strategy, data governance, data lineage, analytics, business intelligence, data architecture, data quality, master data management, and data integration and delivery.

Posted Date not available

Apply

5.0 - 8.0 years

10 - 14 Lacs

Navi Mumbai

Work from Office

About the Role:
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the needs of stakeholders effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM.
- Strong understanding of data integration and data quality processes.
- Experience with data modeling and metadata management.
- Familiarity with ETL processes and data warehousing concepts.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica MDM.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted Date not available

Apply

10.0 - 15.0 years

25 - 30 Lacs

Bangalore Rural, Chennai, Bengaluru

Work from Office

Data Governance Architect: Alation; Azure cloud platforms; Snowflake platform metadata management; data cataloging, data lineage, or data quality; DAMA; DCAM; practical implementation; maturity assessments and gap analyses; Collibra; Purview.

Posted Date not available

Apply

8.0 - 10.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: ETL ODI Developer (Night Shift)
Shift: 9:00 PM to 6:00 AM IST (Night Shift)
Location: Remote (open to candidates outside Andhra Pradesh): Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Key Responsibilities:
- Design and develop ETL/ELT workflows using Oracle Data Integrator (ODI).
- Build stored procedures and data warehouse schemas (fact and dimension tables).
- Collaborate with DBAs on SQL script execution and schema deployment.
- Develop Python scripts to automate API-based data ingestion (see the sketch after this listing).
- Analyze data sources and align models with the BRD.
- Implement data validation, profiling, and data quality monitoring practices.
- Apply best practices for metadata management, query optimization, and performance tuning.
- Work closely with stakeholders to communicate technical solutions effectively.

Required Skills:
- 5+ years of hands-on experience with ODI
- Strong SQL, data modeling, and ETL/ELT concepts
- Experience with data warehousing and schema design
- Proficiency in Python for API integrations
- Excellent communication, problem-solving, and analytical abilities

How to Apply: Send your updated resume and include your current CTC, expected CTC, notice period, current location, and confirmation of night shift and remote work preference.
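
For the "Python scripts to automate API-based data ingestion" responsibility, a minimal sketch of a paginated API pull follows. The endpoint URL, auth token, and page-based pagination scheme are hypothetical; a real source API would dictate its own scheme.

```python
import requests


def fetch_pages(url, token, page_size=500):
    """Yield records from a paginated JSON API, one page at a time."""
    page = 1
    while True:
        resp = requests.get(
            url,
            headers={"Authorization": f"Bearer {token}"},
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()  # fail loudly on HTTP errors
        batch = resp.json()
        if not batch:             # empty page signals the end of the data
            break
        yield from batch
        page += 1


if __name__ == "__main__":
    # Placeholder endpoint and token; in practice, records would be staged
    # to files or loaded into warehouse staging tables for ODI to pick up.
    for record in fetch_pages("https://api.example.com/v1/orders", token="..."):
        print(record)
```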

Posted Date not available

Apply


8.0 - 13.0 years

30 - 45 Lacs

pune, chennai, bengaluru

Work from Office

Creating the overall structure and framework for how data is managed, including databases, data warehouses, and data lakes. Defining the logical structure of data, including entities and attributes, to ensure data is organized effectively.

Required Candidate Profile: Strong understanding of database technologies (SQL, NoSQL), data modeling, data warehousing, and cloud platforms. Ability to resolve issues related to data infrastructure and data management.

Posted Date not available

Apply


8.0 - 12.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Responsibilities:
- Be responsible for the development of conceptual, logical, and physical data models, and for the implementation of RDBMS, operational data store (ODS), data mart, and data lake solutions on target platforms.
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational and dimensional required; NoSQL optional) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual, logical, and physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC work.
- Work proactively and independently to address project requirements, and articulate issues and challenges to reduce project delivery risks.
- Must have a payments background.

Skills:
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols); see the sketch after this listing.
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others).
- Experience in team management, communication, and presentation.
- Experience with Erwin, Visio, or any other relevant tool.
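
To make the fact/dimension vocabulary concrete, here is a minimal star-schema sketch: one fact table keyed to two dimensions, with a payments flavor to match the posting. The table and column names are hypothetical, and the stdlib sqlite3 driver is used purely so the DDL is runnable anywhere; a real engagement would target the warehouse RDBMS.

```python
import sqlite3

# One fact table (payments) surrounded by dimension tables (customer, date):
# the classic star-schema shape referenced in the posting.
ddl = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    segment       TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240131
    full_date TEXT,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE fact_payments (
    payment_id   INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       NUMERIC,
    currency     TEXT
);
"""

with sqlite3.connect(":memory:") as conn:
    conn.executescript(ddl)
    print("star schema created")
```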

Posted Date not available

Apply

4.0 - 9.0 years

9 - 14 Lacs

Hyderabad, India

Work from Office

About this role: Wells Fargo is seeking a Senior Data Management Analyst.

In this role, you will:
- Lead or participate in moderately complex programs and initiatives for data quality, governance, and metadata activities.
- Design and conduct moderately complex analyses to identify and remediate data quality, data integrity, process, and control gaps.
- Analyze, assess, and test data controls and data systems to ensure quality and risk compliance standards are met, adhering to data governance standards and procedures.
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality.
- Develop recommendations for optimal approaches to resolve data quality issues, and implement plans for assessing the quality of new data sources, leveraging domain expertise and data, business, or process analysis to inform and support solution design.
- Lead project teams and mentor less experienced staff members.
- Drive planning and coordination of moderately complex remediation efforts, acting as the central point of contact.
- Consult with clients to assess the current state of data and metadata quality within the area of assigned responsibility.
- Participate in cross-functional groups to develop companywide data governance strategies.
- Provide input into communication routines with stakeholders, business partners, and experienced leaders.

Required Qualifications:
- 4+ years of data management, business analysis, analytics, or project management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Design and monitor data governance, data quality, and metadata policies, standards, tools, processes, or procedures to ensure data control and remediation for companywide data management functions.
- Work with business and technology partners or subject matter professionals to document or maintain business or technical metadata about systems, business or data elements, or data-related controls.
- Participate in analysis to identify and remediate data quality or integrity issues and to identify and remediate process or control gaps.
- Adhere to data governance standards and procedures; identify data quality metrics and execute data quality audits to benchmark the state of data quality.
- Support communications with basic documentation related to requirements, design decisions, issue closure, or remediation updates.
- Support issue remediation by performing medium-risk data profiling, data or business analysis, and data mapping as part of root cause or impact analysis.
- Provide support for regulatory analysis and reporting requirements.
- Recommend plans for the development and implementation of initiatives that assess the quality of new data sources.
- Consult with clients to assess the current state of data quality within the area of assigned responsibility.
- 4+ years of experience in data governance, metadata management, data management, data lineage, enterprise data quality checks, data source dictionaries, business glossaries, use-case data dictionaries, and basic SQL.
- Understanding of the commercial banking domain, BSA/AML programs, FR Y-14, etc.
- Good communication skills for working with business and technology partners and subject matter professionals.

Posted Date not available

Apply

6.0 - 9.0 years

14 - 19 Lacs

Bengaluru

Remote

We are looking for a skilled and experienced Microsoft Fabric Data Engineer Lead with strong experience in designing, implementing, and managing data engineering solutions using Microsoft Fabric, a unified platform for data integration, analytics, and governance.

Role Overview:
- Architect scalable data solutions using Fabric components such as Lakehouses, Pipelines, and Dataflows.
- Lead data ingestion and transformation using tools such as Azure Data Factory, PySpark, SQL, and Kusto Query Language (KQL).
- Ensure data quality and governance across the Bronze (raw), Silver (cleansed), and Gold (curated) layers (see the sketch after this listing).
- Collaborate with cross-functional teams, including analytics engineers, architects, and business stakeholders.
- Optimize the performance and scalability of the data platform through monitoring and tuning.
- Mentor junior engineers and promote best practices in data engineering.
- Design and implement data models and schemas to support business requirements; ensure data accuracy, consistency, and reliability; optimize data structures for performance and scalability.
- Good experience with Azure Purview, integrating data pipelines with Purview for data lineage; experience with Microsoft Fabric, including migrating applications from Azure Data Factory and Azure Synapse to Microsoft Fabric.
- Implement robust data security measures, including encryption, access control, and data masking; ensure compliance with data privacy regulations and company policies.
- Monitor and optimize data pipelines and queries for performance and efficiency; implement caching, partitioning, and indexing strategies; troubleshoot and resolve performance issues.
- Establish and enforce data quality standards; implement data governance practices, metadata management, and data lineage tracking; ensure data quality through validation and cleansing processes.
- Collaborate with business stakeholders to understand data requirements and deliver on them.
- Create and maintain comprehensive technical documentation, including system architecture, design documents, and deployment procedures; ensure knowledge sharing within the team.
- Implement Lean daily management and Lean continuous-improvement concepts in application development and operations.

Required Qualifications:
- Bachelor's or master's degree in computer science, information technology, or a related field.
- Proven experience in data integration and orchestration using Microsoft Fabric tools; cloud platforms, especially Azure (Data Factory, Synapse, Power BI); and ETL/ELT pipelines, lakehouse architecture, and real-time analytics.
- Strong foundation in SQL, PySpark, and Kusto Query Language (KQL).
- Excellent problem-solving skills and attention to detail; strong communication and leadership skills.
- Relevant certifications (Microsoft Certified: Fabric Data Engineer Associate) are a plus.

Preferred Qualifications:
- Effective communicator: able to articulate complex technical concepts clearly and concisely, facilitating understanding across a wide range of audiences.
- Solution-oriented mindset: demonstrated ability to approach challenges with a solutions-first attitude, proactively identifying and addressing potential obstacles.
- Change advocate: capable of spearheading and implementing change, showcasing a proactive approach to driving improvements and leading teams toward new initiatives.
- Leadership qualities: strong capacity to lead, influence, and guide teams, fostering a collaborative and productive work environment.
- Technical translator: excels at translating and breaking down intricate technical topics, ensuring that stakeholders at all levels have a clear understanding of the subject matter.
- Analytical thinker: analytical and detail-oriented mindset, critically evaluating information derived from multiple sources and drawing meaningful conclusions.
- Team player: strong willingness to collaborate and work cohesively with colleagues, understanding the importance of collective growth and knowledge sharing.
- Adaptability: resilience and adaptability in an ever-changing environment, staying current with the latest technological advancements and best practices in the field of database administration.
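
As a concrete instance of the Bronze-to-Silver cleansing step the posting describes, here is a minimal PySpark sketch. The table paths and column names are hypothetical, and it assumes a Fabric or Databricks runtime where Delta Lake is available.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Read the raw (Bronze) Delta table; the relative "Tables/" path is the
# Fabric lakehouse convention and is a placeholder here.
bronze = spark.read.format("delta").load("Tables/bronze_orders")

# Silver layer: deduplicate, drop records missing the business key, and
# enforce proper types on timestamp and amount columns.
silver = (bronze
          .dropDuplicates(["order_id"])
          .filter(F.col("order_id").isNotNull())
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("amount", F.col("amount").cast("decimal(18,2)")))

# Persist the cleansed table for Gold-layer aggregation and reporting.
silver.write.format("delta").mode("overwrite").save("Tables/silver_orders")
```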

Posted Date not available

Apply

5.0 - 7.0 years

5 - 8 Lacs

Pune

Work from Office

Job Summary:
Cummins is seeking a skilled Data Engineer to support the development, maintenance, and optimization of our enterprise data and analytics platform. This role involves hands-on experience in software development, ETL processes, and data warehousing, with strong exposure to tools like Snowflake, OBIEE, and Power BI. The engineer will collaborate with cross-functional teams, transforming data into actionable insights that enable business agility and scale. Please note: while the role is categorized as remote, it will follow a hybrid work model based out of our Pune office.

Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools (see the sketch after this listing).
- Build and automate data integration workflows that extract, transform, and load data from various sources, including Oracle EBS and other enterprise systems.
- Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods.
- Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders.
- Work with IT and business teams to gather reporting requirements and translate them into scalable technical solutions.
- Participate in data modeling and storage architecture using star and snowflake schema designs.
- Contribute to the implementation of data governance, metadata management, and access control mechanisms.
- Maintain documentation for solutions and participate in testing and validation activities.
- Support data migration and replication using tools such as Qlik Replicate, and contribute to cloud-based data architecture.
- Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes.

Why Join Cummins?
- Opportunity to work with a global leader in power solutions and digital transformation.
- Be part of a collaborative and inclusive team culture.
- Access to cutting-edge data platforms and tools.
- Exposure to enterprise-scale data challenges and finance domain expertise.
- Drive impact through data innovation and process improvement.

Competencies:
- Data extraction and transformation: perform ETL activities from varied sources with high data accuracy.
- Programming: write and test efficient code using industry standards and version control systems.
- Data quality management: detect and correct data issues for better decision-making.
- Solution documentation: clearly document processes, models, and code for reuse and collaboration.
- Solution validation: test and validate changes or solutions based on customer requirements.
- Problem solving: address technical challenges systematically to ensure effective resolution and prevention.
- Customer focus: understand business requirements and deliver user-centric data solutions.
- Communication and collaboration: work effectively across teams to meet shared goals.
- Values differences: promote inclusion by valuing diverse perspectives and backgrounds.

Education, Licenses, Certifications:
- Bachelor's or master's degree in computer science, information systems, data engineering, or a related technical discipline.
- Certifications in data engineering or relevant tools (Snowflake, Power BI, etc.) are a plus.

Experience (must-have skills):
- 5-7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment.
- Proficiency with ETL tools, SQL, and data warehouse development.
- Proficiency with Snowflake, Power BI, and OBIEE reporting platforms, including implementation experience with these tools and technologies.
- Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases.
- Working knowledge of Oracle databases and Oracle EBS structures.

Preferred Skills:
- Experience with Qlik Replicate, data replication, or data migration tools.
- Familiarity with data governance, data quality frameworks, and metadata management.
- Exposure to cloud-based architectures, big data platforms (e.g., Spark, Hive, Kafka), and distributed storage systems (e.g., HBase, MongoDB).
- Understanding of agile methodologies (Scrum, Kanban) and DevOps practices for continuous delivery and improvement.
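
For the Snowflake ETL work described above, here is a minimal sketch using the official Snowflake Python connector to stage and load a local CSV. The account, credentials, warehouse, and table names are hypothetical placeholders.

```python
import snowflake.connector

# Connection parameters are placeholders; real jobs would pull these
# from a secrets manager rather than hardcoding them.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

with conn.cursor() as cur:
    # Create a staging table, upload the file to its table stage with PUT,
    # then COPY the staged file into the table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders "
        "(order_id INT, amount NUMBER(18,2))"
    )
    cur.execute("PUT file:///tmp/orders.csv @%stg_orders")
    cur.execute(
        "COPY INTO stg_orders FROM @%stg_orders "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )

conn.close()
```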

Posted Date not available

Apply

3.0 - 6.0 years

0 - 0 Lacs

Vadodara

Work from Office

JD for Document Controller
Experience: 3-6 years
Location: Vadodara
Position: Document Controller
Qualification: Diploma/BE (Instrumentation/Electrical/Mechanical Engineering)
Skills:
1. Control and update project documents.
2. Maintain the Document Distribution Matrix (DDM).
3. Coordinate project deliverables flow; report on deliverables progress and issues.
4. Enforce document management procedures.
5. Validate Master Document Registers (MDRs).
6. Perform quality checks on contractor deliverables.
7. Maintain the company MDR.
8. Assign document numbers and metadata.
9. Participate in project meetings.
10. Assist with project deliverables handover.
Proficiency with Aconex, SharePoint, Documentum, and EDMS is required.

Posted Date not available

Apply

5.0 - 7.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Summary:
We are seeking a highly skilled and experienced Data Modeler to join our data engineering team. The ideal candidate will bring deep expertise in designing scalable and efficient data models for cloud platforms, particularly with a strong background in Oracle data warehousing and the Databricks Lakehouse architecture. You will play a critical role in our strategic migration from an on-prem Oracle data warehouse to a modern cloud-based Databricks platform.

Key Responsibilities:
- Design and implement conceptual, logical, and physical data models that support business requirements and analytics needs.
- Lead the migration of data models from an on-prem Oracle data warehouse to Databricks on AWS or Azure (see the sketch after this listing).
- Reverse-engineer and analyze complex Oracle schemas, PL/SQL code, and existing structures to build equivalent or improved models in the cloud.
- Collaborate with data architects and engineers to define optimal data structures in Databricks, leveraging Delta Lake and Spark.
- Optimize data models for performance, scalability, and cost-efficiency in a cloud-native environment.
- Develop and maintain dimensional models using star and snowflake schemas to support BI and reporting tools.
- Ensure data governance standards are met by incorporating metadata management, data lineage, and documentation practices.
- Participate in data architecture reviews and provide input on best practices in modeling and data pipeline integration.
- Work closely with cross-functional teams to understand business needs and translate them into effective data solutions.

Required Skills & Experience:
- 5+ years of hands-on experience in data modeling, including conceptual, logical, and physical design.
- Proven experience migrating large-scale Oracle DWH environments to the Databricks Lakehouse or similar platforms.
- Strong expertise in Oracle database schemas, PL/SQL, and performance tuning.
- Proficiency in Databricks, Delta Lake, Spark SQL, and DataFrame APIs.
- Experience designing models optimized for cloud platforms (preferably AWS or Azure).
- Deep knowledge of dimensional modeling techniques (star/snowflake).
- Familiarity with tools and practices for metadata management, data lineage, and governance.
- Strong analytical and communication skills, with the ability to work collaboratively in Agile teams.
- Ability to document and communicate data model designs to both technical and non-technical stakeholders.
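
One building block of the Oracle-to-Databricks migration this posting describes is pulling Oracle tables into Delta via Spark's JDBC reader. A minimal sketch follows; the JDBC URL, credentials, and table names are hypothetical, and it assumes the Oracle JDBC driver is on the cluster classpath.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-to-delta").getOrCreate()

# Read a source table from Oracle over JDBC; connection details are
# placeholders, and fetchsize reduces round trips on large tables.
orders = (spark.read.format("jdbc")
          .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1")
          .option("dbtable", "SALES.ORDERS")
          .option("user", "etl_user")
          .option("password", "...")
          .option("fetchsize", "10000")
          .load())

# Land the table as Delta in a bronze schema (assumed to already exist),
# from which the cloud-side dimensional models are built.
orders.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")
```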

Posted Date not available

Apply

4.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Shift timing: 1:00 PM to 10:00 PM
Main locations: Bangalore and Chennai
Employment Type: Full Time
Education Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 3 years of relevant experience.

Position Description:
We are looking for an experienced data management resource to join our team. The ideal candidate should have experience in managing, configuring, and supporting Collibra or similar data governance tools.

Your future duties and responsibilities:
- Generate and maintain reports to track data governance and quality metrics.
- Provide insights on platform usage, compliance, and governance effectiveness to stakeholders.
- Collaboration with stakeholders: collaborate with data stewards, data owners, business users, and other key stakeholders to understand their requirements and ensure that Collibra is configured to meet business needs.
- Assist in the training and onboarding of new users to ensure effective use of the platform.

Required qualifications to be successful in this role:
Must-have skills:
- Minimum of 4+ years of experience in managing, configuring, and supporting Collibra or similar data governance tools.
- Experience in data governance, data management, and metadata management.
- Strong understanding of data governance frameworks, data quality principles, and metadata management.

Certifications (preferred but not required):
- Collibra Certified Professional or similar certifications.
- ITIL Foundation or other relevant IT service management certifications.

Skills: AutoSys, Python, Unix, Java

Posted Date not available

Apply

8.0 - 10.0 years

10 - 15 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: ETL ODI Developer (Night Shift)
Shift: 9:00 PM to 6:00 AM IST (Night Shift)
Location: Remote (open to candidates outside Andhra Pradesh): Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Key Responsibilities:
- Design and develop ETL/ELT workflows using Oracle Data Integrator (ODI).
- Build stored procedures and data warehouse schemas (fact and dimension tables).
- Collaborate with DBAs on SQL script execution and schema deployment.
- Develop Python scripts to automate API-based data ingestion.
- Analyze data sources and align models with the BRD.
- Implement data validation, profiling, and data quality monitoring practices.
- Apply best practices for metadata management, query optimization, and performance tuning.
- Work closely with stakeholders to communicate technical solutions effectively.

Required Skills:
- 5+ years of hands-on experience with ODI
- Strong SQL, data modeling, and ETL/ELT concepts
- Experience with data warehousing and schema design
- Proficiency in Python for API integrations
- Excellent communication, problem-solving, and analytical abilities

How to Apply: Send your updated resume to: [Insert Email]. Please include your current CTC, expected CTC, notice period, current location, and confirmation of night shift and remote work preference.

Posted Date not available

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the team in implementing innovative solutions.
- Ensure adherence to project timelines and quality standards.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM.
- Strong understanding of data integration and data quality management.
- Experience in designing and implementing MDM solutions.
- Knowledge of data modeling and metadata management.
- Hands-on experience with Informatica PowerCenter.
- Good-to-have skills: experience with Informatica Data Quality.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted Date not available

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Collibra Data Quality & Observability
Good-to-have skills: Collibra Data Governance
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain data quality dashboards and issue management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.

Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques.
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted Date not available

Apply

7.0 - 12.0 years

25 - 30 Lacs

Bengaluru

Hybrid

Collibra Data Quality, SQL, and Excel; data stewardship; stakeholder management; data privacy, security, lineage, literacy, and analysis; metadata management; Agile delivery, including user story grooming, sprint planning, development, and user acceptance.

Required Candidate Profile: Collibra Data Quality preferred. Hands-on experience in data lineage, data governance, data privacy, and security best practices, plus SQL, data analysis, and profiling tools. Should be an immediate joiner.

Posted Date not available

Apply