
1386 Data Governance Jobs - Page 19

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 10.0 years

8 - 14 Lacs

Kanpur

Remote

Naukri logo

Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.; see the sketch below).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
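To give the window-function requirement some flavor, here is a minimal, hypothetical sketch of a common Snowflake task: deduplicating a staging table with ROW_NUMBER(). The table, columns, and connection parameters are invented placeholders, not part of this posting.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; supply your own account details in practice.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="ANALYTICS_WH", database="RAW", schema="SALES",
)

# Keep only the latest record per customer: a typical window-function
# pattern for deduplicating staging data before modeling.
DEDUP_SQL = """
SELECT customer_id, email, updated_at
FROM (
    SELECT c.*,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id ORDER BY updated_at DESC
           ) AS rn
    FROM customers_staging c
)
WHERE rn = 1;
"""

cur = conn.cursor()
try:
    cur.execute(DEDUP_SQL)
    for row in cur.fetchmany(10):
        print(row)
finally:
    cur.close()
    conn.close()
```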

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Hyderabad

Remote

Naukri logo

Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

9.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Role Description:
You will play a key role in the implementation and adoption of the data governance framework that will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. The role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. You will apply domain, technical, and business-process expertise to provide exceptional support for Amgen's data governance framework, working closely with business stakeholders and data analysts to ensure its implementation and adoption. You will collaborate with the Product Owner and other Business Analysts to ensure operational support and excellence from the team.

Roles & Responsibilities:
- Responsible for implementing the data governance and data management framework for the General and Administrative (G&A) domain of the biopharma lifecycle.
- Responsible for operationalizing the enterprise data governance framework and aligning the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management.
- Works with Enterprise MDM and Reference Data teams to enforce standards and data reusability.
- Drives cross-functional alignment in his/her domain(s) of expertise to ensure adherence to data governance principles.
- Maintains documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains.
- Ensures compliance with data privacy, security, and regulatory policies for the assigned domains.
- Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric), defines the specifications shaping the development and implementation of data foundations.
- Builds strong relationships with key business leads and partners to ensure their needs are being met.

Must-Have Functional Skills:
- Technical skills and knowledge of pharma processes, with specialization in the General and Administrative (G&A) domain of the biopharma lifecycle.
- In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc.
- In-depth experience with the data product development life cycle, including enabling data dictionaries and a business glossary to increase data product reusability and data literacy.
- Customer-focused, with excellent written and verbal communication skills; able to work confidently with internal Amgen business stakeholders and external service partners on business process and technology topics.
- In-depth experience working with or supporting systems used for the data governance framework, e.g., Collibra, Alation.
- Excellent problem-solving skills and committed attention to detail in finding solutions.

Good-to-Have Functional Skills:
- Experience working with data governance councils or forums.
- Experience with Agile software development methodologies (Scrum).
- Proficiency in data analysis and quality tools (e.g., SQL, Excel, Python, or SAS; a toy data quality example follows below).

Soft Skills:
- Highly organized and able to work under minimal supervision.
- Excellent analytical and assessment skills.
- Ability to work effectively with global, virtual teams.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ambitious to further develop their skills and career.
- Ability to build business relationships and understand end-to-end data use and needs.
- Excellent interpersonal skills (team player).
- People management skills, either in a matrix or direct-line function.
- Strong verbal and written communication skills.
- High degree of initiative and self-motivation.
- Good presentation and public-speaking skills.
- Strong attention to detail, quality, time management, and customer focus.

Basic Qualifications: Any degree and 9-13 years of experience.
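As a purely illustrative sketch of the kinds of data quality rules a governance framework enforces (the table, columns, and thresholds are invented; in practice such rules would live in a tool like Collibra Data Quality):

```python
import pandas as pd

# Hypothetical extract of a supplier master table (illustrative data only).
suppliers = pd.DataFrame({
    "supplier_id": [101, 102, 103, 103, 105],
    "supplier_name": ["Acme", "Globex", None, "Initech", "Umbrella"],
    "country_code": ["US", "DE", "IN", "IN", "XX"],
})

VALID_COUNTRIES = {"US", "DE", "IN"}  # assumed reference data set

def run_dq_checks(df: pd.DataFrame) -> dict:
    """Evaluate three common rule types: completeness, uniqueness, validity."""
    return {
        # Share of rows with a populated supplier name.
        "completeness_supplier_name": df["supplier_name"].notna().mean(),
        # Business keys must be unique for a master data domain.
        "uniqueness_supplier_id": df["supplier_id"].is_unique,
        # Share of country codes found in the reference data.
        "validity_country_code": df["country_code"].isin(VALID_COUNTRIES).mean(),
    }

print(run_dq_checks(suppliers))
# e.g. {'completeness_supplier_name': 0.8, 'uniqueness_supplier_id': False, ...}
```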

Posted 2 weeks ago

Apply

5.0 - 9.0 years

7 - 17 Lacs

Pune

Work from Office

Naukri logo

Job Overview:
Diacto is looking for a highly capable Data Architect with 5 to 9 years of experience to lead cloud data platform initiatives, with a primary focus on Snowflake and Azure Data Hub. This individual will play a key role in defining the data architecture strategy, implementing robust data pipelines, and enabling enterprise-grade analytics solutions. This is an on-site role based in our Baner, Pune office.

Qualifications:
- B.E./B.Tech in Computer Science, IT, or a related discipline
- MCS/MCA or equivalent preferred

Key Responsibilities:
- Design and implement enterprise-level data architecture with a strong focus on Snowflake and Azure Data Hub
- Define standards and best practices for data ingestion, transformation, and storage
- Collaborate with cross-functional teams to develop scalable, secure, and high-performance data pipelines
- Lead Snowflake environment setup, configuration, performance tuning, and optimization
- Integrate Azure Data Services with Snowflake to support diverse business use cases
- Implement governance, metadata management, and security policies
- Mentor junior developers and data engineers on cloud data technologies and best practices

Experience and Skills Required:
- 5 to 9 years of overall experience in data architecture or data engineering roles
- Strong hands-on expertise in Snowflake, including design, development, and performance tuning
- Solid experience with Azure Data Hub and Azure Data Services (Data Lake, Synapse, etc.)
- Understanding of cloud data integration techniques and ELT/ETL frameworks
- Familiarity with data orchestration tools such as DBT, Airflow, or Azure Data Factory (see the sketch after this listing)
- Proven ability to handle structured, semi-structured, and unstructured data
- Strong analytical, problem-solving, and communication skills

Nice to Have:
- Certifications in Snowflake and/or Microsoft Azure
- Experience with CI/CD tools like GitHub for code versioning and deployment
- Familiarity with real-time or near-real-time data ingestion

Why Join Diacto Technologies?
- Work with a cutting-edge tech stack and cloud-native architectures
- Be part of a data-driven culture with opportunities for continuous learning
- Collaborate with industry experts and build transformative data solutions
- Competitive salary and benefits in a collaborative work environment in Baner, Pune

How to Apply:
Option 1 (preferred): Copy and paste the following link into your browser and submit your application for the automated interview process: https://app.candidhr.ai/app/candidate/gAAAAABoRrcIhRQqJKDXiCEfrQG8Rtsk46Etg4-K8eiwqJ_GELL6ewSC9vl4BjaTwUAHzXZTE3nOtgaiQLCso_vWzieLkoV9Nw==/
Option 2:
1. Visit our website's career section at https://www.diacto.com/careers/
2. Scroll down to the "Who are we looking for?" section
3. Find the listing for "Data Architect (Snowflake)"
4. Proceed with the virtual interview by clicking "Apply Now."
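To make the orchestration requirement concrete, a minimal, assumed sketch of an Airflow 2.x DAG that triggers dbt runs; the DAG id, schedule, and dbt project path are hypothetical, not from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: build dbt models, then run dbt tests.
with DAG(
    dag_id="snowflake_dbt_daily",       # assumed name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test --target prod",
    )

    # Tests execute only after the models build successfully.
    dbt_run >> dbt_test
```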

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Data Engineer / Data Modeler.
Location: Remote (India).
Employment Type: Contract (Remote).
Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Jaipur

Work from Office

Naukri logo

Job Title: Data Engineer / Data Modeler.
Location: Remote (India).
Employment Type: Contract (Remote).
Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Chennai

Remote

Naukri logo

Employment Type: Contract (Remote).
Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

8 - 14 Lacs

Chennai

Work from Office

Naukri logo

Business Analyst focused on designing and implementing a Data Governance strategy and Master Data Management (MDM) framework. This role will support the high-level design and detailed design phases of a transformative project involving systems such as 3DS PLM, SAP, Team Centre, and Blue Yonder. The ideal candidate will bring a blend of business analysis expertise, data governance knowledge, and automotive/manufacturing domain experience to drive workshops, map processes, and deliver actionable recommendations. Working closely with the GM of Master Data and MDM technical resources, you will play a pivotal role in aligning people, processes, and technology to achieve M&M's data governance and MDM objectives.

Key Responsibilities:
- Requirements Gathering & Workshops: Lead and facilitate workshops with business and IT stakeholders to elicit requirements, define data governance policies, and establish MDM strategies for automotive-specific data domains (e.g., parts, engineering data, bill of materials, service parts, and supplier and dealer master data).
- Process Mapping & Design: Document and design master-data-related processes, including data flows between systems such as 3DS, SAP, Talend, and Blue Yonder, ensuring alignment with business needs and technical feasibility.
- Analysis & Recommendations: Analyse existing data structures, processes, and system integrations to identify gaps and opportunities; provide clear, actionable recommendations to support the Data Governance and MDM strategy.
- Stakeholder Collaboration: Act as a bridge between business units, IT teams, and technical resources (e.g., 3DS specialists) to ensure cohesive delivery of the project objectives.
- Documentation & Communication: Create high-quality deliverables, including process maps, requirement specifications, governance frameworks, and summary reports, tailored to both technical and non-technical audiences.
- Support Detailed Design: Collaborate with the 3DS/Talend technical resource to translate high-level designs into detailed MDM solutions, ensuring consistency across people, process, and technology components.
- Project Support: Assist MDM leadership in planning, tracking, and executing project milestones, adapting to evolving client needs.

Required Skills & Qualifications:
- 5+ years of experience as a Business Analyst, with a focus on data governance and master data management (MDM) platforms such as Talend, Informatica, or Reltio.
- Proven track record on automotive/manufacturing industry projects, ideally with exposure to systems like 3DS, Team Centre, SAP S/4HANA, MDG, or Blue Yonder.

Technical Knowledge:
- Strong understanding of MDM concepts, data flows, and governance frameworks.
- Familiarity with automotive-specific data domains (e.g., ECCMA/E-Class schema).
- Experience with process modelling tools (e.g., Visio, Lucidchart, or BPMN) and documentation standards.

Soft Skills:
- Exceptional communication and facilitation skills, with the ability to engage diverse stakeholders and drive consensus in workshops.
- Methodical and structured approach to problem-solving and project delivery.
- Ability to summarize complex information into clear, concise recommendations.

Education: Bachelor's degree in Business, Information Systems, or a related field (or equivalent experience).
Certifications: Relevant certifications (e.g., CBAP, PMP, or MDM-specific credentials) are a plus but not required.

Preferred Qualifications:
- Prior consulting experience in a client-facing role.
- Hands-on experience with MDG, Talend, Informatica, Reltio, or similar MDM platforms.
- Exposure to data quality analysis or profiling (not required to be at a Data Analyst level).

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Lucknow

Remote

Naukri logo

Employment Type: Contract (Remote).
Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Pune, Chennai

Work from Office

Naukri logo

Data Modeler

Primary Skills:
- Data modeling (conceptual, logical, physical); relational, dimensional, and data vault modeling (a toy star-schema sketch follows below)
- ERwin / IBM InfoSphere
- SQL (Oracle, SQL Server, PostgreSQL, DB2)
- Banking domain data knowledge (Retail, Corporate, Risk, Compliance)
- Data governance (BCBS 239, AML, GDPR)
- Data warehouse/lake design; Azure cloud

Secondary Skills:
- MDM and metadata management
- Real-time modeling (payments/fraud)
- Big data (Hadoop, Spark); streaming platforms (Confluent)
- M&A data integration, data cataloguing, documentation, and regulatory trend awareness

Soft Skills: attention to detail, documentation, time management, and teamwork.
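As a toy illustration of the dimensional modeling called for above, here is a banking-flavored star-schema fragment. All tables and columns are invented for the example; SQLite is used only so the sketch runs anywhere.

```python
import sqlite3

# In-memory database sketching one dimension and one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key   INTEGER PRIMARY KEY,
    customer_id    TEXT NOT NULL,   -- natural/business key
    segment        TEXT,            -- e.g. Retail, Corporate
    effective_from TEXT,            -- SCD Type 2 validity window
    effective_to   TEXT
);

CREATE TABLE fact_transaction (
    transaction_key INTEGER PRIMARY KEY,
    customer_key    INTEGER REFERENCES dim_customer(customer_key),
    txn_date        TEXT,
    amount          NUMERIC,
    currency        TEXT
);
""")

conn.execute("INSERT INTO dim_customer VALUES (1, 'C-1001', 'Retail', '2024-01-01', NULL)")
conn.execute("INSERT INTO fact_transaction VALUES (1, 1, '2024-03-15', 250.00, 'INR')")

# A typical analytical query joining the fact to its dimension.
for row in conn.execute("""
    SELECT d.segment, SUM(f.amount)
    FROM fact_transaction f JOIN dim_customer d USING (customer_key)
    GROUP BY d.segment
"""):
    print(row)  # ('Retail', 250)
```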

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Hybrid

Naukri logo

Job Title: MDM Informatica and MDG Informatica
Job Location: Mumbai
Experience: 4 to 12 years
Notice Period: Immediate to 30 days

Role Description:
He/she will serve as a key resource to build and execute the master data management (MDM) and reference data management (RDM) implementation and strategy for PwC's client. This individual will work with key stakeholders, business partners, and IT counterparts to define requirements, analyze source data, assist with high-level data designs and data modeling, and handle design, development, and troubleshooting of data issues. The person should be adept at solving diverse business problems using a variety of tools, strategies, and methodologies, and comfortable working with data across varying technology platforms, data types, and formats.

Key Responsibilities:
- Responsible for end-to-end development activities in any MDM tool.
- Experience developing multidomain MDM solutions.
- Expertise in working with master data in the Banking, Life Sciences, Retail, or Manufacturing industries.
- Proficient in leading MDM tool configurations: data modeling, user group and privilege setup, hierarchy management, workflow setup, etc.
- Reference data management in the MDM tool.
- Expertise in data quality (DQ) rules, match and merge strategies, golden-record survivorship rules, and match tuning (see the illustrative sketch below).
- Ability to configure the user interface using the Provisioning Tool.
- Good knowledge of integration endpoints (outbound and inbound integration endpoints using SOAP, REST, and other upstream/downstream systems).
- Extend support to data stewards to improve business processes, and assist in MDM support activities.
- Implement and support data loading to the MDM application.
- Customize MDM solutions, creating master data management solutions using agile methodologies.
- Strong understanding of MDM concepts such as integration patterns, trust, survivorship, publishing, and change data capture (CDC).
- Capable of working on new developments by understanding the data model, making changes to the repository as needed, onboarding source data, and performing DQ configuration and enrichment in MDM.
- Execute the MDM process and address gaps, inefficiencies, and needed improvements.
- Deep understanding of business scenarios and ability to convert them into MDM and RDM business logic.
- Experience in SDLC activities such as requirement analysis, design, process flows, fit/gap analysis, solution design, CRP, and testing.
- Proactive in identifying and resolving MDM- and RDM-related incidents raised by business teams.
- Manage data governance principles, including developing and updating rules and regulations, reviewing changes, and discussing them with the concerned business teams.

Nice to Have:
- Exposure to other MDM tools, such as IBM MDM, Reltio, or STIBO.
- Experience working across business, technology, and process workstreams for master data identification, definition, MDM requirements, architecture patterns, and setting up MDM platforms across the organization.
- Understanding of business domains, data definitions, and technical and operational metadata, with the ability to document and maintain a data dictionary and data quality definitions at both concept and data-element levels.
- Certification in Informatica MDM or other MDM tools is a plus.
- Good knowledge of industries such as BFSI, Life Sciences, or Manufacturing, and related products, is preferred.
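A purely illustrative, pure-Python sketch of the match/merge and survivorship concepts named above. Real implementations are configured inside the MDM tool (match rules, trust scores, survivorship per attribute); every name and record here is invented.

```python
from datetime import date

# Toy customer records from two source systems (invented data).
records = [
    {"source": "CRM", "name": "Asha Rao", "email": "asha@example.com",
     "phone": None, "updated": date(2024, 5, 1), "trust": 2},
    {"source": "ERP", "name": "Asha Rao", "email": "asha@example.com",
     "phone": "+91-9000000000", "updated": date(2024, 6, 1), "trust": 1},
]

def is_match(a: dict, b: dict) -> bool:
    """Naive match rule: exact email equality (real tools use fuzzy, weighted rules)."""
    return a["email"] == b["email"]

def merge(golden: dict, dup: dict) -> dict:
    """Survivorship: prefer the most recently updated record, break ties by
    source trust (lower number = more trusted), then fill gaps from the loser."""
    winner = max([golden, dup], key=lambda r: (r["updated"], -r["trust"]))
    loser = dup if winner is golden else golden
    merged = dict(winner)
    for field in ("name", "email", "phone"):
        if merged[field] is None:   # backfill missing attributes
            merged[field] = loser[field]
    merged["source"] = "GOLDEN"
    return merged

golden = records[0]
for rec in records[1:]:
    if is_match(golden, rec):
        golden = merge(golden, rec)

print(golden)  # consolidated golden record, phone survives from ERP
```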

Posted 2 weeks ago

Apply

3.0 - 5.0 years

12 - 13 Lacs

Hyderabad

Work from Office

Naukri logo

As an ICM developer within Global Incentive Compensations (GIC), you will be responsible for developing, maintaining, and testing the systems used for compensating the Salesforce sales organization, as well as tools used by internal teams for their functions. We are seeking a talented and motivated individual with hands-on experience in Incentive Compensation Management (ICM) to join our team. The ideal candidate will possess both strong technical skills and functional expertise to design, evaluate, and develop systems and processes that meet business user requirements and comply with IT and Finance audit guidelines. Additionally, expertise in Salesforce data management, including creating scenarios and seeding data, will be essential for the role. The role will involve managing and optimizing our ICM platform, including customization, configuration, and integration of processes, alongside handling data management tasks within Salesforce. The candidate should have a good understanding of Salesforce administration and be capable of designing, implementing, and maintaining ICM solutions that drive effective sales compensation strategies.

Key Responsibilities:

ICM (Incentive Compensation Management):
- Understand incentive compensation models, including commission structures, bonus programs, and sales performance metrics within Salesforce (a toy commission calculation follows below).
- Strong experience with compensation systems (Spiff, Everstage, CaptivateIQ, Varicent, or Xactly).
- Develop and test processes within Xactly Connect or other ETL tools to support movement of reference and transactional data.
- Provide regular analysis of and insights into compensation plans, recommending improvements based on system data.
- Manage the sales compensation workflow, ensuring timely and accurate payments.
- ICM implementation experience.

Salesforce Data Management:
- Conduct data imports/exports and cleanup activities as needed.
- Experience in Salesforce administration, SOQL, Apex, JavaScript (Lightning), and Visualforce is preferable.
- Ensure data integrity and enforce best practices for data governance in Salesforce.

Collaboration and Communication:
- Provide regular updates and reports to stakeholders about ICM performance.
- Stay current on best practices and emerging technologies in Salesforce administration and ICM management.

Skills and Experience:
- A minimum of 3 years of experience is required; 3 to 5 years preferred.
- Degree in Computer Science, Information Systems, or equivalent industry experience in application and software development.
- Understanding of database infrastructure and concepts.
- Experience developing customer-facing interfaces.
- Good communication skills.
- Problem-solving ability for high-level software and application issues.
- Familiarity with the Agile (Scrum) framework is preferred.
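To ground the compensation-model terminology, a small hypothetical example of tiered (marginal) commission math in plain Python. The tiers and rates are invented; in practice this logic lives inside the ICM platform (e.g., Xactly) as plan configuration.

```python
# Hypothetical tiered commission plan: each rate applies to its tranche
# of bookings, like income tax brackets.
TIERS = [
    (100_000, 0.05),       # first 100k of bookings at 5%
    (200_000, 0.08),       # next 100k (up to 200k) at 8%
    (float("inf"), 0.12),  # everything above 200k at 12%
]

def commission(bookings: float) -> float:
    """Compute marginal, tranche-by-tranche commission for a bookings total."""
    total, prev_cap = 0.0, 0.0
    for cap, rate in TIERS:
        tranche = max(0.0, min(bookings, cap) - prev_cap)
        total += tranche * rate
        prev_cap = cap
        if bookings <= cap:
            break
    return total

# 100k * 5% + 100k * 8% + 50k * 12% = 19,000.0
print(commission(250_000))
```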

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Delhi / NCR

Hybrid

Naukri logo

- Proficiency in data modeling tools such as ER/Studio, ERwin, or similar.
- Deep understanding of relational database design, normalization/denormalization, and data warehousing principles.
- Experience with SQL and working knowledge of database platforms like Oracle, SQL Server, PostgreSQL, or Snowflake.
- Strong knowledge of metadata management, data lineage, and data governance practices.
- Understanding of data integration, ETL processes, and data quality frameworks.
- Ability to interpret and translate complex business requirements into scalable data models.
- Excellent communication and documentation skills for collaborating with cross-functional teams.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Project Description:
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities:
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models, identifying gaps or optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Must-Have Skills:
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficiency in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Nice-to-Have Skills:
- Exposure to Avaloq Scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.

Other Languages: English (C1, Advanced)
Seniority: Senior

Posted 2 weeks ago

Apply

5.0 - 7.0 years

20 - 25 Lacs

Bangalore Rural

Work from Office

Naukri logo

Data Governance Specialist
Req number: R5477
Employment type: Full time
Worksite flexibility: Remote

Who We Are:
CAI is a global technology services firm with over 8,500 associates worldwide and yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right—whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.

Job Summary:
We are looking for a motivated Data Governance Specialist ready to take us to the next level! If you are strong in creating and maintaining integrations between data systems and data governance tools and are looking for your next career move, apply now.

Job Description:
We are looking for a Data Governance Specialist to develop and maintain Collibra workflows to support data governance initiatives. This position will be full time and remote.

What You'll Do:
- Develop and maintain Collibra workflows to support data governance initiatives.
- Create and maintain integrations between data systems and data governance tools.
- Write and maintain data quality rules to measure data quality.
- Work with vendors to troubleshoot and resolve technical issues related to workflows and integrations.
- Work with other teams to ensure adherence to DG policies and standards.
- Assist in implementing data governance initiatives around data quality, master data, and metadata management.

What You'll Need:
Required:
- 5-7 years of experience.
- Strong programming skills.
- Knowledge of system integration and use of middleware solutions.
- Proficiency in SQL and relational databases.
- Understanding of data governance, including data quality, master data, and metadata management.
- Willingness to learn new tools and skills.
Preferred:
- Proficient with Java or Groovy.
- Proficient with MuleSoft or other middleware.
- Proficient with Collibra DIP, Collibra Data Quality, and DQLabs.
- Experience with AWS Redshift and Databricks.

Physical Demands:
- Ability to safely and successfully perform the essential job functions.
- Sedentary work that involves sitting or remaining stationary most of the time, with occasional need to move around the office to attend meetings, etc.
- Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard, and monitor.

Reasonable Accommodation Statement:
If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824 – 8111.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

Chennai

Work from Office

Naukri logo

Your Role:
Works in the area of software engineering, which encompasses the development, maintenance, and optimization of software solutions/applications.
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance.
3. Exercises original thought and judgement, with the ability to supervise the technical and administrative work of other software engineers.
4. Builds skills and expertise in his/her software engineering discipline to meet the standard skill expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Your Profile:
- 4+ years of experience in data architecture, data warehousing, and cloud data solutions.
- Minimum 3+ years of hands-on experience with end-to-end Snowflake implementation.
- Experience developing data architecture and roadmap strategies, with the knowledge to establish data governance and quality frameworks within Snowflake.
- Expertise or strong knowledge in Snowflake best practices, performance tuning, and query optimisation.
- Experience with cloud platforms like AWS or Azure and familiarity with Snowflake's integration with these environments; strong knowledge of at least one cloud (AWS or Azure) is mandatory.
- Solid understanding of SQL, Python, and scripting for data processing and analytics.
- Experience in leading teams and managing complex data migration projects.
- Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders.
- Knowledge of new Snowflake features, AI capabilities, and industry trends to drive innovation and continuous improvement.

Skills (competencies): Verbal Communication

Posted 2 weeks ago

Apply

2.0 - 7.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near-real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Minimum of 2+ years of related experience.
- Experience in modeling and business system design.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Strong expertise in writing T-SQL code.
- Well versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting with all levels of the organization.
- Ability to clearly communicate complex business problems and technical solutions.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

9 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Naukri logo

Job Title: Informatica MDM Developer
Location: Bangalore, Chennai, Hyderabad & Pune
Work Mode: Hybrid (work from client office 2-3 days/week)
Experience: 5 to 10 years

Role Overview:
We are seeking an experienced Informatica MDM Developer to design and implement Master Data Management solutions. This role is key to ensuring the accuracy, consistency, and integrity of critical business data across various systems. You will collaborate with cross-functional teams and contribute to data governance and process optimization.

Key Responsibilities:
- Design and develop MDM models, hierarchies, and workflows using Informatica MDM Hub and IDD
- Integrate data from multiple source systems into the MDM platform
- Define and configure match & merge rules to consolidate duplicate records
- Implement data quality rules, validations, and exception handling
- Automate data stewardship and approval workflows using ActiveVOS
- Monitor and tune MDM performance for responsiveness and scalability
- Collaborate with business stakeholders, data stewards, and IT teams
- Maintain technical documentation, including system design and configuration details

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Minimum 5 years of hands-on experience with Informatica MDM, including MDM Hub, IDD (Informatica Data Director), and ActiveVOS
- Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server)
- Experience with ETL processes and data integration tools
- Strong understanding of data governance principles and data quality practices
- Excellent analytical and communication skills

Preferred Qualifications:
- Experience with Informatica Cloud MDM or Informatica Data Quality (IDQ)
- Familiarity with data privacy standards (e.g., GDPR, CCPA)
- Experience in Agile/Scrum development environments

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Informatica MDM
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must have: proficiency in Informatica MDM.
- Good to have: experience with data warehousing concepts and practices.
- Strong understanding of data modeling techniques and best practices.
- Familiarity with SQL and database management systems.
- Experience in implementing data governance and data quality frameworks.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Informatica Data Quality
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must have: proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data profiling and data cleansing methodologies.
- Familiarity with database management systems and SQL.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Naukri logo

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: Business Agility
Minimum Experience: 12 year(s)
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform (a minimal PySpark sketch follows below).
- Good to have: experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
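For flavor, a minimal PySpark sketch of the kind of pipeline step such a role involves. The paths, columns, and table name are placeholders; on Databricks the SparkSession is provided for you and Delta is the default table format.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw data (placeholder path; on Azure this might be an ADLS mount).
orders = spark.read.json("/mnt/raw/orders/")

# Basic quality gate and transformation before loading a curated table.
curated = (
    orders
    .filter(F.col("order_id").isNotNull())        # drop incomplete rows
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Persist as a managed Delta table for downstream consumers.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.daily_revenue")
```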

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Chennai

Work from Office

Naukri logo

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: Business Agility
Minimum Experience: 7.5 year(s)
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP Master Data Governance (MDG) Tool
Good-to-Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years of full-time education

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure successful project delivery.

Professional & Technical Skills:
- Must have: proficiency in the SAP Master Data Governance (MDG) tool.
- Strong understanding of data governance principles.
- Experience configuring and customizing the SAP MDG tool.
- Knowledge of SAP data models and structures.
- Hands-on experience in data migration and data quality management.
- Hands-on ABAP experience is mandatory.

Additional Information:
- The candidate should have a minimum of 5 years of experience with the SAP MDG tool.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Microsoft Azure Data Services
Good-to-Have Skills: NA
Minimum Experience: 2 year(s)
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions that meet business needs.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must have: proficiency in Microsoft Azure Data Services (a minimal ingestion sketch follows below).
- Good to have: experience with Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Strong understanding of data modeling and database design principles.
- Experience with data integration and ETL tools.
- Familiarity with data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Microsoft Azure Data Services.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
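An assumed sketch using the azure-storage-blob SDK to land a file in Blob Storage for downstream processing (e.g., pickup by Azure Data Factory). The connection string, container, and file names are placeholders; in production you would use Key Vault or a managed identity rather than a literal connection string.

```python
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

# Placeholder connection string; never hard-code real credentials.
service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("raw-landing")  # assumed container name

# Upload a local extract so downstream pipelines can pick it up.
with open("daily_extract.csv", "rb") as fh:
    container.upload_blob(
        name="sales/2024-06-01/daily_extract.csv",
        data=fh,
        overwrite=True,
    )

# Confirm what landed under the sales/ prefix.
print("landed:", [b.name for b in container.list_blobs(name_starts_with="sales/")])
```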

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP Master Data Governance (MDG) Tool
Good-to-Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years of full-time education

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process effectively.
- Ensure seamless communication within the team and with stakeholders.

Professional & Technical Skills (MDG Technical Role):
- Must have: proficiency in the SAP Master Data Governance (MDG) tool.
- Strong understanding of data governance principles.
- Experience configuring and customizing the SAP MDG tool.
- Knowledge of data modeling and data quality management.
- Hands-on experience leading application development projects.

Additional Information:
- The candidate should have a minimum of 5 years of experience with the SAP MDG tool.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
