3.0 - 6.0 years
12 - 20 Lacs
Chennai
Remote
Develop, test, and implement Alteryx workflows and macros. Collaborate with business stakeholders to gather and analyze data requirements. Optimize existing Alteryx workflows for performance and efficiency. Troubleshoot and resolve issues.
Posted 1 week ago
10.0 - 15.0 years
25 - 27 Lacs
Mumbai
Work from Office
Preferred Qualification: BE/B.Tech/MCA/MBA. Required Qualification: BE/B.Tech/MCA/MBA.
Skills: The incumbent should possess hands-on knowledge of database design and should be able to devise data migration strategies from scratch.
- Ability to design the database E-R model vis-a-vis legacy system database models by understanding the legacy packages and procedures.
- The incumbent would function as Technical Lead for Database Migration in the transformation project and would be responsible for initiating, leading, planning and ensuring the successful migration of legacy data to the transformed systems.
- Ability to analyse, assess and evaluate trending technologies in DB-related tools.
- Ability to understand and troubleshoot data-related issues and suggest mitigation solutions.
- Significant prior experience in designing, building and supporting enterprise-class database systems in the capital markets fixed income / foreign exchange / derivatives domain.
- Implement and govern the database migration strategies.
- Experience with agile development methodologies and supporting tools.
- Thorough understanding of SDLC processes including testing methodologies; exposure to automated testing would be an added advantage.
- Good communication and presentation skills.
- Understand the legacy data model and data, and provide strategic database direction to the teams. Be responsible for maintaining staging databases.
Core competencies: Database design/development in the capital markets / fixed income / foreign exchange / derivatives / treasury domain | information technology / fintech; DB security and governance; DB performance tuning; IT team management and delivery; Oracle Certification (DBA track) on Oracle 10g/11g/19c; hands-on experience with PL/SQL complex queries, procedures, packages and DB administration; appreciation of enterprise functional database architecture in capital markets; cloud-native design; agile methodologies; AWS certification.
Job Purpose: The successful candidate would join the IT Department in the Transformation Programme for Clearing and Settlement systems. The transformation programme is focussed on transforming the existing applications for the clearing and settlement systems as per the new technical architecture and design. The successful candidate would be responsible for leading a database team or teams through all stages of the development life cycle while also working closely with all other project stakeholders.
Area of Operations: PL/SQL and Database Strategic Development Management; Recruitment and Team Building.
Key Responsibility: ER modelling, project estimation and delivery tracking, DB design, DB recon, DB migration, DB support (UAT and production); developing assessment criteria, conducting interviews, guiding the development team, problem solving, setting clear goals and delivering through teams.
Any Other Requirement: Leadership skills with the ability to build consensus across multiple stakeholders; would be required to work with multiple projects / teams concurrently; team building and team working.
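As a minimal illustration of the DB recon responsibility above (not part of the posting itself), the sketch below compares row counts between a legacy schema and its migrated target using any two Python DB-API connections; the driver, table list and connection setup are assumptions.

```python
# Illustrative legacy-vs-target row-count reconciliation.
# Assumes two DB-API 2.0 connections (e.g. python-oracledb); table names are placeholders.

def row_counts(conn, tables):
    """Return {table: row_count} for the given connection."""
    counts = {}
    cur = conn.cursor()
    try:
        for table in tables:
            # Table names come from a trusted migration config, not user input.
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            counts[table] = cur.fetchone()[0]
    finally:
        cur.close()
    return counts

def reconcile(legacy_conn, target_conn, tables):
    """Report tables whose counts differ, for follow-up investigation."""
    legacy = row_counts(legacy_conn, tables)
    target = row_counts(target_conn, tables)
    mismatches = {t: (legacy[t], target[t]) for t in tables if legacy[t] != target[t]}
    for table, (src, dst) in mismatches.items():
        print(f"MISMATCH {table}: legacy={src} target={dst}")
    return mismatches
```

In practice such checks would be extended with column-level checksums and run as part of UAT and production migration support.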
Posted 1 week ago
10.0 - 15.0 years
12 - 17 Lacs
Mumbai
Work from Office
Position Purpose: The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models and supporting the management of metadata. The Data Architect's mission includes a focus on GDPR, contributing to privacy impact assessments and the Record of Processing Activities relating to personal data. The scope is CIB EMEA and CIB ASIA.
Responsibilities
Direct Responsibilities: Engage with key business stakeholders to assist with establishing fundamental data governance processes. Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards. Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation. Structure the information in the Information System (using any data modelling tool, such as Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects. Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models and others, in compliance with the data framework policy. Allow people to step logically through the Information System (and be able to train them to use tools like Abacus). Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects and IT validations. Update all records in Abacus collected from stakeholder interviews and meetings.
Skill Area Expected
Communicating between the technical and the non-technical: Is able to communicate effectively across organisational, technical and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non-technical audiences. Is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
Data Modelling (business flows of data in Abacus): Produces data models and understands where to use different types of data models. Understands different tools and is able to compare between different data models. Able to reverse engineer a data model from a live system. Understands industry-recognized data modelling patterns and standards. Understands the concepts and principles of data modelling and is able to produce, maintain and update relevant data models for specific business needs.
Data Standards (rules defined to manage and maintain data): Develops and sets data standards for an organisation. Communicates the business benefit of data standards, championing and governing those standards across the organisation. Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach.
Metadata Management: Understands a variety of metadata management tools.
Designs and maintains the appropriate metadata repositories to enable the organization to understand its data assets. Works with metadata repositories to complete and maintain them so that information remains accurate and up to date. The objective is to manage one's own learning and contribute to domain knowledge building.
Turning business problems into data design: Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture by dealing with specific business problems and aligning it to enterprise-wide standards and principles. Works within the context of a well-understood architecture and identifies appropriate patterns.
Contributing Responsibilities: It is expected that the Data Architect applies knowledge and experience of the capability, including tools and techniques, and adopts those that are most appropriate for the environment. The Data Architect needs to have knowledge of: the Functional & Application Architecture, Enterprise Architecture and architecture rules and principles; the activities of Global Markets and/or Global Banking; market meta-models, taxonomies and ontologies (such as FpML, CDM, ISO 20022).
Skill Area Expected
Data Communication: Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant for business goals. Presents, communicates and disseminates data appropriately and with high impact. Able to create basic visuals and presentations.
Data Governance: Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contributes to it.
Data Innovation: Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes. Aware of opportunities for innovation with new tools and uses of data.
Technical & Behavioral Competencies: 1. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics. 2. Able to create basic visuals and presentations. 3. Experience in working with enterprise tools for data cataloging and data management (such as Abacus, Collibra, Alation, etc.). 4. Experience in working with BI tools (such as Power BI). 5. Good understanding of Excel (formulas and functions).
Specific Qualifications (if required): Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA.
Skills Referential - Behavioural Skills: Communication skills (oral and written); Ability to collaborate / teamwork; Ability to deliver / results driven; Creativity & innovation / problem solving.
Transversal Skills: Analytical ability; Ability to understand, explain and support change; Ability to develop and adapt a process; Ability to anticipate business / strategic evolution.
Education Level: Bachelor's degree or equivalent. Experience Level: At least 10 years.
Other/Specific Qualifications (if required): 1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred. 2. DAMA certified (good to have).
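For illustration of the Record of Processing Activities work mentioned above, here is a minimal, hedged sketch of a simplified RoPA entry with a basic completeness check; the field names are loosely based on GDPR Article 30 and are an assumption, not an Abacus or Collibra schema.

```python
# Simplified Record of Processing Activities (RoPA) entry - illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RopaEntry:
    processing_activity: str
    purpose: str
    data_subject_categories: List[str]
    personal_data_categories: List[str]
    recipients: List[str] = field(default_factory=list)
    retention_period: str = "unspecified"
    security_measures: str = "unspecified"

    def gaps(self) -> List[str]:
        """Return documentation gaps a data architect would chase up with the business."""
        issues = []
        if not self.purpose:
            issues.append("missing purpose of processing")
        if self.retention_period == "unspecified":
            issues.append("retention period not documented")
        if self.security_measures == "unspecified":
            issues.append("technical/organisational measures not documented")
        return issues

entry = RopaEntry(
    processing_activity="Client onboarding",
    purpose="KYC checks",
    data_subject_categories=["clients"],
    personal_data_categories=["name", "passport number"],
    recipients=["compliance team"],
)
print(entry.gaps())  # -> retention and security measures still to be captured
```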
Posted 1 week ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
Position Purpose The Data Architect is to support the work for ensuring that systems are designed, upgraded, managed, de-commissioned and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy and undertaking the design of data models and supporting the management of metadata. The Data Architect mission will integrate a focus on GDPR law, with the contribution to the privacy impact assessment and Record of Process & Activities relating to personal Data. The scope is CIB EMEA and CIB ASIA Responsibilities Direct Responsibilities Engage with key business stakeholders to assist with establishing fundamental data governance processes Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation and business validation Structures the information in the Information System (any data modelling tool like Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects. Creates and manages data models (Business Flows of Personal Data with process involved) in all their forms, including conceptual models, functional database designs, message models and others in compliance with the data framework policy Allows people to step logically through the Information System (be able to train them to use tools like Abacus) Contribute and enrich the Data Architecture framework through the material collected during analysis, projects and IT validations Update all records in Abacus collected from stakeholder interviews/ meetings. Skill Area Expected Communicating between the technical and the non-technical Is able to communicate effectively across organisational, technical and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non- technical audiences. Is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge. Able to effectively translate and accurately communicate across technical and non- technical stakeholders as well as facilitating discussions within a multidisciplinary team, with potentially difficult dynamics. Data Modelling (Business Flows of Data in Abacus) Produces data models and understands where to use different types of data models. Understands different tools and is able to compare between different data models. Able to reverse engineer a data model from a live system. Understands industry recognized data modelling patterns and standards. Understands the concepts and principles of data modelling and is able to produce, maintain and update relevant data models for specific business needs. Data Standards (Rules defined to manage/ maintain Data) Develops and sets data standards for an organisation. Communicates the business benefit of data standards, championing and governing those standards across the organisation. Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach. Metadata Management Understands a variety of metadata management tools. 
Designs and maintains the appropriate metadata repositories to enable the organization to understand their data assets. Works with metadata repositories to complete and Maintains it to ensure information remains accurate and up to date. The objective is to manage own learning and contribute to domain knowledge building Turning business problems into data design Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture by dealing with specific business problems and aligning it to enterprise-wide standards and principles. Works within the context of well understood architecture and identifies appropriate patterns. Contributing Responsibilities It is expected that the data architect applies knowledge and experience of the capability, including tools and technique and adopts those that are more appropriate for the environment. The Data Architect needs to have the knowledge of: The Functional & Application Architecture, Enterprise Architecture and Architecture rules and principles The activities Global Market and/or Global Banking Market meta-models, taxonomies and ontologies (such as FpML, CDM, ISO2022) Skill Area Expected Data Communication Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant for business goals. Presents, communicates and disseminates data appropriately and with high impact. Able to create basic visuals and presentations. Data Governance Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contribute to these data governance. Data Innovation Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes Aware of opportunities for innovation with new tools and uses of data Technical & Behavioral Competencies 1. Able to effectively translate and accurately communicate across technical and non- technical stakeholders as well as facilitating discussions within a multidisciplinary team, with potentially difficult dynamics. 2. Able to create basic visuals and presentations. 3. Experience in working with Enterprise Tools (like Abacus, informatica, big data, collibra, etc) 4. Experience in working with BI Tools (Like Power BI) 5. Good understanding of Excel (formulas and Functions) Specific Qualifications (if required) Preferred: BE/ BTech, BSc-IT, BSc-Comp, MSc-IT, MSc Comp, MCA Skills Referential Behavioural Skills : (Please select up to 4 skills) Communication skills - oral & written Ability to collaborate / Teamwork Ability to deliver / Results driven Creativity & Innovation / Problem solving Transversal Skills: (Please select up to 5 skills) Analytical Ability Ability to understand, explain and support change Ability to develop and adapt a process Ability to anticipate business / strategic evolution Choose an item. Education Level: Bachelor Degree or equivalent Experience Level At least 7 years Other/Specific Qualifications (if required) 1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred 2. DAMA Certified.
Posted 1 week ago
2.0 - 6.0 years
5 - 8 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Description KPI Partners is seeking an enthusiastic and skilled Data Engineer specializing in STIBO (STEP) development to join our dynamic team. As a pivotal member of our data engineering team, you will be responsible for designing, developing, and implementing data solutions that meet the needs of our clients. This role requires a strong understanding of data management principles along with technical expertise in the STIBO STEP platform. Key Responsibilities - Design and develop data models and solutions using STIBO STEP for effective Master Data Management (MDM). - Collaborate with data architects, data analysts, and business stakeholders to gather requirements and translate them into technical specifications. - Implement and maintain ETL processes for data extraction, transformation, and loading to ensure data integrity and reliability. - Optimize data pipelines and workflows for performance and efficiency. - Monitor data quality and implement best practices for data governance. - Troubleshoot and resolve technical issues related to STIBO STEP development and data processes. - Provide technical support and guidance to team members and stakeholders regarding best practices in data management. Qualifications. - Bachelor’s degree in Computer Science, Information Technology, or a related field. - Proven experience as a Data Engineer or in a similar role, with a focus on STIBO (STEP) development. - Strong understanding of Master Data Management concepts and methodologies. - Proficiency in data modeling and experience with ETL tools and data integration processes. - Familiarity with database technologies such as SQL Server, Oracle, or PostgreSQL. - Excellent problem-solving skills and the ability to work independently as well as part of a team. - Strong communication skills to effectively collaborate with technical and non-technical stakeholders. - Experience with data visualization tools is a plus. What We Offer. - Competitive salary and performance-based incentives. - Opportunity to work on innovative projects in a collaborative environment. - Professional development and training opportunities to enhance your skills. - A flexible work environment that promotes work-life balance. - A vibrant company culture that values creativity and teamwork. If you are passionate about data engineering and want to play a crucial role in shaping our clients' data strategies, we would love to hear from you! Apply now to join KPI Partners in delivering impactful data solutions. KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
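As a rough illustration of the Master Data Management concepts this role calls for, the sketch below groups likely duplicate supplier records by a normalised match key. It is generic MDM logic, not the STIBO STEP API; the normalisation rules and fields are assumptions.

```python
# Illustrative duplicate-grouping step for master data consolidation.
import re
from collections import defaultdict

def match_key(record: dict) -> str:
    """Build a crude match key from a normalised name and postcode."""
    name = re.sub(r"[^a-z0-9]", "", record.get("name", "").lower())
    postcode = re.sub(r"\s+", "", record.get("postcode", "").upper())
    return f"{name}|{postcode}"

def group_duplicates(records: list[dict]) -> dict[str, list[dict]]:
    """Return only the match-key groups that contain more than one record."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    return {k: v for k, v in groups.items() if len(v) > 1}

suppliers = [
    {"id": 1, "name": "Acme Ltd", "postcode": "560 001"},
    {"id": 2, "name": "ACME LTD.", "postcode": "560001"},
    {"id": 3, "name": "Globex", "postcode": "400001"},
]
print(group_duplicates(suppliers))  # ids 1 and 2 fall into the same golden-record candidate group
```

In a real STEP implementation, matching and survivorship would be configured in the platform itself; a script like this is only useful for quick profiling or validation outside the tool.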
Posted 1 week ago
6.0 - 8.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary: The Sr. Developer role involves designing, developing and maintaining data integration solutions using Ab Initio (including administration) and other ETL tools. The candidate will work on data warehousing projects, ensuring efficient data processing and integration. This position requires a strong understanding of data warehousing concepts and proficiency in SQL and Unix shell scripting. The role is hybrid with no travel required.
Responsibilities: Design and develop data integration solutions using Ab Initio tools to ensure seamless data processing and transformation. Collaborate with cross-functional teams to gather and analyze requirements for data warehousing projects. Implement ETL processes to extract, transform and load data from various sources into data warehouses. Optimize SQL queries to enhance performance and ensure efficient data retrieval and manipulation. Utilize Unix shell scripting for automation and scheduling of data processing tasks. Monitor and troubleshoot data integration workflows to ensure data accuracy and integrity. Provide technical support and guidance to team members on data warehousing and ETL best practices. Conduct regular reviews of data integration processes to identify areas for improvement and implement necessary changes. Ensure compliance with data governance and security standards in all data integration activities. Collaborate with stakeholders to understand business requirements and translate them into technical specifications. Develop and maintain documentation for data integration processes and workflows. Stay updated with the latest trends and technologies in data warehousing and ETL to drive innovation. Contribute to the company's mission by enabling data-driven decision-making through robust data integration solutions.
Qualifications: Demonstrate expertise in data warehousing concepts and scheduling basics to design effective data solutions. Possess strong skills in ETL and SQL to manage data extraction and transformation processes efficiently. Show proficiency in Ab Initio GDE, Conduct>It and Co>Operating System for advanced data integration tasks. Have experience in Unix shell scripting to automate and streamline data workflows. Nice to have: domain experience in Telecom to understand industry-specific data requirements. Exhibit the ability to work in a hybrid model, balancing remote and on-site tasks effectively. Display strong analytical and problem-solving skills to address data integration challenges.
Certifications Required: Ab Initio Certification, SQL Certification.
Posted 1 week ago
15.0 - 20.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Job Role: Global Data Governance and Quality Lead. Job Location: Hyderabad. Work Mode: Work from office. Shift Timings: 2 PM to 11 PM.
Job Overview: This role will lead and deliver the implementation and institutionalization of the Data Governance and Master Data Management (MDM) platform, along with the complementary processes, across the whole global firm. This initiative is one of the foundational fixes around underlying data management that the Global Data Strategy & Architecture team is addressing to drive standardization and simplification of data creation, management and usage at Clifford Chance. The role will oversee an external delivery team to achieve target outcomes on time and on budget.
Who you will work with: You will work closely with data management teams, business stakeholders, IT professionals and delivery partners to deliver on the programme objectives.
What you will do and be responsible for
Project Scoping and Planning: Develop a multi-year project and budget plan for the execution of the Data Governance and Master Data Management (MDM) priorities, aligning the approach with the broader Global Data Strategy & Architecture team. Establish positive relationships with a large network of cross-functional and leadership stakeholders to drive engagement, buy-in and collaborative working arrangements to support delivery of target outcomes. Work with the Global Head of Data Strategy & Architecture to select the technology platforms and external delivery partner to support the project.
Project Delivery: Provide day-to-day oversight of the selected external delivery partner, leading on target outcomes:
- A single source of truth for the firm's master data
- Standardized and managed taxonomies across the firm
- Clearly defined linkages and relationships between master and taxonomy data
- Clearly defined categories of the firm's data that are owned and managed by nominated individuals
- A governance approach to changes to master data
- An approach to maximise automated data capture to minimize manual entry and defects, and its implementation
- Implementation of the selected MDM platform
- Embedded behavioural change in the firm around the use of data
Work closely with the ERP/CGP Data Governance Lead to ensure that the guidance and support given to these programmes to establish and maintain data standards, policies, and processes is in line with the firm-wide approach.
Project Management and Stakeholder Management: Own the plan for the delivery of project outcomes, managing project and technical interdependencies and a large network of cross-functional and leadership stakeholders to deliver and embed project outputs into BAU, e.g. business units, Technology, Legal and Compliance. Continuously ensure that the operational and technical outcomes of the projects align with the expected strategic and business outcomes of the overall Global Data Strategy & Architecture programme.
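To make the "standardised and managed taxonomies" and automated-capture outcomes above concrete, here is a hedged sketch of validating incoming records against a managed taxonomy; the taxonomy values and field names are invented for illustration and are not the firm's actual reference data.

```python
# Illustrative validation of records against a managed taxonomy (standardised picklists),
# the kind of automated check that reduces manual-entry defects in master data.
MANAGED_TAXONOMY = {
    "practice_area": {"Banking & Finance", "Capital Markets", "Litigation"},
    "region": {"EMEA", "APAC", "Americas"},
}

def validate_record(record: dict) -> list[str]:
    """Return defects where a field value falls outside the managed taxonomy."""
    defects = []
    for field_name, allowed in MANAGED_TAXONOMY.items():
        value = record.get(field_name)
        if value is not None and value not in allowed:
            defects.append(f"{field_name}='{value}' not in managed taxonomy")
    return defects

print(validate_record({"practice_area": "banking", "region": "EMEA"}))
# -> ["practice_area='banking' not in managed taxonomy"]
```

A governance process would then route such defects to the nominated data owner rather than silently correcting them.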
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack. Data Architecture: Develop and maintain the overall data architecture, including data models, data flows and data quality standards. Design and implement data warehouses, data marts and data lakes on the Microsoft Azure platform. Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI. Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory. Data Governance: Establish and enforce data governance policies and standards. Primary Skills Experience: 15+ years of relevant experience in data warehousing, BI, and data governance. Proven track record of delivering successful data solutions on the Microsoft stack. Experience working with diverse teams and stakeholders. Required Skills and Experience: Technical Skills: Strong proficiency in data warehousing concepts and methodologies. Expertise in Microsoft Power BI. Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Knowledge of SQL and scripting languages (Python, PowerShell). Strong understanding of data modeling and ETL/ELT processes. Secondary Skills: Soft Skills: Excellent communication and interpersonal skills. Strong analytical and problem-solving abilities. Ability to work independently and as part of a team. Strong attention to detail and organizational skills.
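As one hedged sketch of the kind of ELT step described above on the Microsoft stack, the PySpark snippet below reads raw data from Azure Data Lake Storage, cleans it, and writes a curated table for Power BI; the storage account, container, paths, table and column names are placeholders, and a Databricks or Synapse Spark environment with Delta support is assumed.

```python
# Minimal PySpark ELT sketch (assumes a Databricks/Synapse Spark session with Delta enabled).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-elt").getOrCreate()

# Raw zone in ADLS Gen2 (placeholder account/container/path).
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/sales/2024/")

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("net_amount", F.col("gross_amount") - F.col("tax_amount"))
)

# Curated zone exposed to Power BI as a lakehouse table.
curated.write.mode("overwrite").format("delta").saveAsTable("curated.sales_orders")
```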
Posted 1 week ago
4.0 - 9.0 years
25 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Exp- 4-7yrs Skillset- IDMC (preferably) or any other DG tool, SQL, ETL Location- Gurgaon (Preferred), Bangalore, Pune, NP- URGENT Work with Customers onshore team collaboratively to support following initiatives: Interface with business stakeholders, understand their data and analytics needs, establish requirement with technical stakeholders and align on delivery plan. Understand various data sources around asset classes, portfolio, historical performances, market trends etc. and Develop/enhance data documentation. Help deliver data-driven analysis and recommendations that effectively influence business decisions. Extract data, perform data cleansing / data quality checking tasks, prepare data quality reports, and model ready data. Candidate Profile: Over 4 years of experience in data analytics, governance, and business analysis Strong understanding of data analytics and ability to derive actionable insights Skilled in developing strategic project roadmaps and aligning data initiatives with business goals Proactive in proposing suggestions and providing regular project updates to stakeholders Hands-on experience with data governance frameworks; Collibra knowledge helpful but not mandatory. Strong comprehension of metadata strategies and real-world use cases Excellent communication skills and ability to work across business and technical teams Familiar with technology stack: SQL, Snowflake, Power BI Experience with IceDQ (a plus) Understanding of investment fundamentals is a valuable asset Detail-oriented, self-motivated, and adept at cross-functional collaboration
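For the data cleansing and data quality checking tasks above, a minimal illustrative sketch of a profiling report in pandas is shown below; the column names and key fields are assumptions rather than the client's actual data model.

```python
# Illustrative data-quality profiling: null rates, distinct counts, duplicate-key check.
import pandas as pd

def dq_report(df: pd.DataFrame, key_cols: list[str]) -> pd.DataFrame:
    """Per-column null percentage and distinct counts, plus a duplicate-key count."""
    report = pd.DataFrame({
        "null_pct": (df.isna().mean() * 100).round(2),
        "distinct_values": df.nunique(),
    })
    dup_rows = int(df.duplicated(subset=key_cols).sum())
    print(f"{dup_rows} duplicate row(s) on key {key_cols}")
    return report

holdings = pd.DataFrame({
    "portfolio_id": ["P1", "P1", "P2"],
    "asset_class": ["Equity", "Equity", None],
    "market_value": [100.0, 100.0, 250.5],
})
print(dq_report(holdings, key_cols=["portfolio_id", "asset_class"]))
```

In an IDMC or Collibra-led programme, the same checks would typically be defined as data quality rules in the tool; a script like this is useful for quick ad hoc validation and report preparation.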
Posted 1 week ago
5.0 - 10.0 years
14 - 24 Lacs
Mumbai, Hyderabad, Bengaluru
Hybrid
Integration Specialist - Collibra
Must have: Understanding of data governance concepts; experience with source-to-target data mapping; Collibra Data Catalog; API-based integration programming knowledge (Python, Java).
Good to have: Knowledge of Groovy scripts.
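As a hedged sketch of what API-based Collibra Data Catalog integration can look like from Python, the snippet below queries assets by name over REST; the base URL and credentials are placeholders, and the endpoint path and parameters follow Collibra's REST API conventions but should be verified against your instance's API documentation.

```python
# Illustrative Collibra Data Catalog lookup via its REST API (endpoint details are assumptions).
import requests

BASE_URL = "https://your-instance.collibra.com/rest/2.0"
session = requests.Session()
session.auth = ("svc_integration", "***")  # placeholder service-account credentials

def find_assets(name: str) -> list[dict]:
    """Look up catalog assets by name, e.g. to attach source-to-target mappings."""
    resp = session.get(f"{BASE_URL}/assets", params={"name": name, "nameMatchMode": "EXACT"})
    resp.raise_for_status()
    return resp.json().get("results", [])

for asset in find_assets("CUSTOMER_MASTER"):
    print(asset["id"], asset["name"])
```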
Posted 1 week ago
12.0 - 14.0 years
20 - 27 Lacs
Bengaluru
Work from Office
Job Description & Summary: We are seeking an experienced Senior Data Architect to lead the design and development of our data architecture, leveraging cloud-based technologies, big data processing frameworks, and DevOps practices. The ideal candidate will have a strong background in data warehousing, data pipelines, performance optimization, and collaboration with DevOps teams. Responsibilities: 1. Design and implement end-to-end data pipelines using cloud-based services (AWS/ GCP/Azure) and conventional data processing frameworks. 2. Lead the development of data architecture, ensuring scalability, security, and performance. 3. Collaborate with cross-functional teams, including DevOps, to design and implement data lakes, data warehouses, and data ingestion/extraction processes. 4. Develop and optimize data processing workflows using PySpark, Kafka, and other big data processing frameworks. 5. Ensure data quality, integrity, and security across all data pipelines and architectures. 6. Provide technical leadership and guidance to junior team members. 7. Design and implement data load strategies, data partitioning, and data storage solutions. 8. Collaborate with stakeholders to understand business requirements and develop data solutions to meet those needs. 9. Work closely with DevOps team to ensure seamless integration of data pipelines with overall system architecture. 10. Participate in design and implementation of CI/CD pipelines for data workflows. DevOps Requirements: 1. Knowledge of DevOps practices and tools, such as Jenkins, GitLab CI/CD, or Apache Airflow. 2. Experience with containerization using Docker. 3. Understanding of infrastructure as code (IaC) concepts using tools like Terraform or AWS CloudFormation. 4. Familiarity with monitoring and logging tools, such as Prometheus, Grafana, or ELK Stack. Requirements: 1. 12-14 years of experience for Senior Data Architect in data architecture, data warehousing, and big data processing. 2. Strong expertise in cloud-based technologies (AWS/ GCP/ Azure) and data processing frameworks (PySpark, Kafka, Flink , Beam etc.). 3. Experience with data ingestion, data extraction, data warehousing, and data lakes. 4. Strong understanding of performance optimization, data partitioning, and data storage solutions. 5. Excellent leadership and communication skills. 6. Experience with NoSQL databases is a plus. Mandatory skill sets: 1. Experience with agile development methodologies. 2. Certification in cloud-based technologies (AWS / GCP/ Azure) or data processing frameworks. 3. Experience with data governance, data quality, and data security. Preferred skill sets: Knowledge of AgenticAI and GenAI is added advantage
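As a minimal sketch of the Kafka-based ingestion described above, the PySpark Structured Streaming snippet below reads a topic and lands it in a data lake; broker, topic and storage paths are placeholders, and the Spark Kafka connector (spark-sql-kafka) is assumed to be on the classpath.

```python
# Illustrative Kafka-to-lake streaming pipeline with PySpark Structured Streaming.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
         .option("subscribe", "orders")                       # placeholder topic
         .load()
)

parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    parsed.writeStream.format("parquet")
          .option("path", "s3a://datalake/raw/orders/")                 # placeholder lake path
          .option("checkpointLocation", "s3a://datalake/checkpoints/orders/")
          .start()
)
query.awaitTermination()
```

The checkpoint location is what makes the stream restartable, which is the piece a CI/CD pipeline (Jenkins, GitLab CI, Airflow) would typically manage alongside the job deployment.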
Posted 1 week ago
4.0 - 7.0 years
8 - 9 Lacs
Bengaluru
Work from Office
Complete PMD tasks in CS Quotations from MATCH locations. Maintain Global PMD in SAP (MRP type, weight, dimensions, delivery time & origin, stackable). Process intercompany order requests. Manage information flow. Support SAS in PMD-related questions/tasks.
Required Candidate Profile: Understanding of technical/mechanical topics (machines, spare/wear parts); experience with data administration; good communication in English (must); MS Office; interpersonal/teamwork skills; customer focus; manufacturing background only.
Perks and benefits: Must have experience in PMD management in MNCs (manufacturing companies only).
Posted 1 week ago
5.0 - 8.0 years
5 - 12 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Detailed JD *(Roles and Responsibilities) Understand the current data governance structure of the organization and draft a data governance charter and operating model with roles and responsibilities on levels of operating model. Create a glossary with terms and definitions, mapping between logical elements and physical elements, and simple and complex relations for mapping. Set up Collibra communities, domains, types, attributes, status, articulation, workflow, and customize attribution. Identify and prioritize data domains for data governance based on business value and ease of implementation. Define roles and responsibilities governing communities and assets within the Collibra environment. Recommend and implement workflows to govern metadata. Engage with client SMEs to identify key business terms from shared documents to be showcased in Collibra as part of the business glossary. Identify key attributes like definition, criticality, security classification, purpose, etc., associated with the business terms. Create templates to gather the information required about business term attributes and technical metadata. Automate the manual data demand process by configuring and implementing workflows. Create end-to-end lineage in Collibra DGC based on the analysis performed and display the lineage in visual format for business users. Document best practices and provide training to stakeholders. Mandatory skills* Collibra is the mandatory skill
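For the template-creation task mentioned above, a small illustrative helper is sketched below that generates a business-glossary intake CSV for SMEs to complete before terms are loaded into Collibra; the attribute list mirrors the posting and is otherwise an assumption, not a Collibra import format.

```python
# Illustrative business-glossary intake template generator.
import csv

ATTRIBUTES = ["Business Term", "Definition", "Criticality", "Security Classification",
              "Purpose", "Data Owner", "Source System"]

def write_glossary_template(path: str, seed_terms: list[str]) -> None:
    """Write a CSV with one pre-seeded row per candidate business term."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(ATTRIBUTES)
        for term in seed_terms:
            writer.writerow([term] + [""] * (len(ATTRIBUTES) - 1))

write_glossary_template("glossary_intake.csv", ["Customer ID", "Settlement Date"])
```

The completed sheet would then be mapped to Collibra asset types and attributes, either through the import UI or a workflow.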
Posted 1 week ago
6.0 - 10.0 years
22 - 25 Lacs
Mumbai, Hyderabad
Work from Office
About the role: As Master Data Management Manager, you will manage a cluster of technology platforms, continuously evaluate technology solutions and induct an innovative technology stack to drive business excellence at ICICI Bank. You will work along with cross-functional business teams to create technology solutions by leveraging digital and data capabilities and inducting new-age technologies. In this role, you will have opportunities to ideate, develop, manage, maintain and improve our digital offerings as well as our internal tools/platforms.
Key Responsibilities
Design and Develop: Design and develop customized MDM code based on specifications. Customize MDM using features such as extensions, additions, business proxies, rules and services.
Support: Design, develop and conduct unit test cases for new releases of software components within the MDM repository.
Be Up-to-Date: Committed to learning and expanding professional and technical knowledge in master data management processes and tools, data modeling and data integration.
Key Qualifications & Skills
Educational Qualification: B.E/B.Tech/M.E/M.Tech with 6 to 10 years of relevant experience in IBM InfoSphere Master Data Management. Expertise in IBM's MDM version 11.x product, with hands-on implementation experience, preferably in the banking domain. Experience with IBM MDM customization, including extensions, additions, business proxies, SDP, match-merge rules and Event Manager.
Support: Provide support in understanding OSGi architecture in terms of MDM customization and deployment, and how to troubleshoot failures. Expert Java development background with RSA/RAD, MDM Workbench, SOAP web services, XML, XSD, WSDL.
Communication skills: Excellent oral and written communication skills.
Posted 1 week ago
10.0 - 13.0 years
12 - 15 Lacs
Bengaluru
Work from Office
Atos is a global leader in digital transformation and the European number one in Cloud, Cybersecurity and High-Performance Computing. The Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers, employees and members of society at large to live, work and develop sustainably in a safe and secure information space.
Role Overview: The Technical Architect has expertise in Google Cloud Platform (GCP) AI technologies, including Gemini. This role involves designing and implementing cutting-edge AI solutions that drive business transformation, scalability, and efficiency.
Responsibilities: Architect and implement AI-driven solutions using GCP services and Gemini technologies. Collaborate with stakeholders to understand business requirements and translate them into scalable, secure technical solutions. Design efficient architectures for AI applications, ensuring compliance with industry standards and best practices. Optimize AI workflows for performance, reliability, and cost-effectiveness. Establish and enforce best practices for AI development, deployment, and monitoring. Provide technical leadership and mentorship to teams working on AI projects. Stay updated with emerging trends in AI and cloud technologies, particularly within the GCP ecosystem. Troubleshoot and resolve complex technical issues related to AI systems. Document architectural designs and decisions for future reference.
Key Technical Skills & Responsibilities: Strategic AI leadership; advanced model development; Google Cloud architecture mastery; data science and engineering; prompt engineering expertise; responsible AI implementation and ethical AI practices; cross-functional collaboration; AI governance and compliance; strategic AI system design; programming mastery in Python; multi-agent orchestration using Vertex AI Agent Builder; cloud infrastructure expertise (Vertex ML, Google Cloud API management); integration and interoperability; performance optimization.
Eligibility Criteria: Bachelor's degree in Computer Science, Data Science, or a related field. Extensive experience with GCP AI services and Gemini technologies. Strong understanding of AI architecture, design, and optimization. Proficiency in programming languages such as Python and Java. Experience with cloud-based AI solutions is a plus. Familiarity with ethical AI principles and practices. Knowledge of data governance and compliance standards. Excellent problem-solving and analytical skills. Proven leadership and team management abilities. Ability to work in a fast-paced environment and manage multiple priorities.
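As a hedged sketch of calling Gemini through the Vertex AI Python SDK (google-cloud-aiplatform), the snippet below sends a single prompt; the project ID, region and model name are placeholders, and the SDK surface evolves, so the current Vertex AI documentation should be checked before relying on it.

```python
# Illustrative Gemini call via the Vertex AI Python SDK (placeholders throughout).
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")
response = model.generate_content(
    "Summarise the data-governance risks of storing customer PII in a shared dataset."
)
print(response.text)
```

In an architecture like the one described, such calls would sit behind an API management layer with quota, logging and responsible-AI guardrails rather than being invoked directly from client code.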
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
We are currently seeking a Data Privacy Specialist to join the CSC Enterprise Data Governance and Privacy team. As a Data Privacy Specialist, you will play a crucial role in ensuring the protection and compliance of personal data within our organization. You will be responsible for supporting the Global Data Privacy and Protection Office with developing, implementing, and maintaining data privacy policies and procedures to safeguard our data assets and ensure compliance with relevant regulations and standards.
Key Responsibilities: Data Privacy Compliance: Monitor and assess the organization's compliance with data privacy laws, regulations, and standards such as GDPR, CCPA, HIPAA, etc. Policy Development: Develop and maintain data privacy policies, procedures, and guidelines tailored to the organization's needs and regulatory requirements. Privacy Impact Assessments (PIAs): Conduct PIAs to identify and assess the potential privacy risks associated with new projects, systems, or processes, and recommend mitigation strategies. Records of Processing Activities (ROPAs): Develop and maintain the Record of Processing Activities in accordance with regulatory requirements; ensure ROPAs are updated regularly to reflect changes in data processing activities within the organization. Data Mapping and Inventory: Maintain an inventory of data assets, including personal and sensitive data, and ensure appropriate data mapping to understand data flows and identify privacy risks. Privacy Training and Awareness: Develop and deliver privacy training programs and awareness campaigns to educate employees about data privacy best practices and their responsibilities. Personal Data Incident/Breach Tracking: Develop and implement procedures for responding to data privacy incidents, including breach notification requirements, and coordinate incident response efforts as needed. Projects: Lead or represent Data Governance and Privacy as needed for project work.
Qualifications: Bachelor's degree in a relevant field such as Information Technology, Law, or Business Administration; Master's preferred. Privacy certification preferred (e.g., CIPM, CIPP/E). Overall 5-7 years of experience, including 3+ years of proven data privacy and protection experience. Knowledge of data privacy and protection regulations (e.g., GDPR, CCPA/CPRA) required. Strong analytical and problem-solving skills. Self-motivated with the ability to drive tasks to completion. Excellent communication and interpersonal skills. Intermediate level of proficiency with PC-based software programs and automated database management systems required (Excel, Access, Visio, PowerPoint). Demonstrated process improvement, workflow, benchmarking and/or evaluation of business processes.
Posted 1 week ago
4.0 - 8.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Job Title: Associate Specialist- Data Engineering Location: Bengaluru Shift : UK Shift About the Role: We are seeking a skilled and experienced Data Engineer to join our team and help build, optimize, and maintain data pipelines and architectures. The ideal candidate will have deep expertise in the Microsoft data engineering ecosystem, particularly leveraging tools such as Azure Data Factory , Databricks , Synapse Analytics , Microsoft Fabric , and a strong command of SQL , Python , and Apache Spark . Key Responsibilities: Design, develop, and optimize scalable data pipelines and workflows using Azure Data Factory , Synapse Pipelines , and Microsoft Fabric . Build and maintain ETL/ELT processes for ingesting structured and unstructured data from various sources. Develop and manage data transformation logic using Databricks (PySpark/Spark SQL) and Python . Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality data solutions. Ensure data quality, integrity, and governance across the data lifecycle. Implement monitoring and alerting for data pipelines to ensure reliability and performance. Work with Azure Synapse Analytics to build data models and enable analytics and reporting. Utilize SQL for querying and managing large datasets efficiently. Participate in data architecture discussions and contribute to technical design decisions. Required Skills and Qualifications: data engineering or a related field. Strong proficiency in the Microsoft Azure data ecosystem including: Azure Data Factory (ADF) Azure Synapse Analytics Microsoft Fabric Azure Databricks Solid experience with Python and Apache Spark (including PySpark). Advanced skills in SQL for data manipulation and transformation. Experience in designing and implementing data lakes and data warehouses. Familiarity with data governance, security, and compliance standards. Strong analytical and problem-solving skills. Excellent communication and collaboration abilities. Preferred Qualifications: Microsoft Azure certifications (e.g., Azure Data Engineer Associate). Experience with DevOps tools and CI/CD practices in data workflows. Knowledge of REST APIs and integration techniques. Background in agile methodologies and working in cross-functional teams.
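As a rough sketch of one way the incremental ELT described above is often implemented on Databricks, the snippet below upserts a staging extract into a lakehouse table with the Delta Lake Python API; the storage path, table and column names are invented, and a Delta-enabled Spark session is assumed.

```python
# Illustrative incremental upsert (merge) into a Delta table on Databricks.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-upsert").getOrCreate()

# Staging extract landed by Azure Data Factory or a Fabric pipeline (placeholder path).
updates = spark.read.parquet("abfss://staging@examplelake.dfs.core.windows.net/customers/")

target = DeltaTable.forName(spark, "curated.customers")

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```

Merging by business key keeps the curated table idempotent across pipeline reruns, which simplifies the monitoring and alerting mentioned in the responsibilities.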
Posted 1 week ago
8.0 - 13.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Number of Openings: 2. ECMS Request no: 530008 & 530010. Total Yrs. of Experience: 10+ yrs. Relevant Yrs. of Experience: 8+ yrs.
Job Description: Minimum 8 years of experience in the design, development, and deployment of large-scale, distributed environments. Development experience in ABAP, FPM, Web Dynpro, web services, Workflow, Fiori and BAdI. Expertise with SAP MDG configurations for data modelling, UI modelling, process modelling, rules and derivations, BRF, and replication configurations. Must have knowledge and experience of implementing multiple SAP MDG (Master Data Governance) architectures and business processes across multiple master data domains.
Mandatory skill: SAP MDG. Desired skills: SAP MDG. Domain: SAP MDG. Vendor billing rate: 12.5K to 12.7K. Precise work location: Offshore. BG check: Post onboarding. Delivery anchor for screening, interviews and feedback: saravanan.arumugam@infosys.com. Working in shifts (from standard daylight)? Normal shift.
Posted 1 week ago
10.0 - 15.0 years
12 - 17 Lacs
Chennai
Work from Office
We are looking for a BI Architect with 13+ years of experience to lead the design and implementation of scalable BI and data architecture solutions. The role involves driving data modeling, cloud-based pipelines, migration projects, and data lake initiatives using technologies like AWS, Kafka, Spark, SQL, and Python. Experience with EDW modeling and architecture is a strong plus. Key Responsibilities Design and develop scalable BI and data models to support enterprise analytics. Lead data platform migration from legacy BI systems to modern cloud architectures. Architect and manage data lakes, batch and streaming pipelines, and real-time integrations via Kafka and APIs. Support data governance, quality, and access control initiatives. Partner with data engineers, analysts, and business stakeholders to deliver reliable, high-performing data solutions. Contribute to architecture decisions and platform scalability planning Qualifications Should have 10-15 years of relevant experience. 10+ years in BI, data engineering, or data architecture roles. Proficiency in SQL, Python, Apache Spark, and Kafka. Strong hands-on experience with AWS data services (e.g., S3, Redshift, Glue, EMR). Track record of leading data migration and modernization projects. Solid understanding of data governance, security, and scalable pipeline design. Excellent collaboration and communication skills. Good to Have Experience with enterprise data warehouse (EDW) modeling and architecture. Familiarity with BI tools like Power BI, Tableau, Looker, or Quicksight. Knowledge of lakehouse, data mesh, or modern data stack concepts.
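As one hedged example of orchestrating the AWS-based pipelines mentioned above, the snippet below starts an AWS Glue ETL job from Python with boto3 and polls it to completion; the job name and region are placeholders.

```python
# Illustrative Glue job orchestration with boto3 (job name and region are placeholders).
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(JobName="curate_sales_daily")
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="curate_sales_daily", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"Glue job finished with state {state}")
        break
    time.sleep(30)  # poll every 30 seconds until the run reaches a terminal state
```

In a production architecture this polling loop would usually be replaced by an orchestrator (Step Functions, Airflow) or EventBridge notifications, but the API calls are the same building blocks.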
Posted 1 week ago
5.0 - 10.0 years
25 - 30 Lacs
Noida
Work from Office
Join us as Assistant Vice President - Data Analyst for the Financial Crime Operations Data Domain, to implement data quality processes and procedures, ensuring that data is reliable and trustworthy, and then extract actionable insights from it to help the organisation improve its operations and optimise resources.
Accountabilities: Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification. Execution of data cleansing and transformation tasks to prepare data for analysis. Designing and building data pipelines to automate data movement and processing. Development and application of advanced analytical techniques, including machine learning and AI, to solve complex business problems. Documentation of data quality findings and recommendations for improvement.
To Be Successful In This Role, You Should Have: Experience in Data Management and Data Governance, including records management. Ability to review business processes through a data lens and identify critical upstream and downstream components, especially in a financial services organisation; understanding of models, EUDAs, etc. Strong understanding of Data Governance, Data Quality & Controls, Data Lineage and Reference Data/Metadata Management, including relevant policies and frameworks. A clear understanding of the elements of an effective control environment, enterprise risk management framework, operational risk or other principal risk frameworks. Experience of managing stakeholders directly and indirectly, and across geographies and cultures. Strong understanding of and practical exposure to the application of BCBS 239 principles and related frameworks. Commercially astute; demonstrates a consultative, yet pragmatic approach with integrity to solving issues, focusing on areas of significance and value to the business. A strong understanding of the risk and control environment, control frameworks and operational risk, including understanding of second and third line functions and impact across people, process and technology. Strategic Leadership: Provide strategic direction and leadership for data analysis initiatives, ensuring alignment with organizational and program goals; a functional understanding of financial crime and fraud data domains would be preferred. Data Governance: Oversee data governance policies and procedures to ensure data integrity, security, and compliance with regulatory requirements. Stakeholder Collaboration: Collaborate with cross-functional teams to identify data needs and deliver actionable insights. Advanced Analytics: Utilize advanced analytical techniques and tools to extract meaningful insights from complex data sets and drive data-driven decision-making. Deliver best-in-class insights to enable stakeholders to make informed business decisions and support data quality issue remediation. Perform robust review and QA of key deliverables being sent out by the team to stakeholders. Demonstrate a collaborative communication style, promoting trust and respect with a range of stakeholders including Operational Risk, Chief Controls Office, Chief Data Office, Financial Crime Operations subject matter experts (SMEs), Risk Information Services and Technology.
Some Other Desired Skills Include: Graduate in any discipline. Effective communication and presentation skills. Experience in Data Management/Data Governance/Data Quality Controls, Governance, Reporting and Risk Management, preferably in a financial services organisation. Experience in Data Analytics and Insights (using the latest tools and techniques,
e.g. Python, Tableau, Tableau Prep, Power Apps, Alteryx), and analytics on structured and unstructured data. Experience with databases and data science/analytics tools and techniques such as SQL, AI and ML (on live projects, not just academic projects). Proficient in MS Office (PPT, Excel, Word & Visio). Comprehensive understanding of Risk, Governance and Control Frameworks and Processes.
Location: Noida.
Purpose of the role: To implement data quality processes and procedures, ensuring that data is reliable and trustworthy, then extract actionable insights from it to help the organisation improve its operations and optimise resources.
Accountabilities: Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification. Execution of data cleansing and transformation tasks to prepare data for analysis. Designing and building data pipelines to automate data movement and processing. Development and application of advanced analytical techniques, including machine learning and AI, to solve complex business problems. Documentation of data quality findings and recommendations for improvement.
Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Alternatively, an individual contributor will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information ('complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience). Influence or convince stakeholders to achieve outcomes.
All colleagues
will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship (our moral compass, helping us do what we believe is right). They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive (the operating manual for how we behave).
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Gurugram
Work from Office
Global Data Steward Role We are looking for a highly skilled and experienced Global Data Steward to join our team at AXALTA COATING SYSTEMS INDIA PRIVATE LIMITED. The ideal candidate will have 6-8 years of experience in data stewardship. Roles and Responsibility Develop and implement effective data stewardship strategies to ensure data quality and integrity. Collaborate with cross-functional teams to identify and prioritize data requirements. Design and maintain scalable and secure data architectures to support business growth. Ensure compliance with regulatory requirements and industry standards. Provide expert guidance on data management best practices to stakeholders. Analyze and resolve complex data-related issues to improve operational efficiency. Job Requirements Strong understanding of data stewardship principles and practices. Experience with data governance frameworks and regulations. Proficiency in data modeling, warehousing, and analytics tools. Excellent communication and collaboration skills. Ability to work in a fast-paced environment with multiple priorities. Strong problem-solving skills with attention to detail.
Posted 1 week ago
8.0 - 10.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Job Title: Data Management Program Lead
About Us: Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across the globe, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry.
MAKE AN IMPACT: Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.
Job Description: Data Management Program Lead
Mandate Skills: BA/PM in Data Management (the candidate should be both technical and functional in the BA/PM role). Location: Bangalore, Pune, Chennai, Hyderabad. Notice: Immediate to 30 days. Level: M3/M4.
Strong change and project management skills. Lead, influence and manage a wide range of operational resources in the execution of programme objectives. Robust planning, organisational and risk management skills to ensure project deliveries are on track. Stakeholder management, communications and reporting. Adept in negotiating and influencing at all levels, across geographies internationally. Ability to provide trackable updates, achievements and blockers to senior stakeholders, focusing on the business outcome of the programme. Proven ability to analyse, synthesize and take decisions / make timely recommendations on complex issues. Subject matter expertise required in more than one of the following areas: Data Management, Data Governance/Data Quality Implementation & Monitoring, Data Stewardship, Data Lineage. Ability to contextualise technical requirements and strategic objectives into executable business activities and deliverables. 8-10+ years of experience.
We offer:
- A work culture focused on innovation and creating lasting value for our clients and employees
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
- A diverse, inclusive, meritocratic culture
#LI-Hybrid #LI-RA1
Posted 1 week ago
7.0 - 10.0 years
9 - 12 Lacs
Chennai
Work from Office
As Data Governance Manager, based in Bucharest, Chennai or Monterrey, you will play a central role in FLS's digital transformation. You will ensure data is well-organized and used effectively by collaborating with stakeholders, setting standards, and tracking progress.
Your responsibilities
- Collaborate with business units, IT teams, and other stakeholders to understand data needs and establish governance requirements.
- Lead and improve data governance practices, ensuring that FLS's data is organized, governed, and used to drive impactful business transformation.
- Provide expert guidance on defining and implementing data standards, quality metrics, and governance frameworks.
- Track and report on master data governance progress, ensuring measurable outcomes and continuous improvement.
- Stay ahead of industry trends and best practices in master data management and data governance.
- Establish, define and document data governance policies and procedures.
- Drive cross-functional data forums for all data domains.
What you bring
- Strong understanding of data governance principles, best practices, and data quality management.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Ability to analyze complex data governance issues, identify root causes, and propose solutions to improve processes and mitigate risks.
- Attention to detail in ensuring data accuracy and compliance with governance standards.
- Ability to thrive in a fast-paced, dynamic environment and manage multiple priorities proactively.
- Experience with Microsoft Dynamics (CRM and ERP) would be an advantage.
- A master's degree or equivalent in IT, Data Management, Business Economics, or a related field.
- Fluency in English, as you will be joining an international team working across borders.
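As a rough illustration of the "track and report on governance progress" responsibility, the sketch below computes a simple pass-rate KPI per data domain against a target. Domain names, figures, and the threshold are invented for the example and do not reflect FLS's actual metrics or tooling.

```python
# Illustrative only: a hedged sketch of tracking master-data governance
# progress as a pass-rate KPI per data domain. All values are hypothetical.
from dataclasses import dataclass

@dataclass
class DomainKpi:
    domain: str
    records_total: int
    records_passing_rules: int  # records passing all active data-quality rules

    @property
    def pass_rate(self) -> float:
        return self.records_passing_rules / self.records_total if self.records_total else 0.0

kpis = [
    DomainKpi("Customer", 120_000, 111_600),
    DomainKpi("Vendor",    45_000,  38_250),
    DomainKpi("Material",  80_000,  79_200),
]

TARGET = 0.95  # hypothetical governance target agreed with stakeholders

# Report the weakest domains first so remediation effort goes where it matters.
for kpi in sorted(kpis, key=lambda k: k.pass_rate):
    status = "on track" if kpi.pass_rate >= TARGET else "needs action"
    print(f"{kpi.domain:<10} pass rate {kpi.pass_rate:.1%} ({status})")
```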
Posted 1 week ago
3.0 - 4.0 years
5 - 6 Lacs
Mumbai
Work from Office
- Qlik Development: Design, develop, and deploy QlikView and Qlik Sense applications, reports, and dashboards to meet business requirements.
- Data Integration: Collaborate with data engineers to integrate data from multiple sources (databases, APIs, flat files) into Qlik for reporting and analysis.
- Performance Tuning: Optimize Qlik applications for performance, focusing on reducing load times and improving user experience.
- User Support & Troubleshooting: Provide support for existing Qlik applications, resolving any technical issues and ensuring optimal performance.
- Collaborative Problem Solving: Work closely with business stakeholders to gather requirements and transform them into functional and technical specifications.
- Best Practices: Ensure that development follows Qlik best practices, including data modeling, data governance, and performance optimization.
- Testing & Documentation: Participate in the testing process and create comprehensive documentation for developed applications, including usage guides and technical specifications.
- Mentorship: Provide guidance and mentorship to junior developers, sharing knowledge on Qlik development best practices and approaches.
- Continuous Improvement: Stay up-to-date with the latest features and updates in Qlik technologies and actively contribute to improving the development processes.
Required Skills:
- Qlik Development: Design, develop, and deploy QlikView and Qlik Sense applications, reports, and dashboards to meet business requirements.
- Data Integration: Collaborate with data engineers to integrate data from multiple sources (databases, APIs, flat files) into Qlik for reporting and analysis.
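Reducing reload times is usually handled in Qlik's own load script (for example, incremental loads into QVDs). The sketch below shows the same incremental-extract idea generically in Python with SQLite, using a hypothetical sales table and high-water mark; it is an illustration of the pattern, not Qlik syntax.

```python
# Illustrative only: incremental extraction is a common way to cut reload
# times before data reaches a Qlik app. Qlik itself would express this in its
# load script; this is a generic Python/SQLite sketch of the idea.
# Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-05"), (3, 30.0, "2024-01-09")],
)

last_loaded = "2024-01-04"  # high-water mark kept from the previous reload

# Pull only rows changed since the last load, instead of a full extract.
new_rows = conn.execute(
    "SELECT id, amount, updated_at FROM sales WHERE updated_at > ?", (last_loaded,)
).fetchall()

print(new_rows)  # [(2, 20.0, '2024-01-05'), (3, 30.0, '2024-01-09')]
```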
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Kochi, Bengaluru
Work from Office
Overview:
As a Senior MDM Developer, you will play a critical role in designing, developing, and optimizing Master Data Management (MDM) solutions. You will work closely with business and technical teams to ensure data integrity, efficient integration, and compliance with enterprise standards. Your expertise in MDM platforms, data modeling, and integration technologies will be key to delivering high-quality solutions.
Key Responsibilities:
- Design, develop, and implement MDM solutions based on business requirements.
- Ensure data quality, consistency, and governance across multiple domains.
- Collaborate with architects and business analysts to define MDM strategies and best practices.
- Develop integrations between MDM platforms and enterprise applications using APIs and ETL tools.
- Optimize data models, workflows, and MDM performance for scalability and efficiency.
- Troubleshoot and resolve data-related issues, ensuring system reliability and integrity.
- Stay updated with emerging MDM technologies and trends to enhance technical capabilities.
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in MDM development and implementation.
- Hands-on experience with platforms such as Reltio, Informatica, Databricks, Azure, Oracle, and Snowflake.
- Strong expertise in data integration, ETL processes, and API development.
- Solid understanding of data governance, quality management, and compliance standards.
- Experience working with multiple data sources, country-specific data models, and life sciences MDM implementations.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
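To illustrate the kind of logic behind MDM development, here is a minimal survivorship sketch: source records are grouped on a normalised email and the most recently updated record wins each group. The match and survivorship rules, field names, and sample data are hypothetical and are not the API of Reltio, Informatica, or any other platform listed above.

```python
# Illustrative only: a minimal "golden record" survivorship sketch of the kind
# an MDM developer might prototype. Rules and data are hypothetical.
from collections import defaultdict

source_records = [
    {"source": "CRM", "email": "Ada@Example.com ", "name": "Ada L.", "updated": "2024-03-01"},
    {"source": "ERP", "email": "ada@example.com",  "name": "Ada Lovelace", "updated": "2024-05-10"},
    {"source": "CRM", "email": "bob@example.com",  "name": "Bob", "updated": "2024-04-02"},
]

def match_key(rec: dict) -> str:
    """Hypothetical match rule: case-insensitive, trimmed email."""
    return rec["email"].strip().lower()

groups: dict[str, list[dict]] = defaultdict(list)
for rec in source_records:
    groups[match_key(rec)].append(rec)

# Survivorship rule: the most recently updated source record wins each group
# (ISO date strings compare correctly as plain strings).
golden = [max(g, key=lambda r: r["updated"]) for g in groups.values()]
print(golden)
# [{'source': 'ERP', 'email': 'ada@example.com', ...}, {'source': 'CRM', 'email': 'bob@example.com', ...}]
```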
Posted 1 week ago