
866 Collibra Jobs - Page 16

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

Data Catalog Lead (Collibra)

Position Overview: We are seeking an experienced Data Catalog Lead to drive the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and the multi-system data integration challenges unique to health plan operations.

Key Responsibilities:

Data Catalog Implementation & Development
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements
- Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms

Healthcare-Specific Data Governance
- Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment)

Technical Integration & Automation
- Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata
- Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms (see the illustrative API sketch after this posting)

Required Qualifications:

Collibra Platform Expertise
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup
- Experience with Collibra Connect for automated metadata harvesting and system integration
- Strong understanding of Collibra's REST APIs and custom development capabilities

Healthcare Payer Industry Knowledge
- 4+ years of experience working with healthcare payer/health plan data environments
- Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics
- Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care)
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT)

Technical Skills
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems)
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services
- Understanding of data modeling principles and healthcare data warehouse design patterns

Data Governance & Compliance
- Experience implementing data governance frameworks in regulated healthcare environments
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools
- Understanding of data classification, data quality management, and master data management principles
- Experience with audit trail requirements and compliance reporting in healthcare organizations

Preferred Qualifications:

Advanced Healthcare Experience
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms)
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations
- Understanding of value-based care arrangements and their data requirements
- Experience with clinical data integration and population health analytics

Technical Certifications & Skills
- Collibra certification (Data Citizen, Data Steward, or Technical User)
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog)
- Knowledge of data virtualization tools and their integration with data catalog platforms
- Experience with healthcare interoperability standards and API management
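The Collibra REST integration work above is worth grounding with an example. The following Python sketch registers a claims table as a catalog asset and tags it as PHI. It is a minimal illustration only: the endpoint paths and payload fields follow Collibra's Core REST API v2 as generally documented, but the instance URL, credentials, and every UUID are placeholders, so verify all of it against your own instance's API reference.

```python
"""Hedged sketch: register a healthcare asset in Collibra and tag it as PHI."""
import requests

COLLIBRA_URL = "https://yourinstance.collibra.com"  # placeholder instance
AUTH = ("svc_catalog_user", "change-me")            # placeholder credentials

CLAIMS_DOMAIN_ID = "00000000-0000-0000-0000-000000000001"            # placeholder UUID
TABLE_TYPE_ID = "00000000-0000-0000-0000-000000000002"               # placeholder UUID
PHI_FLAG_ATTRIBUTE_TYPE_ID = "00000000-0000-0000-0000-000000000003"  # placeholder UUID


def register_asset(name: str) -> str:
    """Create a catalog asset for a claims table and return its id."""
    resp = requests.post(
        f"{COLLIBRA_URL}/rest/2.0/assets",
        auth=AUTH,
        json={"name": name, "domainId": CLAIMS_DOMAIN_ID, "typeId": TABLE_TYPE_ID},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]


def tag_as_phi(asset_id: str) -> None:
    """Attach a PHI sensitivity attribute so HIPAA controls can key off it."""
    resp = requests.post(
        f"{COLLIBRA_URL}/rest/2.0/attributes",
        auth=AUTH,
        json={"assetId": asset_id, "typeId": PHI_FLAG_ATTRIBUTE_TYPE_ID, "value": True},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    asset_id = register_asset("CLAIMS_PROFESSIONAL_837P")  # illustrative table name
    tag_as_phi(asset_id)
    print(f"Registered and tagged asset {asset_id}")
```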

Posted 2 months ago

Apply

5.5 - 9.9 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: AWS Data Engineer - Senior Associate
Experience: 5.5 - 9.9 years
Key Skills: AWS
Educational Qualification: BE / B.Tech / ME / M.Tech / MBA
Work Location: Bangalore

Job Description
As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and address development areas
- Be flexible to work in stretch opportunities/assignments
- Demonstrate critical thinking and the ability to bring order to unstructured problems
- Review ticket quality and deliverables; handle status reporting for the project
- Adhere to SLAs; experience in incident management, change management, and problem management
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives
- Use straightforward, structured communication when influencing and connecting with others
- Read situations and modify behavior to build quality relationships
- Uphold the firm's code of ethics and business conduct
- Demonstrate leadership capabilities by working with clients directly and leading the engagement
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration
- Be a good team player; take up cross-competency work and contribute to CoE activities
- Handle escalation and risk management

Position Requirements

Required Skills - AWS Cloud Engineer
The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
- Minimum 5 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms
- Minimum 3-4 years of Operate/Managed Services/Production Support experience
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes
- Experience building efficient ETL/ELT processes using industry-leading tools such as AWS, AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. (see the illustrative Glue job sketch after this posting)
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS
- Work with data scientists and analysts to understand data needs and create effective data workflows
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
- Experience building and maintaining data governance solutions (data quality, metadata management, lineage, master data management, and data security) using industry-leading tools
- Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments
- Hands-on experience with data analytics tools such as Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
- Experience with ITIL processes such as incident management, problem management, knowledge management, release management, Data DevOps, etc.
- Strong communication, problem-solving, quantitative, and analytical abilities

Nice to have: AWS certification

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients' data and analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions, so that they can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective. As a member of our Data, Analytics & Insights Managed Services team, we are looking for candidates who thrive in a high-paced work environment and can work on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
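For a sense of the pipeline work this role describes, here is a minimal AWS Glue (PySpark) job sketch that reads a raw table from the Glue Data Catalog, applies basic cleansing, and writes curated Parquet to S3. The database, table, and bucket names are placeholders, and the cleansing rules are illustrative only.

```python
"""Hedged sketch of an AWS Glue ETL job: catalog read -> cleanse -> Parquet."""
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_ctx = GlueContext(sc)
spark = glue_ctx.spark_session
job = Job(glue_ctx)
job.init(args["JOB_NAME"], args)

# Read the raw table from the Glue Data Catalog (database/table are placeholders)
claims = glue_ctx.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="claims"
).toDF()

# Standardize and validate before loading downstream (rules are illustrative)
clean = (
    claims
    .dropDuplicates(["claim_id"])
    .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
    .filter(F.col("claim_amount") >= 0)
)

# Write curated output as date-partitioned Parquet (bucket path is a placeholder)
clean.write.mode("overwrite").partitionBy("service_date").parquet(
    "s3://example-curated-bucket/claims/"
)
job.commit()
```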

Posted 2 months ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

As a member of the Sanctions team within the Global Financial Crimes (GFC) program at FIS, you will play a key role in supporting the Senior Director to establish a strong tech and data knowledge base, as well as assisting in system implementations across Global Financial Crimes Compliance. Your responsibilities will include writing documentation on sanctions workflows, standards, guidelines, testing procedures, taxonomies, and operating procedures. You will also develop and optimize complex SQL queries to extract, manipulate, and analyze large volumes of financial data and to ensure data accuracy and integrity (see the illustrative query after this posting). Additionally, you will create and maintain comprehensive data lineage documentation, contribute to the development and maintenance of master data management processes, and generate regular reporting on Financial Crimes Data Governance KPIs, metrics, and activities. Your role will involve monitoring LOB compliance activities, verifying that regulatory compliance deadlines are met, and tracking product data compliance deficiencies to completion.

To excel in this role, you should possess a Bachelor's or Master's degree in a relevant field such as Computer Science, Statistics, or Engineering, along with 1-3 years of experience in the regulatory compliance field. Previous experience as a Data Analyst in the financial services industry, particularly with a focus on Financial Crimes Compliance, is highly desirable. Proficiency in SQL and data analysis tools, and experience with data governance practices, are essential, as are strong analytical, problem-solving, and communication skills. Experience in regulatory oversight of high-risk product lines containing complex banking functions, or subject matter expertise in sanctions regulatory compliance, would be an added bonus.

At FIS, we offer a flexible and creative work environment, a diverse and collaborative atmosphere, professional and personal development resources, opportunities for volunteering and supporting charities, and competitive salary and benefits. FIS is committed to protecting the privacy and security of all personal information processed to provide services to clients. Our recruitment model primarily relies on direct sourcing, and we do not accept resumes from recruitment agencies that are not on our preferred supplier list. We take pride in our commitment to diversity, inclusion, and professional growth, and we invite you to be part of our team to advance the world of fintech.
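The SQL work described above often boils down to reconciliation and completeness checks. The self-contained Python sketch below uses an in-memory SQLite database with toy tables; the schemas and data are invented for illustration and do not reflect FIS's actual data model.

```python
"""Hedged sketch: a completeness check joining transactions to screening results."""
import sqlite3

# In-memory stand-ins for transaction and screening tables (schemas illustrative)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transactions (txn_id TEXT PRIMARY KEY, amount REAL, counterparty TEXT);
CREATE TABLE screening_results (txn_id TEXT, hit INTEGER);
INSERT INTO transactions VALUES
    ('T1', 150.0, 'ACME'), ('T2', 99.5, 'GLOBEX'), ('T3', 12.0, 'INITECH');
INSERT INTO screening_results VALUES ('T1', 0), ('T2', 1);
""")

# Integrity rule: every transaction must have a screening result (anti-join)
unscreened = conn.execute("""
    SELECT t.txn_id, t.counterparty
    FROM transactions t
    LEFT JOIN screening_results s ON s.txn_id = t.txn_id
    WHERE s.txn_id IS NULL
""").fetchall()

print("Transactions missing screening results:", unscreened)  # [('T3', 'INITECH')]
```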

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 4 Lacs

Hyderabad

Remote

Job description: We have a vacancy with the below details.

Role: Analyst, Data Sourcing Metadata - Cloud
Designation: Analyst
Experience: 1-4 years
Notice Period: Immediate to 60 days (currently serving)
Work Mode: WFH (Remote)
Working Days: 5 days
Mandatory Skills: Data Management, SQL, cloud tools (AWS/Azure/GCP), ETL tools (Ab Initio, Collibra, Informatica), Data Catalog, Data Lineage, Data Integration, Data Dictionary, Maintenance, RCA, Issue Analysis

Required Skills/Knowledge:
- Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience
- Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure
- Basic understanding of metadata management concepts; familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra); basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra); and understanding of data integration technologies (e.g., ETL, APIs, data pipelines) - a metadata-harvesting sketch follows this posting
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail

Desired Characteristics:
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics - Specialty
- Familiarity with hybrid cloud environments (combination of cloud and on-prem)
- Skilled in Ab Initio Metadata Hub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center, and Express>IT
- Experience harvesting technical lineage and producing lineage diagrams
- Familiarity with Unix, Linux, and Stonebranch, and with database platforms such as Oracle and Hive
- Basic knowledge of SQL and data query languages for managing and retrieving metadata
- Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance)
- Familiarity with Collibra
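As an illustration of the metadata-harvesting side of this role, the following sketch walks the AWS Glue Data Catalog with boto3 and emits (table, column, type) rows suitable for a data dictionary. It assumes AWS credentials are already configured; the region and database name are placeholders.

```python
"""Hedged sketch: harvest table/column metadata from the AWS Glue Data Catalog."""
import boto3

glue = boto3.client("glue", region_name="us-east-1")  # region is a placeholder


def harvest_catalog(database: str):
    """Yield (table, column, type) rows for a data dictionary."""
    paginator = glue.get_paginator("get_tables")
    for page in paginator.paginate(DatabaseName=database):
        for table in page["TableList"]:
            for col in table["StorageDescriptor"]["Columns"]:
                yield table["Name"], col["Name"], col.get("Type", "unknown")


for row in harvest_catalog("analytics_db"):  # database name is a placeholder
    print(row)
```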

Posted 2 months ago

Apply

1.0 - 4.0 years

0 - 2 Lacs

Hyderabad

Remote

Job description: We have a vacancy with the below details.

Role: Analyst, Data Sourcing Metadata
Designation: Analyst
Experience: 1-4 years
Notice Period: Immediate to 60 days (currently serving)
Work Mode: WFH (Remote)
Working Days: 5 days
Mandatory Skills: Data Management, Collibra, Collibra BPM, Business Process Management, SQL, Groovy Scripting

Required Skills/Knowledge:
- Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience
- Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure
- Basic understanding of metadata management concepts; familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra); basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra); and understanding of data integration technologies (e.g., ETL, APIs, data pipelines) - a sketch of triggering a Collibra BPM workflow programmatically follows this posting
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail

Desired Characteristics:
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics - Specialty
- Familiarity with hybrid cloud environments (combination of cloud and on-prem)
- Skilled in Ab Initio Metadata Hub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center, and Express>IT
- Experience harvesting technical lineage and producing lineage diagrams
- Familiarity with Unix, Linux, and Stonebranch, and with database platforms such as Oracle and Hive
- Basic knowledge of SQL and data query languages for managing and retrieving metadata
- Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance)
- Familiarity with Collibra
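Collibra BPM workflows can be started programmatically as well as from the UI. In the sketch below, the endpoint path and payload fields are assumptions based on Collibra's Core REST API v2 and should be verified against your instance's API documentation; the URL, credentials, and UUIDs are placeholders.

```python
"""Hedged sketch: start a Collibra BPM workflow (e.g. a data-issue review)
against a catalog asset. Endpoint and payload are assumptions to verify."""
import requests

COLLIBRA_URL = "https://yourinstance.collibra.com"  # placeholder instance
AUTH = ("svc_workflow_user", "change-me")           # placeholder credentials

resp = requests.post(
    f"{COLLIBRA_URL}/rest/2.0/workflowInstances",  # assumed v2 endpoint path
    auth=AUTH,
    json={
        "workflowDefinitionId": "00000000-0000-0000-0000-0000000000aa",  # placeholder
        "businessItemIds": ["00000000-0000-0000-0000-0000000000bb"],     # placeholder asset
        "businessItemType": "ASSET",
    },
    timeout=30,
)
resp.raise_for_status()
print("Started workflow instance(s):", resp.json())
```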

Posted 2 months ago

Apply

7.0 years

7 - 8 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Associate Analyst - MDM Operations
Location: Bangalore / Pune
Shift: Open to shifts

Key Responsibilities
- Master data maintenance & cleansing (Vendor, Customer, Material, Pricing) in SAP/S4
- Execute create, update, and change processes for master data
- Perform data audits, validations, and reconciliation (see the illustrative validation sketch after this posting)
- Support data migration using LSMW, LTMC, Winshuttle
- Prepare data design documents (data models, standards, CRUD matrix)
- Collaborate on projects and process improvement initiatives

Skills & Experience
- 2-7 years in Master Data Management
- Strong SAP functional knowledge (tables, T-codes, MDG)
- Familiar with ETL, data transformation & error handling
- Advanced MS Excel skills (pivots, INDEX MATCH, etc.)
- Exposure to MDM tools like Stibo, Collibra, Informatica (optional)
- Knowledge of the P2P process (added advantage)

Education
- BE in Mechanical, Electrical, or Electronics

Skills: ETL, data reconciliation, Excel, data migration, standards, LSMW, CRUD matrix, LTMC, SAP functional, SAP, master data maintenance, Winshuttle, SAP/S4, error handling, data transformation, data audits, data validations, data models, MDM tools, P2P process, SAP functional knowledge, cleansing, master data management, advanced MS Excel, MDM operations, data design documents
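Much of the audit and validation work above amounts to duplicate and completeness checks over master data extracts. Here is a minimal pandas sketch of the idea, using an invented vendor extract whose columns merely gesture at SAP vendor master fields.

```python
"""Hedged sketch: duplicate and completeness checks on a toy vendor extract."""
import pandas as pd

# Toy vendor master extract (columns are illustrative, not an SAP schema)
vendors = pd.DataFrame({
    "vendor_id": ["V001", "V002", "V002", "V003"],
    "name": ["Acme Corp", "Globex", "GLOBEX", "Initech"],
    "tax_id": ["12-345", "98-765", "98-765", None],
})

# Duplicate detection: the same tax_id appearing on multiple rows
dupes = vendors[
    vendors.duplicated(subset=["tax_id"], keep=False) & vendors["tax_id"].notna()
]

# Completeness audit: mandatory fields that are missing
missing_tax = vendors[vendors["tax_id"].isna()]

print("Possible duplicates:\n", dupes)
print("Missing tax IDs:\n", missing_tax)
```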

Posted 2 months ago

Apply

5.0 years

0 Lacs

India

On-site

About Billigence: Billigence is a boutique data consultancy with a global reach and diverse clientele, transforming how organizations leverage data. We utilize cutting-edge technologies to design, tailor, and implement advanced Business Intelligence solutions with high added value across a wide range of applications, from process digitization to cloud data warehousing, visualization, data science, engineering, and data governance.

About the Role: We are seeking an experienced Data Architect to lead data architecture and modeling, with a focus on Snowflake and Matillion. The ideal candidate is proactive, self-directed, passionate, and a team player.

What You'll Do:
- Design and implement end-to-end data architecture leveraging Snowflake and Matillion for data ingestion, transformation, and storage (see the illustrative modeling sketch after this posting)
- Define and maintain data modeling standards, data flows, and architecture best practices (dimensional, normalized, star/snowflake schemas)
- Lead the evaluation and adoption of tools in the modern data stack, ensuring alignment with business goals and future scalability
- Collaborate with data engineers, analysts, and stakeholders to define data requirements and create robust data pipelines
- Ensure data security, access controls, and compliance with governance frameworks (e.g., GDPR, HIPAA, SOC 2)
- Optimize Snowflake performance through clustering, caching, query tuning, and cost management
- Oversee data integration strategies from multiple sources (APIs, databases, flat files, third-party platforms)
- Establish data quality and metadata management practices across platforms
- Act as a subject matter expert and provide guidance on Matillion orchestration, reusable job frameworks, and job performance optimization

What You'll Need:

Required Qualifications
- 5+ years of experience in data engineering, architecture, or similar roles
- Proven expertise with Snowflake (multi-cluster warehouse design, role-based access, data sharing, etc.)
- Strong experience designing and implementing pipelines with Matillion ETL
- Proficiency in data modeling and building scalable cloud data platforms
- Deep understanding of ELT/ETL design patterns and orchestration principles
- Strong SQL and scripting (e.g., Python or Bash) skills
- Hands-on experience with cloud platforms (preferably AWS or Azure)
- Familiarity with CI/CD, version control (Git), and infrastructure-as-code (e.g., Terraform, CloudFormation)

Preferred Qualifications
- Experience with dbt, Airflow, or other orchestration and transformation tools
- Knowledge of BI/reporting tools (e.g., Power BI, Tableau, Looker)
- Familiarity with data governance and data catalog solutions (e.g., Alation, Collibra, Atlan)
- Background in supporting machine learning platforms and real-time data pipelines
- Industry experience in [finance, healthcare, retail, etc. - can be customized]

Nice to Have
- Very good communication skills and a can-do attitude
- Very good analytical skills and a structured, detail-oriented approach
- Organizational skills; team player
- Ability to work independently, with self-motivation and self-confidence
- Comfortable working in a fast-paced environment
- Ability to manage time efficiently to avoid over-commitment
- Strong customer-centric mindset
- Fluent English (written and spoken)
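As a flavor of the Snowflake modeling and optimization work listed above, here is a hedged sketch using the snowflake-connector-python package: it creates a clustered fact table and inspects clustering health. The account, credentials, and object names are placeholders (the script will not connect without a real account), but the DDL itself is standard Snowflake SQL.

```python
"""Hedged sketch: a clustered fact table plus a clustering-health check."""
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="change-me",  # placeholders
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Dimensional model: a fact table clustered on its most common filter column
cur.execute("""
    CREATE TABLE IF NOT EXISTS FCT_ORDERS (
        ORDER_ID NUMBER,
        CUSTOMER_KEY NUMBER,
        ORDER_DATE DATE,
        AMOUNT NUMBER(12,2)
    ) CLUSTER BY (ORDER_DATE)
""")

# Inspect clustering health to decide whether reclustering is worthwhile
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('FCT_ORDERS', '(ORDER_DATE)')")
print(cur.fetchone()[0])

cur.close()
conn.close()
```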

Posted 2 months ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

JOB_POSTING-3-72576
Role Title: Analyst, Analytics - Data Quality Developer (L08)

Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise, and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet, and more. We have recently been ranked #2 among India's Best Companies to Work For by Great Place to Work. We were among the Top 50 India's Best Workplaces in Building a Culture of Innovation by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and among Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer flexibility and choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being, along with career advancement and upskilling opportunities focused on advancing diverse talent into leadership roles.

Organizational Overview: Our Analytics organization comprises data analysts who focus on enabling strategies to enhance customer and partner experience and optimize business performance through data management and the development of full-stack descriptive-to-prescriptive analytics solutions using cutting-edge technologies, thereby enabling business growth.

Role Summary/Purpose: The Analyst, Analytics - Data Quality Developer (Individual Contributor) role is located in the India Analytics Hub (IAH) as part of Synchrony's enterprise Data Office. This role is responsible for the proactive design, implementation, execution, and monitoring of Data Quality process capabilities within Synchrony's public and private cloud and on-prem environments within the Chief Data Office. The Data Quality Developer - Analyst will work within the IT organization to support and participate in build and run activities and environments (e.g., DevOps) for Data Quality.

Key Responsibilities
- Monitor and maintain Data Quality and Data Issue Management operating level agreements in support of data quality rule execution and reporting (a sketch of simple rule checks follows this posting)
- Assist in performing root cause analysis for data quality issues and data usage challenges, particularly for the workload migration to the public cloud
- Recommend, design, implement, and refine/remediate data quality specifications within Synchrony's approved Data Quality platforms
- Participate in the solution design of data quality and data issue management technical and procedural solutions, including metric reporting
- Work closely with Technology teams and key stakeholders to ensure data quality issues are prioritized, analyzed, and addressed
- Regularly communicate the state of data quality issues and progress to key stakeholders
- Participate in the planning and execution of agile release cycles and iterations

Qualifications/Requirements
- Minimum of 1 year of experience in data quality management, including implementing data quality rules, data profiling, and root cause analysis for data issues, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure
- Minimum of 1 year of experience with data quality or data integration tools such as Ab Initio, Informatica, Collibra, Stonebranch, or Tableau, gained through hands-on experience or projects
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail

Desired Characteristics
- Broad understanding of banking, credit card, payment solutions, collections, marketing, risk, and regulatory & compliance
- Experience using data governance and data quality tools such as Collibra, Ab Initio Express>IT, and Ab Initio MetaHub
- Proficient in writing/understanding SQL
- Experience querying/analyzing data in cloud-based environments (e.g., AWS, Redshift)
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics - Specialty
- Intermediate to advanced MS Office Suite skills including PowerPoint, Excel, Access, and Visio
- Strong relationship management and influencing skills to build enduring and productive alliances across matrix organizations
- Demonstrated success in managing multiple deliverables concurrently, often within aggressive timeframes; ability to cope under time pressure
- Experience partnering with a diverse team composed of staff and consultants located in multiple locations and time zones

Eligibility Criteria: Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.

Work Timings: This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and requires the incumbent to be available between 06:00 AM and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and adjust twice a year locally). This window is for meetings with India and US teams; the remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs; please discuss with the hiring manager for details.

For Internal Applicants
- Understand the criteria and mandatory skills required for the role before applying
- Inform your manager and HRM before applying for any role on Workday
- Ensure that your professional profile is updated (education, prior experience, other skills) and upload your updated resume (Word or PDF format)
- Must not be on any corrective action plan (Formal/Final Formal) or PIP
- L4 to L7 employees who have completed 12 months in the organization and 12 months in their current role and level are eligible
- L8+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible

Grade/Level: 08
Job Family Group: Information Technology
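Data quality rule execution and metric reporting of the kind described can be prototyped in a few lines of pandas. The rules and sample data below are invented for illustration; in production this logic would live inside an approved DQ platform rather than an ad hoc script.

```python
"""Hedged sketch: declarative DQ rules evaluated over a toy extract."""
import pandas as pd

# Sample extract standing in for a migrated cloud table (schema illustrative)
df = pd.DataFrame({
    "account_id": ["A1", "A2", None, "A4"],
    "credit_limit": [5000, -100, 3000, 7000],
})

# Declarative DQ rules: rule name -> boolean Series flagging failing rows
rules = {
    "account_id_not_null": df["account_id"].isna(),
    "credit_limit_non_negative": df["credit_limit"] < 0,
}

# Emit one metric per rule, mirroring the reporting an OLA would track
for rule, failures in rules.items():
    rate = failures.mean()
    status = "PASS" if rate == 0 else "FAIL"
    print(f"{rule}: {status} ({failures.sum()} failing rows, {rate:.1%})")
```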

Posted 2 months ago

Apply

5.0 - 7.0 years

4 - 7 Lacs

Cochin

On-site

5 - 7 Years | 2 Openings | Kochi

Role Description

Role Proficiency: Act creatively to develop applications, selecting appropriate technical options and optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with the specifications
- Code, debug, test, document, and communicate product/component/feature development stages
- Validate results with user representatives; integrate and commission the overall solution
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions
- Optimize efficiency, cost, and quality
- Influence and improve customer satisfaction
- Set FAST goals for self/team; provide feedback on FAST goals of team members

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: code as per design; follow coding standards, templates, and checklists; review code for team and peers
- Documentation: create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, and requirements, test cases, and results
- Configure: define and govern the configuration management plan; ensure compliance from the team
- Test: review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team
- Domain relevance: advise software developers on the design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications
- Manage project: manage delivery of modules and/or manage user stories
- Manage defects: perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality
- Estimate: create and provide input for effort estimation for projects
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team
- Release: execute and monitor the release process
- Design: contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components, and data models
- Interface with customer: clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos
- Manage team: set FAST goals and provide feedback; understand aspirations of team members and provide guidance and opportunities; ensure the team is engaged in the project
- Certifications: take relevant domain/technology certifications

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time and effort required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project-related challenges
- Manage a team; mentor and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team; provide feedback to team members
- Create and articulate impactful technical presentations
- Follow a high level of business etiquette in emails and other business communication
- Drive conference calls with customers, addressing customer questions
- Proactively ask for and offer help
- Ability to work under pressure, determine dependencies and risks, and facilitate planning while handling multiple tasks
- Build confidence with customers by meeting deliverables on time and with quality
- Estimate time, effort, and resources required for developing/debugging features/components
- Make appropriate utilization of software/hardware
- Strong analytical and problem-solving abilities

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages - proficient in multiple skill clusters
- DBMS, operating systems, and software platforms
- Software Development Life Cycle
- Agile - Scrum or Kanban methods
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments:

Role Overview: As a Data Quality Integration Engineer, you will play a key role in embedding data quality capabilities within our enterprise data landscape. You will focus on integrating industry-leading Data Quality tools (such as Ataccama and/or Collibra) with core platforms such as Snowflake and SQL databases, ensuring data quality processes are robust, scalable, and well governed.

Key Responsibilities:
- Design, implement, and support the integration of Data Quality tools (Ataccama, Collibra, or similar) with Snowflake and SQL-based data platforms (a push-down profiling sketch follows this posting)
- Develop and maintain data pipelines and connectors enabling automated data quality assessments, profiling, cleansing, and monitoring
- Collaborate with Data Governance, Data Architecture, and Data Engineering teams to align integration designs with business and governance requirements
- Configure and manage Data Quality rules and workflows, ensuring alignment with business data quality KPIs, risk controls, and governance policies
- Document integration solutions, workflows, and technical decisions to facilitate transparency and knowledge sharing
- Troubleshoot integration issues, monitor performance, and optimise connector reliability and efficiency
- Support user training and the adoption of data quality processes and tooling

Required Skills and Experience:
- Proven experience integrating Data Quality tools into enterprise data environments, ideally with Ataccama and/or Collibra
- Hands-on experience with the Snowflake data warehousing platform and SQL databases (e.g., MS SQL Server, PostgreSQL, Oracle, MySQL)
- Strong SQL scripting and data pipeline development skills (preferably with Python, Scala, or similar)
- Thorough understanding of data quality concepts, including profiling, cleansing, enrichment, and monitoring
- Experience working with data integration technologies and ETL frameworks
- Knowledge of data governance frameworks, standards, and best practices
- Familiarity with REST APIs and metadata integration is highly desirable
- Experience working in financial services, asset management, or highly regulated sectors is advantageous
- Strong documentation and communication skills

Desirable:
- Certification in Ataccama, Collibra, Snowflake, or related data governance/data engineering technologies
- Experience with cloud platforms (AWS, Azure) hosting Snowflake and data tooling
- Prior experience supporting data stewardship or data governance initiatives

Skills: Data Quality, Snowflake, SQL Database

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
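Integrating a DQ tool with Snowflake or another SQL platform usually means pushing profiling work down to the database. The helper below builds a single-pass profiling query (row count, null counts, distinct counts) of the sort a DQ tool would generate; the table and column names in the usage line are placeholders.

```python
"""Hedged sketch: build a push-down profiling query for any SQL platform."""


def profiling_sql(table: str, columns: list[str]) -> str:
    """Build a single-pass profiling query: row count, nulls, distinct counts."""
    metrics = ["COUNT(*) AS row_count"]
    for col in columns:
        metrics.append(
            f"SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END) AS {col}_nulls"
        )
        metrics.append(f"COUNT(DISTINCT {col}) AS {col}_distinct")
    return f"SELECT {', '.join(metrics)} FROM {table}"


# Table and column names are placeholders for illustration
print(profiling_sql("ANALYTICS.MARTS.CUSTOMERS", ["customer_id", "email"]))
```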

Posted 2 months ago

Apply

10.0 years

7 - 9 Lacs

Gurgaon

On-site

Overview: Keysight is at the forefront of technology innovation, delivering breakthroughs and trusted insights in electronic design, simulation, prototyping, test, manufacturing, and optimization. Our ~15,000 employees create world-class solutions in communications, 5G, automotive, energy, quantum, aerospace, defense, and semiconductor markets for customers in over 100 countries. Our award-winning culture embraces a bold vision of where technology can take us and a passion for tackling challenging problems with industry-first solutions. We believe that when people feel a sense of belonging, they can be more creative, innovative, and thrive at all points in their careers.

We are seeking a Lead Data Engineer to design, build, and optimize enterprise-grade data pipelines and platforms that serve as the foundation for advanced analytics, reporting, and AI solutions. This role combines technical leadership, architecture oversight, and hands-on development, with a strong focus on quality, scalability, and stakeholder partnership.

Responsibilities:

1. Data Engineering
- Architect and implement data pipelines, data marts, and transformation layers using Snowflake as the primary cloud data platform
- Develop and optimize ELT workflows with tools such as Matillion, dbt, or custom Python-based frameworks
- Design Snowflake schemas, manage data modeling (star/snowflake), and performance-tune queries, warehouses, and storage

2. Technical Leadership
- Set technical standards and provide leadership in building modular, reusable data engineering components in Snowflake
- Lead code reviews, mentor data engineers, and guide development on CI/CD pipelines, version control, and Snowflake DevOps
- Own the implementation of multi-environment Snowflake development frameworks for dev/test/prod promotion and change control

3. Platform Optimization & Monitoring
- Implement and monitor Resource Monitors, Query Profiles, and warehouse auto-scaling to optimize cost and performance (see the illustrative sketch after this posting)
- Establish usage reporting, job monitoring, and error alerting using Snowflake native capabilities and integrated observability tools

4. Governance, Security, and Data Quality
- Define and enforce role-based access control (RBAC), data masking, and secure data sharing within and across domains
- Support the implementation of data contracts, lineage, and data quality frameworks (e.g., Great Expectations)
- Collaborate with Data Governance teams to align platform use with compliance and policy requirements

5. Business Partnership & Collaboration
- Engage with product owners, analysts, and data scientists to deliver trusted, high-performance datasets
- Translate business requirements into technical Snowflake data solutions
- Serve as a Snowflake Data SME for cross-functional teams and stakeholders

Qualifications:

Required:
- 10+ years of experience in data engineering, with 3+ years of hands-on work in Snowflake
- Proficiency in SQL and Python, especially for ELT orchestration and automation
- Expertise in data modeling (dimensional, normalized), warehouse optimization, and multi-cluster Snowflake configuration
- Experience with modern ELT tools like Matillion, dbt, or similar
- Familiarity with CI/CD practices, Git, and Snowflake-specific DevOps practices

Preferred:
- Snowflake SnowPro Certification (Core or Advanced)
- Experience integrating Snowflake with Salesforce, Oracle ERP, or other SaaS/enterprise systems
- Exposure to cataloging tools (e.g., Alation, Collibra), Airflow, and monitoring platforms
- Knowledge of cost governance and Snowflake usage optimization strategies

***Keysight is an Equal Opportunity Employer.***
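The resource-monitor, auto-scaling, and RBAC responsibilities above map onto standard Snowflake DDL. Below is a hedged sketch driven through snowflake-connector-python; the account details, warehouse, database, role names, quota, and thresholds are all placeholders chosen for illustration.

```python
"""Hedged sketch: Snowflake cost governance and RBAC via standard DDL."""
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="admin_user",
    password="change-me", role="ACCOUNTADMIN",  # placeholders; monitors need ACCOUNTADMIN
)

statements = [
    # Monthly credit quota with an alert threshold and a hard stop
    """CREATE RESOURCE MONITOR ANALYTICS_RM
       WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE ANALYTICS_WH SET RESOURCE_MONITOR = ANALYTICS_RM",
    # Multi-cluster auto-scaling bounds on the shared analytics warehouse
    "ALTER WAREHOUSE ANALYTICS_WH SET MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 4",
    # RBAC: read-only access for analysts
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
]
for sql in statements:
    conn.cursor().execute(sql)
conn.close()
```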

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Noida

Work from Office

Position Overview: We are seeking an experienced Data Catalog Lead to drive the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and the multi-system data integration challenges unique to health plan operations.

Key Responsibilities:

Data Catalog Implementation & Development
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements
- Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms

Healthcare-Specific Data Governance
- Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment)

Technical Integration & Automation
- Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata
- Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms

Required Qualifications:

Collibra Platform Expertise
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup
- Experience with Collibra Connect for automated metadata harvesting and system integration
- Strong understanding of Collibra's REST APIs and custom development capabilities

Healthcare Payer Industry Knowledge
- 4+ years of experience working with healthcare payer/health plan data environments
- Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics
- Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care)
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT)

Technical Skills
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems)
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services
- Understanding of data modeling principles and healthcare data warehouse design patterns

Data Governance & Compliance
- Experience implementing data governance frameworks in regulated healthcare environments
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools
- Understanding of data classification, data quality management, and master data management principles
- Experience with audit trail requirements and compliance reporting in healthcare organizations

Preferred Qualifications:

Advanced Healthcare Experience
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms)
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations
- Understanding of value-based care arrangements and their data requirements
- Experience with clinical data integration and population health analytics

Technical Certifications & Skills
- Collibra certification (Data Citizen, Data Steward, or Technical User)
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog)
- Knowledge of data virtualization tools and their integration with data catalog platforms
- Experience with healthcare interoperability standards and API management

Posted 2 months ago

Apply

4.0 - 7.0 years

15 - 18 Lacs

Bhubaneswar, Coimbatore, Bengaluru

Work from Office

Role & responsibilities The candidate must have deep expertise in data management maturity models, data governance frameworks, and regulatory requirements, ensuring businesses can maximize their data assets while complying with both local and international regulations. This is an exciting opportunity to work in a consulting environment, collaborating with industry leaders and driving data-driven business transformation. This role is based in India, with the expectation of traveling to Middle Eastern client locations as required 1. Data Strategy & Advisory Develop and implement enterprise-wide data strategies aligned with business objectives. Assess data maturity levels using industry-standard frameworks and define roadmaps for data-driven transformation. Advise clients on data monetization, data quality, and data lifecycle management 2. Data Governance & Compliance Define and implement data governance frameworks, policies, and best practices. Ensure compliance with local and international data regulations, including GDPR, HIPAA, and region-specific laws. Develop data stewardship programs, ensuring clear roles and responsibilities for data management. 3. Regulatory & Risk Management Provide expertise on data privacy, security, and risk management strategies. Align data strategies with regulatory frameworks such as ISO 27001, NIST, and other industry-specific compliance standards. Advise on data sovereignty and cross-border data transfer policies. 4. Consulting & Pre-Sales Support Conduct client workshops to define data strategy and governance models. Develop thought leadership, whitepapers, and strategic insights to support client engagements. Assist in business development efforts, including proposals and pre-sales discussions. 5. Team Mentorship & Leadership Mentor junior consultants on data governance and strategic advisory. Stay updated on emerging trends in data strategy, regulations, and governance technologies. Represent the company at industry events, conferences, and knowledge-sharing forums. Preferred candidate profile 1. Education & Experience Bachelors or Masters in Data Management, Business Analytics, Information Systems, or a related field. 5 years of experience in data strategy, governance, or regulatory compliance consulting. 2. Technical & Regulatory Expertise Deep understanding of data management maturity models (e.g., DAMA-DMBOK, CMMI for Data Management) ; Should be DAMA Certified Basic Proficiency in data governance tools such as Collibra, Informatica, or Azure Purview. Strong knowledge of local and international data regulations (e.g., GDPR, CCPA, PDPA, UAE’s NDPL, KSA-NDMO , UAE DGE Data Regulations, Dubai Data Law).

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will have a pivotal role in implementing and embracing the data governance framework at Amgen, which aims to revolutionize the company's data ecosystem and establish Amgen as a pioneer in biopharma innovation. The position makes use of cutting-edge technologies such as Generative AI, Machine Learning, and integrated data. Your domain expertise, technical knowledge, and understanding of business processes will be crucial in providing exceptional support for Amgen's data governance framework. Collaboration with business stakeholders and data analysts will be essential to ensure successful implementation and adoption, and you will work closely with the Product Owner and other Business Analysts to guarantee operational support and excellence from the team.

You will be responsible for implementing the data governance and data management framework within a specific domain of expertise, such as Research, Development, or Supply Chain. This includes operationalizing the enterprise data governance framework and aligning a broader stakeholder community with their data governance needs: data quality, data access controls, compliance with privacy and security regulations, master data management, data sharing, communication, and change management. You will collaborate with Enterprise MDM and Reference Data teams to enforce standards and data reusability, drive cross-functional alignment in your area of expertise to ensure adherence to data governance principles, and maintain privacy policies and procedures to safeguard sensitive data. You will conduct regular privacy risk assessments and audits to identify and mitigate potential risks as required.

Furthermore, you will maintain documentation on data definitions, data standards, data flows, legacy data structures, common data models, and data harmonization for the assigned domains, and ensure compliance with data privacy, security, and regulatory policies for those domains, including GDPR, CCPA, and other relevant legislation. Together with Technology teams, business functions, and enterprise teams, you will define the specifications shaping the development and implementation of data foundations, and build strong relationships with key business leads and partners to ensure their needs are met.

Must-have functional skills include technical knowledge of pharma processes with specialization in a domain; an in-depth understanding of data management, data quality, master data management, data stewardship, and data protection; familiarity with data protection laws and regulations; experience in the development life cycle of data products; and proficiency in tools like Collibra and Alation. Strong problem-solving skills, excellent communication, and experience working with data governance frameworks are essential.

Good-to-have functional skills include experience with data governance councils, Agile software development methodologies, proficiency in data analysis and quality tools, and 3-5 years of experience in data privacy or compliance.

Soft skills required for this role include integrity, adaptability, proactivity, leadership, organization, and analytical ability; the capacity to work effectively with teams, manage multiple priorities, and build business relationships; ambition to develop skills and career; understanding of end-to-end data use and needs; and interpersonal skills, initiative, self-motivation, presentation skills, attention to detail, time management, and customer focus.

Basic qualifications for this position include any degree and 9-13 years of experience.

Posted 2 months ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description

Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, and test cases/results.
- Configure: Define and govern the configuration management plan; ensure compliance from the team.
- Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain Relevance: Advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications.
- Manage Project: Manage delivery of modules and/or user stories.
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort estimation for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage Team: Set FAST goals and provide feedback; understand the aspirations of team members and provide guidance and opportunities; ensure the team is engaged in the project.
- Certifications: Obtain relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components; use data models.
- Estimate the time, effort, and resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Make quick decisions on technical/project-related challenges.
- Manage a team, mentor, and handle people-related issues in the team; maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices.
- Set goals for self and team; provide feedback to team members.
- Create and articulate impactful technical presentations; follow a high level of business etiquette in emails and other business communication; drive conference calls with customers, addressing customer questions.
- Proactively ask for and offer help.
- Work under pressure; determine dependencies and risks; facilitate planning; handle multiple tasks.
- Build confidence with customers by meeting deliverables on time and with quality.
- Make appropriate utilization of software/hardware.
- Strong analytical and problem-solving abilities.

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical designing
- Programming languages (proficient in multiple skill clusters)
- DBMS, operating systems, and software platforms
- Software Development Life Cycle; Agile (Scrum or Kanban) methods
- Integrated development environments (IDE); rapid application development (RAD)
- Modelling technologies and languages; interface definition languages (IDL)
- Knowledge of the customer domain and a deep understanding of the sub-domain where the problem is solved

Additional Comments

Role Overview: As a Data Quality Integration Engineer, you will play a key role in embedding data quality capabilities within our enterprise data landscape. You will focus on integrating industry-leading Data Quality tools (such as Ataccama and/or Collibra) with core platforms such as Snowflake and SQL databases, ensuring data quality processes are robust, scalable, and well governed.

Key Responsibilities:
- Design, implement, and support the integration of Data Quality tools (Ataccama, Collibra, or similar) with Snowflake and SQL-based data platforms.
- Develop and maintain data pipelines and connectors enabling automated data quality assessments, profiling, cleansing, and monitoring (a minimal profiling sketch follows this posting).
- Collaborate with Data Governance, Data Architecture, and Data Engineering teams to align integration designs with business and governance requirements.
- Configure and manage data quality rules and workflows, ensuring alignment with business data quality KPIs, risk controls, and governance policies.
- Document integration solutions, workflows, and technical decisions to facilitate transparency and knowledge sharing.
- Troubleshoot integration issues, monitor performance, and optimise connector reliability and efficiency.
- Support user training and the adoption of data quality processes and tooling.

Required Skills and Experience:
- Proven experience integrating Data Quality tools into enterprise data environments, ideally with Ataccama and/or Collibra.
- Hands-on experience with the Snowflake data warehousing platform and SQL databases (e.g., MS SQL Server, PostgreSQL, Oracle, MySQL).
- Strong SQL scripting and data pipeline development skills (preferably with Python, Scala, or similar).
- Thorough understanding of data quality concepts, including profiling, cleansing, enrichment, and monitoring.
- Experience working with data integration technologies and ETL frameworks.
- Knowledge of data governance frameworks, standards, and best practices.
- Familiarity with REST APIs and metadata integration is highly desirable.
- Experience in financial services, asset management, or highly regulated sectors is advantageous.
- Strong documentation and communication skills.

Desirable:
- Certification in Ataccama, Collibra, Snowflake, or related data governance/data engineering technologies.
- Experience with cloud platforms (AWS, Azure) hosting Snowflake and data tooling.
- Prior experience supporting data stewardship or data governance initiatives.

Skills: Data Quality, Snowflake, SQL Database
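As a minimal illustration of the automated profiling and monitoring described above: a hedged sketch that runs one profiling query against Snowflake and applies a pass/fail threshold. It assumes the snowflake-connector-python package; the account details, the CUSTOMERS table, and the 1% threshold are illustrative placeholders, not part of the posting.

```python
# Minimal profiling sketch against Snowflake (illustrative only).
# Connection parameters and the CUSTOMERS table are placeholders.
import snowflake.connector

PROFILE_SQL = """
SELECT
    COUNT(*)                    AS row_count,
    COUNT_IF(EMAIL IS NULL)     AS null_emails,
    COUNT(DISTINCT CUSTOMER_ID) AS distinct_ids
FROM ANALYTICS.PUBLIC.CUSTOMERS
"""

def profile_table(conn) -> dict:
    """Run one profiling query and flag a failed null-rate check."""
    cur = conn.cursor()
    try:
        row_count, null_emails, distinct_ids = cur.execute(PROFILE_SQL).fetchone()
    finally:
        cur.close()
    null_rate = null_emails / row_count if row_count else 0.0
    return {
        "row_count": row_count,
        "distinct_ids": distinct_ids,
        "email_null_rate": null_rate,
        "passed": null_rate <= 0.01,  # example threshold: at most 1% null emails
    }

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account", user="dq_service", password="***",
        warehouse="DQ_WH", database="ANALYTICS", schema="PUBLIC",
    )
    print(profile_table(conn))
```

In practice, a result dictionary like this would be pushed into the DQ tool or catalog as technical metadata rather than printed.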

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Pune

Work from Office

Data Catalog Lead (Collibra)

Position Overview: We are seeking an experienced Data Catalog Lead to lead the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and multi-system data integration challenges unique to health plan operations.

Key Responsibilities:

Data Catalog Implementation & Development:
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements
- Develop automated data discovery and cataloging processes for healthcare data assets, including claims, eligibility, provider networks, and member information
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms

Healthcare-Specific Data Governance:
- Build specialized data catalog structures for healthcare data domains, including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment)

Technical Integration & Automation:
- Integrate Collibra with healthcare payer core systems, including claims processing platforms, eligibility systems, provider directories, and clinical data repositories
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata
- Use Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms (a hedged example follows this posting)

Required Qualifications:

Collibra Platform Expertise:
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup
- Experience with Collibra Connect for automated metadata harvesting and system integration
- Strong understanding of Collibra's REST APIs and custom development capabilities

Healthcare Payer Industry Knowledge:
- 4+ years of experience working with healthcare payer/health plan data environments
- Deep understanding of healthcare data types, including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics
- Knowledge of healthcare industry standards, including HL7, X12 EDI transactions, and FHIR specifications
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care)
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT)

Technical Skills:
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems)
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services
- Understanding of data modeling principles and healthcare data warehouse design patterns

Data Governance & Compliance:
- Experience implementing data governance frameworks in regulated healthcare environments
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools
- Understanding of data classification, data quality management, and master data management principles
- Experience with audit trail requirements and compliance reporting in healthcare organizations

Preferred Qualifications:

Advanced Healthcare Experience:
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms)
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations
- Understanding of value-based care arrangements and their data requirements
- Experience with clinical data integration and population health analytics

Technical Certifications & Skills:
- Collibra certification (Data Citizen, Data Steward, or Technical User)
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog)
- Knowledge of data virtualization tools and their integration with data catalog platforms
- Experience with healthcare interoperability standards and API management
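For the REST-API integration and PHI tagging mentioned above, here is a hedged sketch. The /rest/2.0/attributes call mirrors Collibra's published REST API 2.0 conventions, but the host, credentials, and the asset and attribute-type IDs are hypothetical placeholders that would need to be verified against a real Collibra environment.

```python
# Hedged sketch: tagging a catalog asset as PHI via Collibra's REST API 2.0.
# Host, credentials, and IDs below are hypothetical placeholders.
import requests

BASE = "https://your-instance.collibra.com/rest/2.0"  # placeholder host
session = requests.Session()
session.auth = ("svc_catalog", "***")                 # placeholder credentials

def tag_asset_as_phi(asset_id: str, attribute_type_id: str) -> dict:
    """Set a boolean 'PHI' attribute on an existing catalog asset."""
    resp = session.post(
        f"{BASE}/attributes",
        json={
            "assetId": asset_id,
            "typeId": attribute_type_id,  # ID of the custom 'PHI' attribute type
            "value": True,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```

A sensitivity-tagging job would typically loop a call like this over assets identified by automated PHI/PII classification rules.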

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Ahmedabad

Work from Office

Position Overview: We are seeking an experienced Data Catalog Lead to lead the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and multi-system data integration challenges unique to health plan operations.

Key Responsibilities:

Data Catalog Implementation & Development:
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements
- Develop automated data discovery and cataloging processes for healthcare data assets, including claims, eligibility, provider networks, and member information
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms

Healthcare-Specific Data Governance:
- Build specialized data catalog structures for healthcare data domains, including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment)

Technical Integration & Automation:
- Integrate Collibra with healthcare payer core systems, including claims processing platforms, eligibility systems, provider directories, and clinical data repositories
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata
- Use Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms

Required Qualifications:

Collibra Platform Expertise:
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup
- Experience with Collibra Connect for automated metadata harvesting and system integration
- Strong understanding of Collibra's REST APIs and custom development capabilities

Healthcare Payer Industry Knowledge:
- 4+ years of experience working with healthcare payer/health plan data environments
- Deep understanding of healthcare data types, including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics
- Knowledge of healthcare industry standards, including HL7, X12 EDI transactions, and FHIR specifications
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care)
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT)

Technical Skills:
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems)
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services
- Understanding of data modeling principles and healthcare data warehouse design patterns

Data Governance & Compliance:
- Experience implementing data governance frameworks in regulated healthcare environments
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools
- Understanding of data classification, data quality management, and master data management principles
- Experience with audit trail requirements and compliance reporting in healthcare organizations

Preferred Qualifications:

Advanced Healthcare Experience:
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms)
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations
- Understanding of value-based care arrangements and their data requirements
- Experience with clinical data integration and population health analytics

Technical Certifications & Skills:
- Collibra certification (Data Citizen, Data Steward, or Technical User)
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog)
- Knowledge of data virtualization tools and their integration with data catalog platforms
- Experience with healthcare interoperability standards and API management

Posted 2 months ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Pune, Chennai, Bengaluru

Work from Office

Roles and Responsibilities:
- Collaborate with cross-functional teams to design, develop, test, deploy, and maintain Collibra DQ solutions.
- Ensure seamless integration of Collibra DQ with other systems using APIs.
- Provide technical guidance on data governance best practices to stakeholders.
- Troubleshoot issues related to Collibra DQ implementation and provide timely resolutions.
- Participate in agile development methodologies such as Scrum.

Desired Candidate Profile:
- 4-9 years of experience in Collibra Data Quality (DQ) development or similar roles.
- Strong understanding of SQL queries for data extraction and manipulation (an illustrative rule sketch follows this posting).
- Experience working with API integrations for system connectivity.
- Bachelor's degree in any specialization (BCA or B.Sc.).
- Proficiency in Agile tools for testing purposes.
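Since this profile stresses SQL-driven DQ development, here is a small, tool-agnostic sketch of evaluating SQL-predicate data quality rules, the general style that Collibra DQ rules commonly take. It uses an in-memory SQLite table so it runs as-is; the table, rules, and sample rows are illustrative assumptions rather than anything from the posting.

```python
# Tool-agnostic sketch of SQL-predicate data-quality rules.
# Table, rule names, and sample rows are illustrative.
import sqlite3

RULES = {
    # rule name -> SQL predicate every *valid* row must satisfy
    "member_id_not_null": "member_id IS NOT NULL",
    # require a non-null email with a rough name@domain.tld shape
    "valid_email_shape": "email IS NOT NULL AND email LIKE '%_@_%._%'",
}

def evaluate_rules(conn: sqlite3.Connection, table: str) -> dict:
    """Return the count of breaking rows for each rule."""
    results = {}
    for name, predicate in RULES.items():
        (breaks,) = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE NOT ({predicate})"
        ).fetchone()
        results[name] = breaks
    return results

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE members (member_id TEXT, email TEXT)")
    conn.executemany(
        "INSERT INTO members VALUES (?, ?)",
        [("M1", "a@b.com"), (None, "bad-email"), ("M3", None)],
    )
    print(evaluate_rules(conn, "members"))
    # -> {'member_id_not_null': 1, 'valid_email_shape': 2}
```

Note the explicit IS NOT NULL in the email rule: without it, SQL's three-valued logic would silently exclude NULL emails from the break count.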

Posted 2 months ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Overview: Keysight is at the forefront of technology innovation, delivering breakthroughs and trusted insights in electronic design, simulation, prototyping, test, manufacturing, and optimization. Our ~15,000 employees create world-class solutions in communications, 5G, automotive, energy, quantum, aerospace, defense, and semiconductor markets for customers in over 100 countries. Our award-winning culture embraces a bold vision of where technology can take us and a passion for tackling challenging problems with industry-first solutions. We believe that when people feel a sense of belonging, they can be more creative, innovative, and thrive at all points in their careers.

We are seeking a Lead Data Engineer to design, build, and optimize enterprise-grade data pipelines and platforms that serve as the foundation for advanced analytics, reporting, and AI solutions. This role combines technical leadership, architecture oversight, and hands-on development, with a strong focus on quality, scalability, and stakeholder partnership.

Responsibilities:

Data Engineering:
- Architect and implement data pipelines, data marts, and transformation layers using Snowflake as the primary cloud data platform.
- Develop and optimize ELT workflows with tools such as Matillion, dbt, or custom Python-based frameworks.
- Design Snowflake schemas; manage data modeling (star/snowflake) and performance tuning of queries, warehouses, and storage.

Technical Leadership:
- Set technical standards and provide leadership in building modular, reusable data engineering components in Snowflake.
- Lead code reviews, mentor data engineers, and guide development on CI/CD pipelines, version control, and Snowflake DevOps.
- Own the implementation of multi-environment Snowflake development frameworks for dev/test/prod promotion and change control.

Platform Optimization & Monitoring:
- Implement and monitor resource monitors, query profiles, and warehouse auto-scaling to optimize cost and performance (an illustrative sketch follows this posting).
- Establish usage reporting, job monitoring, and error alerting using Snowflake native capabilities and integrated observability tools.

Governance, Security, and Data Quality:
- Define and enforce role-based access control (RBAC), data masking, and secure data sharing within and across domains.
- Support the implementation of data contracts, lineage, and data quality frameworks (e.g., Great Expectations).
- Collaborate with Data Governance teams to align platform use with compliance and policy requirements.

Business Partnership & Collaboration:
- Engage with product owners, analysts, and data scientists to deliver trusted, high-performance datasets.
- Translate business requirements into technical Snowflake data solutions.
- Serve as a Snowflake data SME for cross-functional teams and stakeholders.

Qualifications:

Required:
- 10+ years of experience in data engineering, with 3+ years of hands-on work in Snowflake.
- Proficiency in SQL and Python, especially for ELT orchestration and automation.
- Expertise in data modeling (dimensional, normalized), warehouse optimization, and multi-cluster Snowflake configuration.
- Experience with modern ELT tools such as Matillion, dbt, or similar.
- Familiarity with CI/CD practices, Git, and Snowflake-specific DevOps practices.

Preferred:
- Snowflake SnowPro Certification (Core or Advanced).
- Experience integrating Snowflake with Salesforce, Oracle ERP, or other SaaS/enterprise systems.
- Exposure to cataloging tools (e.g., Alation, Collibra), Airflow, and monitoring platforms.
- Knowledge of cost governance and Snowflake usage optimization strategies.

Keysight is an Equal Opportunity Employer.
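As an illustration of the cost guardrails listed under Platform Optimization & Monitoring: a hedged sketch that creates a Snowflake resource monitor and attaches it to a warehouse. The DDL follows Snowflake's documented CREATE RESOURCE MONITOR syntax, but the monitor and warehouse names, credit quota, and connection details are placeholders.

```python
# Hedged sketch: cost guardrails via a Snowflake resource monitor.
# Names, quota, and connection details are placeholders.
import snowflake.connector

DDL = [
    """
    CREATE RESOURCE MONITOR IF NOT EXISTS analytics_monthly_monitor
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY      -- warn at 80% of quota
               ON 100 PERCENT DO SUSPEND    -- block new queries at 100%
    """,
    "ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = analytics_monthly_monitor",
]

conn = snowflake.connector.connect(
    account="my_account", user="platform_admin", password="***",
    role="ACCOUNTADMIN",  # resource monitors generally require ACCOUNTADMIN
)
cur = conn.cursor()
for stmt in DDL:
    cur.execute(stmt)
cur.close()
conn.close()
```

SUSPEND lets running queries finish while blocking new ones; SUSPEND_IMMEDIATE is the stricter documented alternative when hard cost caps matter more than query completion.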

Posted 2 months ago

Apply

2.0 years

0 Lacs

India

On-site

About Billigence: Billigence is a boutique data consultancy with global outreach and clientele, transforming the way organizations work with data. We leverage proven, cutting-edge technologies to design, tailor, and implement advanced Business Intelligence solutions with high added value across a wide range of applications, from process digitization through to Cloud Data Warehousing, Visualisation, Data Science and Engineering, or Data Governance. Headquartered in Sydney, Australia, with offices around the world, we help clients navigate difficult business conditions, remove inefficiencies, and enable scalable adoption of an analytics culture.

About the Role: We are hiring a Collibra CDQ Developer to support rule development and deployment across prioritized systems as part of a strategic Data Governance programme for a leading financial institution. This role is ideal for someone with strong hands-on Collibra DQ experience and a passion for delivering enterprise-grade data quality solutions.

Key Responsibilities:
- Configure and deploy Collibra Data Quality (CDQ) rules aligned to business-defined requirements.
- Translate COE-based logic into executable DQ validations.
- Build and maintain lineage and quality dashboards within Collibra.
- Support seamless onboarding of systems with minimal disruption to client teams.
- Participate in weekly progress reporting, including rule coverage and adoption (a reporting sketch follows this posting).
- Provide technical documentation and assist with knowledge handover.

Required Skills & Experience:
- 2+ years of Collibra DQ experience; broader Collibra development experience and Collibra Ranger certification preferred.
- Hands-on experience writing and configuring DQ rules.
- Strong understanding of data lineage, profiling, and governance best practices.
- SQL proficiency and experience working in enterprise data environments (ideally financial services).
- Strong team collaboration and communication skills.
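For the weekly rule-coverage and adoption reporting mentioned above, a minimal sketch using pandas. The input rows and column names are invented for illustration; a real report would pull rule results from the DQ platform rather than a hard-coded list.

```python
# Hedged sketch: summarising DQ rule coverage and adoption per system.
# Input schema and sample rows are illustrative placeholders.
import pandas as pd

rule_runs = pd.DataFrame(
    [
        {"system": "claims",  "rule": "member_id_not_null", "passed": True},
        {"system": "claims",  "rule": "valid_email_shape",  "passed": False},
        {"system": "billing", "rule": "amount_positive",    "passed": True},
    ]
)

# Coverage = distinct rules deployed per system; pass_rate = share of passing runs.
report = rule_runs.groupby("system").agg(
    rules_deployed=("rule", "nunique"),
    pass_rate=("passed", "mean"),
)
print(report)
```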

Posted 2 months ago

Apply

6.0 years

0 Lacs

India

On-site

About Billigence: Billigence is a boutique data consultancy with global outreach and clientele, transforming the way organizations work with data. We leverage proven, cutting-edge technologies to design, tailor, and implement advanced Business Intelligence solutions with high added value across a wide range of applications, from process digitization through to Cloud Data Warehousing, Visualisation, Data Science and Engineering, or Data Governance. Headquartered in Sydney, Australia, with offices around the world, we help clients navigate difficult business conditions, remove inefficiencies, and enable scalable adoption of an analytics culture.

About the Role: We are seeking a Lead Collibra Consultant to spearhead the rollout of Collibra Data Governance and Collibra Data Quality (CDQ) capabilities for one of our major financial services clients. You will lead stakeholder engagement, shape business rules into actionable CDQ logic, and oversee a white-glove implementation that reduces the operational burden on internal stakeholders.

Key Responsibilities:
- Lead business and technical discovery sessions with data stewards, SMEs, and application owners.
- Drive COE (Critical Operational Elements) inventory and business rule capture across business units.
- Translate business requirements into prioritised Collibra DQ rules for implementation.
- Define and manage the delivery roadmap across systems and business units.
- Collaborate with the delivery team and the client Programme Manager on progress tracking and escalation.
- Ensure traceability from data elements through lineage, business rules, and data quality monitoring.
- Provide executive-level reporting and dashboards on rule coverage, adoption, and outcomes.
- Conduct knowledge transfer to empower internal teams post-implementation.

Required Skills & Experience:
- 6+ years in Data Governance, with 3+ years of hands-on Collibra experience; Collibra Ranger certification preferred.
- Demonstrated experience leading Collibra rollouts, including CDQ implementation.
- Strong stakeholder engagement and business analysis skills.
- Understanding of financial services compliance, risk, or data governance frameworks.
- Familiarity with Collibra components: Business Glossary, Policy Manager, DQ, and Lineage.
- Excellent communication, leadership, and documentation skills.

Posted 2 months ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

JOB_POSTING-3-72487

Job Description

Role Title: Analyst, Data Sourcing – Metadata (L08)

Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise, and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet, and more. We have recently been ranked #2 among India’s Best Companies to Work For by Great Place to Work. We were among the Top 50 of India’s Best Workplaces in Building a Culture of Innovation by All by GPTW, and in the Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, and ranked #3 among Top Rated Companies for Women and Top-Rated Financial Services Companies. Synchrony celebrates ~52% women talent. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent into leadership roles.

Organizational Overview: Our Analytics organization comprises data analysts who focus on enabling strategies to enhance customer and partner experience and optimize business performance through data management and the development of full-stack descriptive-to-prescriptive analytics solutions using cutting-edge technologies, thereby enabling business growth.

Role Summary/Purpose: The Analyst, Data Sourcing – Metadata (Individual Contributor) role is located in the India Analytics Hub (IAH) as part of Synchrony’s enterprise Data Office. This role is responsible for supporting metadata management processes within Synchrony’s public and private cloud and on-prem environments within the Chief Data Office. The role focuses on assisting with metadata harvesting, maintaining data dictionaries, and supporting the tracking of data lineage. The analyst will collaborate closely with senior team members to ensure access to accurate, well-governed metadata for analytics and reporting.

Key Responsibilities:
- Implement and maintain metadata management processes across Synchrony’s public and private cloud and on-prem environments, ensuring accurate integration with technical and business metadata catalogs (a metadata-harvesting sketch follows this posting).
- Work with the Data Architecture and Data Usage teams to track data lineage, traceability, and compliance, identifying and escalating metadata-related issues.
- Document technical specifications, support solution design, and participate in agile development and release cycles for metadata initiatives.
- Adhere to data management policies, track KPIs for metadata effectiveness, and assist in the assessment of metadata risks to strengthen governance.
- Maintain stable operations, troubleshoot metadata and lineage issues, and contribute to continuous process improvements that improve data accessibility.

Required Skills:
- Bachelor’s degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.
- Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure.
- Basic understanding of metadata management concepts; familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra); basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra); and understanding of data integration technologies (e.g., ETL, APIs, data pipelines).
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, the ability to work independently and manage multiple tasks, and attention to detail.

Desired Skills:
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics – Specialty.
- Familiarity with hybrid cloud environments (combination of cloud and on-prem).
- Skilled in Ab Initio Metahub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center, and Express IT.
- Experience with harvesting technical lineage and producing lineage diagrams.
- Familiarity with Unix, Linux, Stonebranch, and database platforms such as Oracle and Hive.
- Basic knowledge of SQL and data query languages for managing and retrieving metadata.
- Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance).
- Familiarity with Collibra.

Eligibility Criteria: Bachelor’s degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.

Work Timings: 2 PM - 11 PM IST. This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams; the remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs; please discuss this with the hiring manager for more details.

For Internal Applicants:
- Understand the criteria and mandatory skills required for the role before applying.
- Inform your manager and HRM before applying for any role on Workday.
- Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format).
- Must not be on any corrective action plan (First Formal/Final Formal, LPP).
- L4 to L7 employees who have completed 12 months in the organization and 12 months in their current role and level are eligible. L8 employees who have completed 18 months in the organization and 12 months in their current role and level are eligible. L04+ employees can apply.

Grade/Level: 08

Job Family Group: Information Technology
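To illustrate the metadata harvesting and data dictionary work this role supports, a hedged sketch using boto3's documented Glue get_tables API; the database name, region, and the output shape are illustrative assumptions.

```python
# Hedged sketch: harvesting table/column metadata from the AWS Glue Data
# Catalog to seed a data dictionary. Database name and region are placeholders.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

def harvest_dictionary(database: str) -> list[dict]:
    """Collect table and column metadata for a simple data dictionary."""
    entries = []
    paginator = glue.get_paginator("get_tables")
    for page in paginator.paginate(DatabaseName=database):
        for table in page["TableList"]:
            for col in table["StorageDescriptor"]["Columns"]:
                entries.append({
                    "table": table["Name"],
                    "column": col["Name"],
                    "type": col["Type"],
                    "comment": col.get("Comment", ""),
                })
    return entries

if __name__ == "__main__":
    for row in harvest_dictionary("analytics_db")[:10]:
        print(row)
```

Output like this is what typically gets pushed into a business catalog (e.g., Collibra) or reconciled against an existing data dictionary.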

Posted 2 months ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

JOB_POSTING-3-72479

Job Description

Role Title: Analyst, Data Sourcing – Metadata (L08)

Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise, and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet, and more. We have recently been ranked #2 among India’s Best Companies to Work For by Great Place to Work. We were among the Top 50 of India’s Best Workplaces in Building a Culture of Innovation by All by GPTW, and in the Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, and ranked #3 among Top Rated Companies for Women and Top-Rated Financial Services Companies. Synchrony celebrates ~52% women talent. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent into leadership roles.

Organizational Overview: Our Analytics organization comprises data analysts who focus on enabling strategies to enhance customer and partner experience and optimize business performance through data management and the development of full-stack descriptive-to-prescriptive analytics solutions using cutting-edge technologies, thereby enabling business growth.

Role Summary/Purpose: The Analyst, Data Sourcing – Metadata (Individual Contributor) role is located in the India Analytics Hub (IAH) as part of Synchrony’s enterprise Data Office. This role is responsible for supporting metadata management processes within Synchrony’s public and private cloud and on-prem environments within the Chief Data Office. The role focuses on assisting with metadata harvesting, maintaining data dictionaries, and supporting the tracking of data lineage. The analyst will collaborate closely with senior team members to ensure access to accurate, well-governed metadata for analytics and reporting.

Key Responsibilities:
- Implement and maintain metadata management processes across Synchrony’s public and private cloud and on-prem environments, ensuring accurate integration with technical and business metadata catalogs.
- Work with the Data Architecture and Data Usage teams to track data lineage, traceability, and compliance, identifying and escalating metadata-related issues.
- Document technical specifications, support solution design, and participate in agile development and release cycles for metadata initiatives.
- Adhere to data management policies, track KPIs for metadata effectiveness, and assist in the assessment of metadata risks to strengthen governance.
- Maintain stable operations, troubleshoot metadata and lineage issues, and contribute to continuous process improvements that improve data accessibility.

Required Skills & Knowledge:
- Bachelor’s degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.
- Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure.
- Basic understanding of metadata management concepts; familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra); basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra); and understanding of data integration technologies (e.g., ETL, APIs, data pipelines).
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, the ability to work independently and manage multiple tasks, and attention to detail.

Desired Skills & Knowledge:
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics – Specialty.

Preferred Qualifications:
- Familiarity with hybrid cloud environments (combination of cloud and on-prem).
- Skilled in Ab Initio Metahub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center, and Express IT.
- Experience with harvesting technical lineage and producing lineage diagrams.
- Familiarity with Unix, Linux, Stonebranch, and database platforms such as Oracle and Hive.
- Basic knowledge of SQL and data query languages for managing and retrieving metadata.
- Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance).
- Familiarity with Collibra.

Eligibility Criteria: Bachelor’s degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.

Work Timings: This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams; the remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs; please discuss this with the hiring manager for more details.

For Internal Applicants:
- Understand the criteria and mandatory skills required for the role before applying.
- Inform your manager and HRM before applying for any role on Workday.
- Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format).
- Must not be on any corrective action plan (First Formal/Final Formal, LPP).
- L4 to L7 employees who have completed 12 months in the organization and 12 months in their current role and level are eligible. L8 employees who have completed 18 months in the organization and 12 months in their current role and level are eligible. L04+ employees can apply.

Grade/Level: 08

Job Family Group: Information Technology

Posted 2 months ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are looking for a talented and motivated ETL Developer / Senior Developer to join our data engineering team. You will work on building scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS Glue, and Snowflake. You will collaborate with architects, business analysts, and data modelers to ensure timely and accurate delivery of critical data assets supporting analytics and AI/ML use cases.

Key Responsibilities:
- Design, develop, and maintain ETL pipelines using IBM DataStage (CP4D) and AWS Glue/Lambda for ingestion from varied sources such as flat files, APIs, Oracle, and DB2 (a minimal Glue job sketch follows this posting).
- Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic.
- Participate in code reviews, performance tuning, and defect triage sessions.
- Work closely with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
- Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
- Troubleshoot and resolve issues in QA/UAT/Production environments as needed.
- Adhere to agile delivery practices, sprint planning, and documentation requirements.

Required Skills and Experience:
- 4+ years of experience in ETL development, with at least 1–2 years in IBM DataStage (preferably the CP4D version).
- Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-based processing.
- Experience working with Snowflake: loading strategies, stream tasks, zero-copy cloning, and performance tuning.
- Proficiency in SQL, Unix scripting, and basic Python for data handling and automation.
- Familiarity with S3, version control systems (Git), and job orchestration tools.
- Experience with data profiling, cleansing, and quality validation routines.
- Understanding of data lake/data warehouse architectures and DevOps practices.

Good to Have:
- Experience with Collibra, BigID, or other metadata/governance tools.
- Exposure to Data Mesh/Data Domain models.
- Experience with agile/Scrum delivery and Jira/Confluence tools.
- AWS or Snowflake certification is a plus.
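A minimal sketch of the Glue-side ingestion pattern this posting describes, under stated assumptions: the awsglue imports are Glue's standard job scaffolding, while the S3 paths and the claim_id cleansing rule are invented for illustration. The final Snowflake load step (e.g., COPY INTO or Snowpipe from the curated zone) is left out.

```python
# Hedged sketch: a minimal AWS Glue (PySpark) job that reads raw CSVs
# from S3, applies light cleansing, and writes curated parquet.
# Bucket names and the cleansing rule are illustrative placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw claim files from S3 as a DynamicFrame.
claims = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://my-raw-bucket/claims/"]},  # placeholder
    format="csv",
    format_options={"withHeader": True},
)

# Light cleansing before staging: drop rows missing a claim id.
claims_df = claims.toDF().filter("claim_id IS NOT NULL")

# Write curated parquet back to S3; a downstream COPY INTO or Snowpipe
# step (not shown) would load the curated zone into Snowflake.
claims_df.write.mode("overwrite").parquet("s3://my-curated-bucket/claims/")

job.commit()
```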

Posted 2 months ago

Apply

8.0 years

0 Lacs

India

Remote

Ready to embark on a journey where your growth is intertwined with our commitment to making a positive impact? Join the Delphi family - where Growth Meets Values.

At Delphi Consulting Pvt. Ltd., we foster a thriving environment with a hybrid work model that lets you prioritize what matters most. Interviews and onboarding are conducted virtually, reflecting our digital-first mindset. We specialize in Data, Advanced Analytics, AI, Infrastructure, Cloud Security, and Application Modernization, delivering impactful solutions that drive smarter, efficient futures for our clients.

About the Role: As a Senior MDM Consultant, you will be responsible for driving the design, implementation, and governance of master data management solutions across critical business domains. This role demands a strong foundation in data architecture, integration, and governance best practices, as well as experience with leading MDM platforms. You will work closely with business and technical teams to ensure data quality, consistency, and availability that support enterprise-wide decision-making and operations.

What you'll do:
- Master Data Strategy: Develop and execute comprehensive Master Data Management (MDM) strategies aligned with business goals and data governance objectives.
- Data Governance Frameworks: Establish robust data governance policies, procedures, and frameworks across critical data domains.
- Team Leadership & Standards: Lead cross-functional teams to implement data quality standards and best practices, and ensure consistent master data usage across systems.
- Documentation & Metadata: Create and maintain data dictionaries, data lineage documentation, and metadata repositories to ensure transparency and traceability.
- Operational MDM Implementation: Design and implement operational MDM approaches, ensuring effective management of reference and master data.
- Business Collaboration: Partner with business users to gather master data requirements and facilitate stakeholder alignment across departments.
- Technical Architecture: Collaborate with IT teams on integrations and architecture decisions involving MDM systems and platforms.
- Stewardship & Training: Facilitate data stewardship programs and provide training to business users on MDM processes and governance best practices.
- Executive Communication: Present MDM initiatives, metrics, and results to executive leadership and key stakeholders.

What you'll bring:
- Relevant Experience: 6–8 years of hands-on experience in Master Data Management, Data Governance, or closely related roles.
- MDM Platforms Expertise: Proven experience implementing MDM solutions using tools such as Informatica, Profisee, or equivalent platforms.
- ETL & Integration: Ability to architect data integration workflows and configure matching algorithms, survivorship rules, and data quality validations (see the sketch after this posting).
- SQL & Data Modeling: Strong SQL skills and familiarity with relational databases; good understanding of data modeling and dimensional modeling concepts.
- Tool Proficiency: Experience with data integration tools and data governance platforms such as Microsoft Purview, Collibra, or Informatica.
- Cloud Knowledge: Working knowledge of data services within cloud platforms such as AWS, Azure, or GCP.

What We Offer: At Delphi, we are dedicated to creating an environment where you can thrive, both professionally and personally. Our competitive compensation package, performance-based incentives, and health benefits are designed to ensure you're well supported. We believe in your continuous growth and offer company-sponsored certifications, training programs, and skill-building opportunities to help you succeed. We foster a culture of inclusivity and support, with remote work options and a fully supported work-from-home setup to ensure your comfort and productivity. Our positive and inclusive culture includes team activities and wellness and mental health programs to ensure you feel supported.
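To illustrate the matching algorithms and survivorship rules mentioned in the requirements above, a generic sketch (not any specific MDM platform's algorithm): fuzzy name matching with Python's difflib, followed by most-recent-non-null survivorship. The records, fields, and threshold are illustrative assumptions.

```python
# Hedged sketch: simple record matching + survivorship for a golden record.
# Threshold, fields, and sample records are illustrative placeholders.
from datetime import date
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Corp",        "phone": "555-0100", "updated": date(2024, 1, 5)},
    {"id": 2, "name": "ACME Corporation", "phone": None,       "updated": date(2024, 6, 9)},
]

def is_match(a: dict, b: dict, threshold: float = 0.7) -> bool:
    """Fuzzy name-similarity match on lowercased strings."""
    ratio = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return ratio >= threshold  # 0.7 chosen so the sample pair matches; tune per domain

def survive(matched: list[dict]) -> dict:
    """Survivorship: the most recent non-null value wins, field by field."""
    ordered = sorted(matched, key=lambda r: r["updated"], reverse=True)
    golden = {}
    for field in ("name", "phone"):
        golden[field] = next((r[field] for r in ordered if r[field] is not None), None)
    return golden

if __name__ == "__main__":
    if is_match(records[0], records[1]):
        print(survive(records))  # {'name': 'ACME Corporation', 'phone': '555-0100'}
```

Production MDM platforms layer probabilistic matching, blocking, and configurable survivorship policies on top of this basic idea.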

Posted 2 months ago

Apply

3.0 - 7.0 years

12 - 20 Lacs

Hyderabad, Bengaluru

Hybrid

Skills: OCI Data Catalog, Oracle Cloud Infrastructure (OCI), Metadata Management, Data Lineage, Data Classification, Data Stewardship, Data Governance, Cloud Data Integration, Oracle Autonomous Database, Oracle Object Storage, PoC Delivery (Proof of Concept), Cloud Data Management, SQL, ETL Tools (e.g., Oracle Data Integrator, Informatica), Data Catalog Tools (e.g., Collibra, Alation, Azure Purview for comparative experience), On-premise to Cloud Integration, API Integration (for metadata harvesting)

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
