2.0 years
5 - 8 Lacs
Bengaluru
On-site
About us

At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world’s largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society’s evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil’s affiliates in India

ExxonMobil’s affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil’s affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across the chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil’s LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil’s operations around the globe.

ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.
What role you will play in our team

The Data Catalog & Discovery Business Analyst is responsible for managing stakeholder operations related to Collibra Data Catalog (Data Intelligence Platform). Primary responsibilities include coordinating efforts with data asset stewards in the business, managing a portfolio of ongoing business support needs, serving as a Collibra power user (expert), and reinforcing a culture of data governance and data-centricity through efforts inside Collibra tool(s). The role is based in Bengaluru (Whitefield) and is a full-time, office-based position.

What you will do
- Coordinate efforts with data asset stewards in the business.
- Manage a portfolio of ongoing business support needs for Collibra.
- Collaborate with the CDO Agile System of Delivery (SoD) team to further mature services related to Collibra; commit to the work to accomplish in a sprint or program increment and drive value through continuous delivery.
- Ensure the accuracy, completeness, and consistency of data within the Collibra Data Catalog by implementing and monitoring data quality standards and processes.
- Develop and deliver training programs for end-users and data stewards to enhance their understanding and effective use of Collibra tools.
- Assist in the creation and enforcement of data governance policies, procedures, and best practices to ensure compliance and data integrity.
- Generate and analyze reports on data catalog usage, data quality metrics, and other key performance indicators to inform decision-making and continuous improvement.
- Identify opportunities for process improvements and innovations within the data catalog and discovery functions.
- Identify and mitigate risks associated with data management and governance, ensuring the security and privacy of data assets.
- Serve as a liaison between business and IT.

About You

Required Skills and Qualifications:
- Develop business processes in Visio or related software.
- Translate business and technical processes into technical requirements reflecting opportunities for automation or self-service.
- Integrate external and internal best practices into business operations or processes.
- Effective understanding of the importance of data and metadata for the execution of business processes or decisions.
- Demonstrate critical thinking and analytical skills, and employ judgment to offer thoughtful, concise input toward resolution of problems.
- Leadership skills needed to successfully promote ideas, coordinate work activities, and plan deliverables within a project team.
- Strong communication, interpersonal, and presentation skills with strong English proficiency.
- Previous experience as a liaison between business staff (requestors) and IT staff (developers), acting as a requirements translator.
- Comfortable presenting product developments to the CDO Leadership Team, Business Lines' senior management, and the data community at large.
- Working knowledge of Microsoft Office (Outlook, Excel, Word, PowerPoint, OneNote), Microsoft SharePoint, Microsoft Windows, and major browsers (Microsoft Edge and Google Chrome).
- Basic understanding of SQL and data querying techniques to support data discovery and analysis tasks.
- Experience with project management methodologies (e.g., Agile, Scrum) to effectively manage and deliver data-related projects.
- Strong problem-solving skills to identify issues, analyze root causes, and implement effective solutions.
- A focus on understanding and meeting the needs of internal and external customers, ensuring a positive user experience with data catalog tools.
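The SQL and reporting expectations above (querying data to support discovery, and reporting data quality metrics) can be illustrated with a small, self-contained sketch. This is a hypothetical example, not any real Collibra export or API; the table, columns, and sample rows are invented for illustration:

```python
import sqlite3

def completeness_report(conn, table, columns):
    """Return {column: fraction of rows with a non-null, non-empty value} -
    the kind of metric a catalog analyst might report per critical data element."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    report = {}
    for col in columns:
        filled = conn.execute(
            f"SELECT COUNT(*) FROM {table} "
            f"WHERE {col} IS NOT NULL AND TRIM({col}) <> ''"
        ).fetchone()[0]
        report[col] = filled / total if total else 0.0
    return report

# Hypothetical sample of critical data elements with partially filled metadata.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data_elements (name TEXT, definition TEXT, steward TEXT)")
conn.executemany(
    "INSERT INTO data_elements VALUES (?, ?, ?)",
    [("customer_id", "Unique customer key", "A. Rao"),
     ("order_date", "", "B. Iyer"),
     ("region", "Sales region code", None)],
)
print(completeness_report(conn, "data_elements", ["definition", "steward"]))
```

A report like this makes the "completeness" standard concrete: each column's score is the share of elements whose metadata is actually populated.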
Minimum qualifications:
- Bachelor’s Degree in Engineering, Technology, Computer Science, or a related field, with equivalent experience.
- Ability to leverage external best practices from the DAMA DMBOK.
- Minimum 2 years of experience.
- Exposure to Collibra Data Intelligence Platform configuration and customization.
- Exposure to Collibra automated workflows (not necessarily as a developer).
- Familiarity with data governance frameworks and best practices, including data stewardship, data quality management, and metadata management.

Preferred skills and qualifications:
- Strong communication skills and previous experience acting as a liaison between business units, IT, and data governance teams to facilitate clear communication and alignment on data-related initiatives.
- Relevant certifications such as Certified Data Management Professional (CDMP) or Collibra Ranger are an important plus.

Your benefits

An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
- Competitive compensation
- Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
- Retirement benefits
- Global networking and cross-functional opportunities
- Annual vacations and holidays
- Day care assistance program
- Training and development program
- Tuition assistance program
- Workplace flexibility policy
- Relocation program
- Transportation facility

Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company’s eligibility guidelines.

Stay connected with us: learn more about ExxonMobil in India at ExxonMobil India and Energy Factor India.
Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement

ExxonMobil is an Equal Opportunity Employer: all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams

ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.
Posted 3 weeks ago
7.0 years
3 - 6 Lacs
No locations specified
On-site
Job Description: ETL Modernization Engineer (OSP Partner)

This role is for an OSP provider resource to migrate legacy ETL pipelines (Informatica/Talend) to a modern data stack on Snowflake, dbt, and Fivetran, with strong DataOps, governance, and cost-aware design. Target experience: 7 to 8 years.

Role Summary
- Lead end-to-end migration of legacy ETL workloads to Snowflake, dbt, and Fivetran, including discovery, design, refactoring, validation, and cutover.
- Establish reusable patterns for ingestion, transformation, orchestration, testing, observability, and cost optimization.
- Collaborate with Enterprise Data Architecture, Security, and BI teams to ensure compliant, high-performance delivery.

Key Responsibilities
- Assess current state: inventory Informatica/Talend jobs, mappings, schedules, dependencies, SLAs, and data contracts.
- Design target state: ingestion via Fivetran/ELT, dbt-based transformations, Snowflake schemas (raw/bronze, curated/silver, semantic/gold), and orchestration approach.
- Migrate and refactor: convert mappings/workflows to dbt models, macros, seeds, and exposures; replace hand-coded ingestions with managed connectors (Fivetran) or alternative ELT where required; implement CDC patterns (e.g., Fivetran + dbt snapshots), SCD handling, and incremental strategies.
- Data quality and testing: implement dbt tests (schema, referential, accepted values, freshness), anomaly checks, and reconciliation with legacy outputs.
- Performance engineering: optimize Snowflake warehouses, clustering/partitioning strategies, query tuning, caching/materialization patterns, and costs.
- Security and governance: apply RBAC roles, masking, and row access policies; integrate with lineage/catalog tools (e.g., OpenLineage/Marquez, Collibra/Atlan/Alation if applicable).
- Observability and reliability: configure logging/metrics, job run health, SLAs/SLOs, alerting, and incident runbooks.
- Cutover planning: parallel runs, backfills, data reconciliation, defect triage, rollout, and decommissioning of legacy jobs.
- Documentation and knowledge transfer: architecture diagrams, runbooks, playbooks, and training for client teams.

Required Skills and Experience
- 7 to 8 years in data engineering/ETL modernization with demonstrable migration projects from Informatica or Talend to ELT on cloud data warehouses.
- Hands-on with Snowflake (warehouses, tasks, streams, time travel, Query Profile), performance tuning, security policies, and cost governance.
- Strong dbt expertise: model design, Jinja/macros, packages, exposures, snapshots, environment promotion, and CI/CD with Git.
- Practical Fivetran experience: connector configuration, sync scheduling, historical backfills, log-based CDC, schema drift handling.
- Advanced SQL proficiency, with Python preferred for utilities (e.g., migration scripts, validation).
- Experience building star/snowflake schemas, dimensional models, and semantic layers for BI tools (Power BI/Looker/Tableau).
- DataOps and CI/CD: branching strategies, automated tests, deployment pipelines, environment management.
- Data quality frameworks and reconciliation techniques for migration sign-off.
- Strong stakeholder management: work with architects, security, BI, and business SMEs in phased migration programs.

Deliverables
- Current-state inventory and dependency map of legacy pipelines.
- Target-state architecture and migration plan, including cutover strategy.
- Re-platformed pipelines: Fivetran connectors, dbt project(s), Snowflake schemas, roles, and policies.

About Virtusa

Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us.
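The reconciliation work called out above (validating migrated outputs against legacy extracts before sign-off) is one of the Python utilities the role mentions. A minimal sketch under the assumption that both systems can dump result sets as lists of rows; function names and sample data are invented for illustration, not part of any real migration toolkit:

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint of a result set: (row count, XOR of
    per-row hashes). Lets legacy and migrated extracts be compared without
    sorting large datasets. Note: identical duplicate rows cancel under
    XOR, so the row count is checked as well."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")
    return (len(rows), acc)

def reconcile(legacy_rows, migrated_rows):
    """True when counts and fingerprints match."""
    return table_fingerprint(legacy_rows) == table_fingerprint(migrated_rows)

legacy = [(1, "alice", 100.0), (2, "bob", 250.5)]
migrated = [(2, "bob", 250.5), (1, "alice", 100.0)]  # same data, different order
print(reconcile(legacy, migrated))  # True: ordering does not affect the fingerprint
```

In a real cutover, a script like this would run per table during parallel runs, with mismatches feeding the defect-triage process rather than a simple boolean.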
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 3 weeks ago
9.0 - 16.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
LTIMindtree is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, ethnicity, nationality, gender, gender-identity, gender expression, language, age, sexual orientation, religion, marital status, veteran status, socio-economic status, disability or any other characteristic protected by applicable law.

We are hiring for Data Governance - Collibra!

Experience: 9 to 16 years
Location: Kolkata/Mumbai/Pune/Bangalore/Noida/Chennai/Hyderabad

Job Description:
- Extensive understanding of Collibra Data Intelligence on a cloud platform, including data governance, master data management, data quality management, data privacy and security, and data access and marketplace.
- Extensive experience in data governance processes.
- Implementation experience with data governance projects.
- Implementation experience in consulting engagements.

Please share your updated resume: aditya.utekar2@ltimindtree.com
Posted 3 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: ETL Modernization Engineer (OSP Partner)

This role is for an OSP provider resource to migrate legacy ETL pipelines (Informatica/Talend) to a modern data stack on Snowflake, dbt, and Fivetran, with strong DataOps, governance, and cost-aware design. Target experience: 7 to 8 years.

Role Summary
- Lead end-to-end migration of legacy ETL workloads to Snowflake, dbt, and Fivetran, including discovery, design, refactoring, validation, and cutover.
- Establish reusable patterns for ingestion, transformation, orchestration, testing, observability, and cost optimization.
- Collaborate with Enterprise Data Architecture, Security, and BI teams to ensure compliant, high-performance delivery.

Key Responsibilities
- Assess current state: inventory Informatica/Talend jobs, mappings, schedules, dependencies, SLAs, and data contracts.
- Design target state: ingestion via Fivetran/ELT, dbt-based transformations, Snowflake schemas (raw/bronze, curated/silver, semantic/gold), and orchestration approach.
- Migrate and refactor: convert mappings/workflows to dbt models, macros, seeds, and exposures; replace hand-coded ingestions with managed connectors (Fivetran) or alternative ELT where required; implement CDC patterns (e.g., Fivetran + dbt snapshots), SCD handling, and incremental strategies.
- Data quality and testing: implement dbt tests (schema, referential, accepted values, freshness), anomaly checks, and reconciliation with legacy outputs.
- Performance engineering: optimize Snowflake warehouses, clustering/partitioning strategies, query tuning, caching/materialization patterns, and costs.
- Security and governance: apply RBAC roles, masking, and row access policies; integrate with lineage/catalog tools (e.g., OpenLineage/Marquez, Collibra/Atlan/Alation if applicable).
- Observability and reliability: configure logging/metrics, job run health, SLAs/SLOs, alerting, and incident runbooks.
- Cutover planning: parallel runs, backfills, data reconciliation, defect triage, rollout, and decommissioning of legacy jobs.
- Documentation and knowledge transfer: architecture diagrams, runbooks, playbooks, and training for client teams.

Required Skills and Experience
- 7 to 8 years in data engineering/ETL modernization with demonstrable migration projects from Informatica or Talend to ELT on cloud data warehouses.
- Hands-on with Snowflake (warehouses, tasks, streams, time travel, Query Profile), performance tuning, security policies, and cost governance.
- Strong dbt expertise: model design, Jinja/macros, packages, exposures, snapshots, environment promotion, and CI/CD with Git.
- Practical Fivetran experience: connector configuration, sync scheduling, historical backfills, log-based CDC, schema drift handling.
- Advanced SQL proficiency, with Python preferred for utilities (e.g., migration scripts, validation).
- Experience building star/snowflake schemas, dimensional models, and semantic layers for BI tools (Power BI/Looker/Tableau).
- DataOps and CI/CD: branching strategies, automated tests, deployment pipelines, environment management.
- Data quality frameworks and reconciliation techniques for migration sign-off.
- Strong stakeholder management: work with architects, security, BI, and business SMEs in phased migration programs.

Deliverables
- Current-state inventory and dependency map of legacy pipelines.
- Target-state architecture and migration plan, including cutover strategy.
- Re-platformed pipelines: Fivetran connectors, dbt project(s), Snowflake schemas, roles, and policies.
Posted 3 weeks ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

Join our team to enhance data governance and quality, leveraging your expertise in metadata management. As a Quality Analyst I within the Data Analytics team, you will play a crucial role in improving data quality and consistency. You will be responsible for gathering, validating, and documenting metadata for critical business data elements, ensuring their fitness for use across various environments.

Job Responsibilities
- Identify and remediate data quality and metadata issues.
- Contribute to metadata gathering, reviewing, and publishing efforts.
- Understand and apply data policies, standards, and processes.
- Provide regular status updates to management and stakeholders.
- Collaborate with SMEs to enhance understanding of business processes and data usage.
- Update terms and definitions for critical data elements.
- Develop tools and processes for efficient metadata management.
- Produce managerial and regulatory reporting of metadata issues.

Required Qualifications, Capabilities, and Skills
- Bachelor’s degree in business or a related field with 6+ years in data consumption, metadata, or data quality.
- 5+ years in a major financial services organization.
- 3+ years in data usage, process analysis, or technology development/support.
- Proficiency in Alteryx, Tableau, AWS, Snowflake, LLMs, SQL.
- Strong analytical and problem-solving skills.
- Excellent communication skills.
- Advanced MS Office suite skills.

Preferred Qualifications, Capabilities, and Skills
- Working knowledge of Tableau, JIRA, Collibra Databook, SharePoint, Confluence, Alteryx, Collibra Data Quality.

ABOUT US

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands.
Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team

Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.

The CCB Data & Analytics team responsibly leverages data across Chase to build competitive advantages for the businesses while providing value and protection for customers. The team encompasses a variety of disciplines from data governance and strategy to reporting, data science and machine learning. We have a strong partnership with Technology, which provides cutting edge data and analytics infrastructure. The team powers Chase with insights to create the best customer and business outcomes.
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As a Data Catalog Consultant/Manager, you will be responsible for designing and implementing data catalog solutions that improve data discovery, accessibility, and governance across our clients' organizations. You will leverage your expertise in Collibra, IDMC, and Purview to develop robust data catalog frameworks and collaborate with cross-functional teams to align data assets with business objectives.

Key Responsibilities:

Data Catalog Strategy and Implementation:
- Engage with clients to understand their data catalog and governance needs.
- Design and implement comprehensive data catalog solutions using Collibra, IDMC, and Purview.
- Provide expert advice on industry best practices and emerging trends in data cataloging and governance.

Collibra, IDMC, and Purview Expertise:
- Utilize Collibra, IDMC, and Purview to manage and enhance data catalog processes.
- Configure and customize data catalog tools to meet specific client requirements.
- Ensure seamless integration of data catalog solutions with existing data governance systems.

Monitoring and Continuous Improvement:
- Establish data catalog metrics and KPIs to assess effectiveness and drive continuous improvement.
- Conduct regular audits and assessments to ensure data catalog standards are maintained.
- Facilitate workshops and training sessions to promote data catalog awareness and best practices.

Collaboration and Leadership:
- Collaborate with data architects, data analysts, IT, legal, and compliance teams to integrate data cataloging into broader data management initiatives.
- Mentor and guide junior team members, fostering a culture of knowledge sharing and professional growth.

Qualifications:
- Bachelor’s or Master’s degree in Information Systems, Data Management, Computer Science, or a related field.
- Proven experience in data cataloging and governance, preferably within a consulting environment.
- Expertise in Collibra, Informatica Data Management Cloud (IDMC), and Microsoft Purview.
- Strong understanding of data governance frameworks, tools, and technologies.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills, with the ability to influence stakeholders at all levels.
- Relevant certifications in Collibra, Informatica, and Microsoft Purview are advantageous.

What We Offer:
- A dynamic and inclusive work environment that values collaboration and innovation.
- Opportunities for professional development and career advancement.
- Competitive salary and benefits package.
- The chance to work with industry-leading clients and cutting-edge technologies.
Posted 3 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Role Summary:

As a Data Quality Consultant, you will play a crucial role in enhancing data quality across our clients' organizations. You will be responsible for implementing data quality frameworks and solutions using tools such as Collibra and IDMC, ensuring data integrity, accuracy, and accessibility. You will collaborate with cross-functional teams to identify data quality gaps and deliver tailored solutions that support our clients’ business objectives.

Key Responsibilities:

Data Quality Strategy and Implementation:
- Engage with clients to understand their data quality requirements and business goals.
- Develop and implement data quality frameworks and solutions using tools such as Collibra and IDMC.
- Provide expert advice on industry best practices and emerging trends in data quality management.

Tool Expertise:
- Utilize DQ tools such as Collibra, Talend, and IDMC to manage and enhance data quality processes.
- Configure and customize Collibra workflows and IDMC data management solutions to meet specific client needs.
- Ensure seamless integration of data quality tools with existing data governance systems.

Monitoring and Continuous Improvement:
- Establish data quality metrics and KPIs to assess effectiveness and drive continuous improvement.
- Conduct regular audits and assessments to ensure data quality standards are maintained.
- Facilitate workshops and training sessions to promote data quality awareness and best practices.

Collaboration and Leadership:
- Work collaboratively with data architects, data analysts, IT, legal, and compliance teams to integrate data quality into broader data management initiatives.
- Mentor and guide junior team members, fostering a culture of knowledge sharing and professional growth.

Qualifications:
- Bachelor’s or Master’s degree in Information Systems, Data Management, Computer Science, or a related field.
- Proven experience in data quality management, preferably within a consulting environment.
- Expertise in DQ tools such as Collibra and Informatica Data Management Cloud (IDMC).
- Strong understanding of data governance frameworks, tools, and technologies.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills, with the ability to influence stakeholders at all levels.
- Relevant certifications in Collibra and Informatica are advantageous.

What We Offer:
- A dynamic and inclusive work environment that values collaboration and innovation.
- Opportunities for professional development and career advancement.
- Competitive salary and benefits package.
- The chance to work with industry-leading clients and cutting-edge technologies.
Posted 3 weeks ago
4.0 - 9.0 years
6 - 14 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Roles and Responsibilities
- Collaborate with cross-functional teams to design, develop, test, deploy, and maintain Collibra DQ solutions.
- Ensure seamless integration of Collibra DQ with other systems using APIs.
- Provide technical guidance on data governance best practices to stakeholders.
- Troubleshoot issues related to Collibra DQ implementation and provide timely resolutions.
- Participate in agile development methodologies such as Scrum.

Desired Candidate Profile
- 4-9 years of experience in Collibra Data Quality (DQ) development or similar roles.
- Strong understanding of SQL queries for data extraction and manipulation.
- Experience working with API integrations for system connectivity.
- Bachelor's degree in any specialization (BCA or B.Sc).
- Proficiency in agile tools for testing purposes.
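Data quality rules of the kind a DQ platform such as Collibra DQ expresses (null checks, formats, allowed ranges) can be prototyped as plain predicates before being configured in the tool. This is a hedged sketch only; the rule names, record fields, and sample data are invented for illustration and are not Collibra DQ syntax:

```python
import re

# Illustrative DQ rules as (name, predicate) pairs. In a DQ platform these
# would be configured as dataset-level checks rather than hand-coded.
RULES = [
    ("email_format", lambda r: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r["email"] or "") is not None),
    ("amount_non_negative", lambda r: r["amount"] is not None and r["amount"] >= 0),
    ("id_present", lambda r: bool(r["id"])),
]

def run_rules(records):
    """Return {rule_name: [indices of failing records]} for triage."""
    failures = {name: [] for name, _ in RULES}
    for i, rec in enumerate(records):
        for name, check in RULES:
            if not check(rec):
                failures[name].append(i)
    return failures

records = [
    {"id": "A1", "email": "a@example.com", "amount": 10.0},
    {"id": "",   "email": "not-an-email",  "amount": -5.0},
]
print(run_rules(records))  # record 1 fails all three rules
```

Prototyping rules this way lets a developer agree failure criteria with stakeholders in SQL-adjacent terms before committing them to the platform's rule engine.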
Posted 3 weeks ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Governance Lead
Location: Pune/Hyderabad
Job Type: Full time
Years of Experience: 10+ years

About Straive:

Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions to multiple domains. Data Analytics & AI Solutions, Data AI Powered Operations, and Education & Learning form the core pillars of the company’s long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building data analytics and AI enterprise capabilities for them. With a client base spanning 30 countries worldwide, Straive’s multi-geographical resource pool is strategically located in eight countries - India, Philippines, USA, Nicaragua, Vietnam, United Kingdom, and the company headquarters in Singapore.

Website: https://www.straive.com/

Overview/Objective:

The Data Governance Lead will serve as a hands-on leader to drive the development and implementation of our enterprise-wide data governance framework. This role is critical to solving current data challenges, including inconsistent business rules, unclear data ownership, lack of data quality standards, fragmented business taxonomies, and absence of a centralized data catalog.
The successful candidate will serve as a bridge between business and technical stakeholders to establish scalable governance practices that increase trust in data, reduce inefficiencies, and enable data-driven decision-making across the organization.

Responsibilities:

● Governance Framework & Standards
○ Design and implement a fit-for-purpose enterprise data governance framework.
○ Develop and maintain policies, procedures, and standards for data governance, including data ownership, stewardship, access, and usage.
○ Lead the rollout of role definitions for data owners and stewards across the business.

● Business Rules & Taxonomy Management
○ Partner with business units to identify, standardize, and document core business rules and data definitions.
○ Drive alignment and version control of business taxonomies, ensuring they are centrally maintained and regularly updated.
○ Establish a governance process for creating, updating, and retiring taxonomy elements and business rules.

● Data Quality & Accountability
○ Define and implement data quality dimensions (accuracy, completeness, timeliness, consistency, etc.) across key data domains.
○ Collaborate with functional leads to establish and maintain quality rules and thresholds.
○ Create a feedback loop to monitor data issues and ensure accountability for resolution.

● Data Ownership & Stewardship
○ Lead the identification and onboarding of data owners and stewards.
○ Facilitate training and engagement programs to support the ongoing participation of governance roles.
○ Establish a RACI model to ensure clarity of responsibilities and decision-making authority.

● Metadata Management & Cataloging
○ Lead the selection and implementation of a data catalog and metadata management solution (if not already in place).
○ Work with technical teams to populate and maintain metadata in a centralized platform.
○ Promote catalog adoption and usage across the enterprise.
● Cross-Functional Engagement ○ Partner with data platform, engineering, analytics, legal, compliance, and business teams to embed governance in daily operations. ○ Serve as a subject matter expert and advisor on data governance for major initiatives and projects. Qualifications: ● 10+ years of experience in data management, with 7+ years in Data Governance roles. ● Proven track record designing and operationalizing governance frameworks at scale. ● Strong knowledge of data quality principles, metadata management, business rules management, and metadata standards. ● Hands-on knowledge with data governance platforms (e.g., Collibra, Alation, Informatic) is a plus. ● Familiarity with Google cloud platform(GCP) ● Exceptional communication and stakeholder management skills. ● Ability to operate both strategically and tactically in a cross-functional environment.
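The data quality dimensions this role defines (accuracy, completeness, timeliness, consistency) are typically operationalized as measurable rules per domain. A minimal sketch in Python of completeness and timeliness checks; field names and thresholds here are purely illustrative, not any particular organization's standards:

```python
from datetime import datetime, timedelta, timezone

def completeness(records, field):
    """Share of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def timeliness(records, ts_field, max_age):
    """Share of records updated within `max_age` of now."""
    if not records:
        return 0.0
    cutoff = datetime.now(timezone.utc) - max_age
    fresh = sum(1 for r in records if r[ts_field] >= cutoff)
    return fresh / len(records)

# Illustrative use: score a hypothetical customer domain.
customers = [
    {"id": 1, "email": "a@x.com", "updated": datetime.now(timezone.utc)},
    {"id": 2, "email": "", "updated": datetime.now(timezone.utc) - timedelta(days=90)},
]
scores = {
    "completeness(email)": completeness(customers, "email"),
    "timeliness(30d)": timeliness(customers, "updated", timedelta(days=30)),
}
```

Scores like these are then compared against the thresholds agreed with functional leads, feeding the accountability loop described above.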
Posted 3 weeks ago
3.0 - 5.0 years
0 Lacs
hyderabad, telangana, india
On-site
About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:
Data Governance Tooling & Lifecycle Mgmt. Engineering Support (Supervisor, Data Operations & Management)
As the Manager of Data Governance Tooling & Lifecycle Management Engineering Support, you will play a key role in implementing, maintaining, and optimizing enterprise data governance tools and lifecycle automation processes. This hands-on role supports metadata management, policy execution, and data lifecycle tracking across cloud-native platforms including Google Cloud (BigQuery) and AWS (Redshift). You’ll work closely with governance, engineering, and compliance teams to ensure data is cataloged, classified, accessible, and managed throughout its lifecycle.

Who we are looking for:
Primary Responsibilities:
Governance Tooling Implementation & Support
Implement and manage data governance platforms such as Collibra, including configuration, workflow automation, integrations, and user management.
Maintain metadata harvesting, classification, and cataloging across cloud environments.
Ensure accurate population of business and technical metadata, including lineage and stewardship assignments.

Lifecycle Management Automation:
Engineer and support lifecycle governance for data assets, from creation to archival, across GCP and AWS.
Develop automation scripts and pipelines to enforce data retention, purging, and archival policies.
Collaborate with infrastructure teams to apply lifecycle rules across storage and warehouse systems.
Metadata & Integration Enablement:
Integrate governance tooling with cloud-native platforms like BigQuery, Redshift, GCS, and S3 to maintain real-time visibility into data usage and quality.
Support lineage capture across pipelines and systems, including orchestration tools (e.g., Airflow, Cloud Composer).
Align metadata models with organizational taxonomy and business glossaries.

Policy Execution & Compliance Support:
Implement automated policy rules related to data classification, access control, and privacy.
Ensure tooling compliance with internal governance standards and external regulatory requirements (e.g., GDPR, HIPAA, CCPA).
Support audit processes by maintaining accurate lineage, ownership, and policy enforcement records.

Collaboration & Documentation:
Work with data stewards, engineers, and architects to support governance onboarding and issue resolution.
Maintain documentation and training materials for platform users and governance workflows.
Provide insights and recommendations for tooling improvements and scaling support across domains.

Skills:
3 to 5 years of experience in data governance engineering, metadata management, or platform operations roles.
Strong hands-on experience with:
Data governance platforms (e.g., Collibra, Alation, Informatica)
Cloud data platforms: GCP (BigQuery, GCS) / AWS (Redshift, S3)
SQL and Python for metadata extraction, pipeline integration, and automation
API integrations between governance tools and cloud platforms
Knowledge of data classification frameworks, retention policies, and regulatory compliance standards.
Bachelor’s degree in Computer Science, Data Management, Information Systems, or a related field.

Preferred Experience:
Experience supporting Retail or QSR data environments with complex, multi-market governance needs.
Exposure to CI/CD processes, Terraform/IaC, or cloud-native infrastructure tooling for lifecycle governance automation.
Familiarity with data mesh concepts and distributed stewardship operating models.
Current GCP Associate (or Professional) Certification.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.

Additional Information:
McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws.
Nothing in this job posting or description should be construed as an offer or guarantee of employment.
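The retention, purging, and archival automation described in this role usually reduces to evaluating each asset's age against a classification-based policy and routing it to an action. A minimal Python sketch under invented assumptions (the policy values, grace-window rule, and asset fields are illustrative only, not McDonald's actual standards):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: days to keep, per data classification.
RETENTION_DAYS = {"transactional": 365, "log": 90, "pii": 30}

def lifecycle_action(asset, now=None):
    """Return 'retain', 'archive', or 'purge' for a single asset record."""
    now = now or datetime.now(timezone.utc)
    limit = timedelta(days=RETENTION_DAYS[asset["classification"]])
    age = now - asset["created"]
    if age <= limit:
        return "retain"
    # Illustrative grace window: archive for one more retention period.
    if age <= 2 * limit:
        return "archive"
    return "purge"

now = datetime(2025, 1, 1, tzinfo=timezone.utc)
assets = [
    {"name": "orders_2024", "classification": "transactional",
     "created": now - timedelta(days=100)},
    {"name": "access_log", "classification": "log",
     "created": now - timedelta(days=120)},
    {"name": "old_pii", "classification": "pii",
     "created": now - timedelta(days=400)},
]
plan = {a["name"]: lifecycle_action(a, now) for a in assets}
```

In practice a plan like this would be produced by a scheduled pipeline and executed against storage and warehouse lifecycle APIs rather than in-process.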
Posted 3 weeks ago
10.0 years
0 Lacs
pune, maharashtra, india
On-site
Job Title: Data Governance Architect
Location: Pune/Hyderabad
Job Type: Full-time
Years of Experience: 10+ years

About Straive:
Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions across multiple domains. Data Analytics & AI Solutions, Data AI Powered Operations, and Education & Learning form the core pillars of the company’s long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services:
Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building enterprise data analytics and AI capabilities.

With a client base spanning 30 countries worldwide, Straive’s multi-geographical resource pool is strategically located in eight countries - India, Philippines, USA, Nicaragua, Vietnam, United Kingdom, and the company headquarters in Singapore.

Website: https://www.straive.com/

Job Summary:
We are seeking a highly skilled Data Governance Architect with strong expertise in defining and leading enterprise-wide data governance strategies, design, and governance architecture, with implementation experience in tools such as Informatica EDC/AXON, Collibra, Alation, MHUB, and other leading data governance platforms. The ideal candidate will lead data quality, consistency, and accessibility efforts across various enterprise platforms and business units.

Required Qualifications:
● Bachelor’s/Master’s degree in Information Systems, Computer Science, or a related technical field.
● Strong knowledge of data governance and architecture techniques and methodologies; experience in data governance initiatives is a must.
● Minimum 7 years of experience in data governance architecture and the implementation of data governance across a business enterprise.
● Hands-on experience designing and implementing architectural patterns for data quality, metadata management, data lineage, data security, and master data management.
● Strong hands-on expertise in Collibra (workflows, APIs, metadata integration, policy automation).
● Experience with ETL/ELT pipelines, data lineage capture, and data integration tools.
● Familiarity with data modeling (conceptual, logical, physical).
● Proficiency in SQL and Python/Java for integration and automation.
● Experience with SQL, back-end scripting, and APIs.
● Understanding of data governance principles and compliance practices.
● Proficiency in working with cloud platforms (AWS, Azure, or GCP).
● Knowledge of big data technologies (Hadoop/Spark, etc.) and data visualization and BI tools is a plus.
● Strong analytical and problem-solving skills.
● Excellent communication and stakeholder management abilities.

Roles & Responsibilities:
● Design, develop, and maintain enterprise-wide data governance architecture frameworks and metadata models.
● Establish data governance strategies, policies, standards, and procedures for compliance processes.
● Conduct maturity assessments and change management efforts.
● Evaluate and recommend data governance frameworks and tools to meet enterprise business needs.
● Design and implement architectural patterns for data catalog, data quality, metadata management, data lineage, data security, and master data management (MDM) across various data platforms (e.g., data lakes, data warehouses, operational databases).
● Create and manage data dictionaries, metadata repositories, and data catalogs.
● Architect technical and business metadata workflows and govern glossary approvals and workflows.
● Validate end-to-end lineage across multiple sources and targets.
● Design and enforce rules for classification, access, retention, and sharing of data.
● Analyze and define the enterprise business KPIs and validate data governance requirements.
● Collaborate with data stewards to define technical specifications for data quality rules, validation checks, and KPI reporting.
● Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
● Ensure data compliance with relevant regulations such as GDPR, HIPAA, CCPA, and SOX.
● Excellent communication and the ability to mentor and inspire teams.
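Validating end-to-end lineage across sources and targets, as this role requires, can be pictured as walking a graph of upstream edges and confirming every dataset resolves back to a registered source system. A minimal Python sketch; the dataset names, edge format, and "unregistered root = gap" rule are illustrative assumptions, not a specific tool's model:

```python
# Lineage edges: target dataset -> list of upstream datasets it is built from.
LINEAGE = {
    "report.sales_kpi": ["warehouse.sales_fact"],
    "warehouse.sales_fact": ["staging.orders", "staging.customers"],
    "staging.orders": ["source.erp"],
    "staging.customers": [],  # missing upstream: a lineage gap
}
REGISTERED_SOURCES = {"source.erp", "source.crm"}

def trace(dataset, seen=None):
    """Return the set of root datasets reachable upstream of `dataset`."""
    seen = seen or set()
    if dataset in seen:          # guard against cycles
        return set()
    seen.add(dataset)
    parents = LINEAGE.get(dataset)
    if not parents:              # no recorded upstream: treat as a root
        return {dataset}
    roots = set()
    for p in parents:
        roots |= trace(p, seen)
    return roots

def lineage_gaps(dataset):
    """Roots that are not registered sources, i.e. broken lineage."""
    return trace(dataset) - REGISTERED_SOURCES
```

A check like `lineage_gaps("report.sales_kpi")` surfaces `staging.customers` as a dataset whose provenance is undocumented, which is exactly the kind of finding a governance architect would route back to data stewards.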
Posted 3 weeks ago
0 years
0 Lacs
chennai, tamil nadu, india
On-site
Role Title: SAP Analytics Products & Services Director Location: Chennai We are one purpose-led global organisation. The enablers and innovators, ensuring that we can fulfil our mission to push the boundaries of science and discover and develop life-changing medicines. We take pride in working close to the cause, opening the locks to save lives, ultimately making a massive difference to the outside world. AstraZeneca (AZ) is in a period of strong growth and our employees have a united purpose to make a difference to patients around the world who need both our medicines and the ongoing developments from our science. In this journey AZ must continue to work across borders and with partners and new colleagues in a fast and seamless way. The ambition, size and complexity of the organisation, coupled with the opportunities afforded by new technology, has led the Board to approve a large-scale transformation programme – Axial. The Axial Programme will be powered by S/4HANA a new ERP (Enterprise Resource Planning) system which will be implemented right across the organisation and will provide our business with standardised processes, enhanced financial management, common data and real time reporting, transforming the way we work through our entire supply chain - from bench to patient. The new system will be used by more than 20,000 employees daily, is foundational to all AZ entities and is central to most core business processes. This is a once in a generation programme for AstraZeneca and will shape our ways of working globally for many years to come. The Axial programme needs the best talent to work in it. Whether it’s the technical skills, business understanding or change leadership, we want to ensure we have the strongest team deployed throughout. We are aiming to deliver a world class change programme that leaves all employees with a fuller understanding of their role in the end-to-end nature of our global company. 
This programme will provide AZ with a competitive edge, to the benefit of our employees, customers and patients.

What You’ll Do
We are looking for an experienced engineering leader who can accelerate delivery of process automation, DataOps, a Data-as-a-Product culture, Data Marketplace and FinOps capabilities on AstraZeneca’s SAP Data & Analytics stack. We want to build a world-class analytics and AI stack on Datasphere and SAP Analytics Cloud. To do this we need to establish capabilities around data cataloguing, documentation automation, self-serve capabilities, high-quality data for AI, DataOps, Data Products, Analytics Products and Data & Analytics Marketplace capabilities, and to establish a service catalogue and support services for those capabilities, tracking and measuring platform health with FinOps capabilities. You will run the team that builds out those capabilities, working closely with cross-functional enabling teams across the AstraZeneca landscape, such as data governance teams, platform owners, the Enterprise Data Office, architects and more. This means you are likely to have significant experience in cutting-edge engineering frameworks deployed at scale, and experience in platform management, Data Marketplace and a Data-as-a-Product culture. This experience might come from the SAP world, or it might come from other platforms. Your roadmap of deliverables will come from your own vision for engineering on the platform and from product owners.
You will know you have been successful when: data services and their processes are automated; a data-as-a-product culture is established; data pipeline health and data lineage are transparent and fully automated; there are minimal data pipeline breakages; end users are able to find and shop for data products in the Marketplace with ease through an enterprise-grade UX for self-serve; dependent systems seamlessly receive data via products/sources; the services are catalogued and tracked with metrics and measures; service consumption is measurable in value for FinOps; and AI-ready data quality is consistently available.

We are open-minded regarding your skills, experience and knowledge, as this is a broad role. This represents a significant and transformative shift for AstraZeneca, so if the idea of driving the transformation grabs your imagination, please apply.

Essential For The Role
Engineering leadership and data analytics platform management.
Implemented Data as Products and a Data Marketplace, and the governance around them.
DataOps and data observability automation.
Data services: self-service capabilities, service support for BAU.
Ability to set and drive a vision.
Stakeholder management.

Desirable For The Role
Engineering at scale, especially data warehousing, visualisation and AI; Agile delivery and product organization principles.
Experience in tools like Collibra, Immuta.
Experience in FinOps.
SAP analytics (Datasphere, SAP Analytics Cloud); digitising and automating analytics and data governance processes.

Why AstraZeneca?
At AstraZeneca we’re dedicated to being a Great Place to Work, where you are empowered to push the boundaries of science and unleash your entrepreneurial spirit. There’s no better place to make a difference to medicine, patients and society. An inclusive culture that champions diversity and collaboration, always committed to lifelong learning, growth and development. We’re on an exciting journey to pioneer the future of healthcare. So, what’s next?
Are you already imagining yourself joining our team? Good, because we can’t wait to hear from you. Are you ready to bring new ideas and fresh thinking to the table? Brilliant! We have one seat available and hope it’s yours. If you’re curious to know more, then we welcome your application no later than

Where can I find out more?
Our Social Media:
Follow AstraZeneca on LinkedIn https://www.linkedin.com/company/1603/
Follow AstraZeneca on Facebook https://www.facebook.com/astrazenecacareers/
Follow AstraZeneca on Instagram https://www.instagram.com/astrazeneca_careers/?hl=en

Date Posted: 31-Jul-2025
Closing Date:

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 3 weeks ago
5.0 - 9.0 years
0 - 1 Lacs
bengaluru
Work from Office
Role & responsibilities
Please share your resume at poonampal@kpmg.com

Summary: 6-8 years of relevant professional experience in Data Management solutioning, with major experience in Metadata Management focused on Data Dictionary and Technical Lineage (manual and automatic), along with strong programming experience in Java or Python.

Location: Bengaluru
Status: Regular, Full-Time, 100%

JOB REQUIREMENTS & RESPONSIBILITIES:
i. Responsibilities:
Development and implementation of Collibra/Informatica/Solidatus/Purview/other leading Metadata Management platforms
Development and implementation of automation use cases using Java/Python programming
Meet with business stakeholders to gather information and analyze existing processes; determine and document gaps and areas for improvement; prepare requirement documents, data flow mappings, metamodels, etc. in the area of Metadata Management
ii. Education or Certifications:
Bachelor's/Master's degree in engineering/technology/other related degrees.
Relevant certifications from Collibra/Informatica/Talend/SAS and/or other leading platforms is a must
Relevant certifications from DAMA, EDM Council and CMMI-DMM will be a bonus
iii. Work Experience:
6-8 years of relevant experience in Collibra/Informatica/Solidatus/Purview and/or other leading Metadata Management platforms
Strong hands-on experience in capturing and maintaining Data Dictionary, Business Glossary and Lineage Models
Experience in automatic technical lineage ingestion and implementation using Metadata Connectors/APIs is a must
Demonstrable experience in developing automation use cases in Java/Python programming is expected
Experience in developing custom connectors is a big plus
Client-facing Big4 Consulting experience will be considered a plus
Please share your resume at poonampal@kpmg.com
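Automatic technical-lineage ingestion via metadata connectors/APIs, as required above, typically means pulling table and column metadata from a platform API and reshaping it into data-dictionary entries and lineage edges for the catalog. A minimal sketch in Python of the reshaping step; the payload layout below is invented for illustration and does not match any specific vendor's API:

```python
def to_dictionary_entries(payload):
    """Flatten a connector payload into data-dictionary rows."""
    entries = []
    for table in payload["tables"]:
        for col in table["columns"]:
            entries.append({
                "asset": f'{table["schema"]}.{table["name"]}.{col["name"]}',
                "type": col["type"],
                "description": col.get("description", ""),
            })
    return entries

def to_lineage_edges(payload):
    """Extract (upstream, downstream) pairs for lineage ingestion."""
    return [(e["from"], e["to"]) for e in payload.get("lineage", [])]

# Illustrative connector response.
payload = {
    "tables": [{
        "schema": "sales", "name": "orders",
        "columns": [
            {"name": "order_id", "type": "INTEGER", "description": "PK"},
            {"name": "amount", "type": "NUMERIC"},
        ],
    }],
    "lineage": [{"from": "staging.orders", "to": "sales.orders"}],
}
entries = to_dictionary_entries(payload)
edges = to_lineage_edges(payload)
```

A custom connector of the kind the posting mentions would wrap functions like these around the vendor's actual REST endpoints and authentication.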
Posted 3 weeks ago
3.0 - 8.0 years
0 - 1 Lacs
hyderabad, bengaluru, delhi / ncr
Hybrid
Please share your resume directly at poonampal@kpmg.com
Attached is the job description for your reference:

Metadata Management - Metadata Specialist or Consultant
4+ years of experience in metadata management and data governance
Experience in building data dictionaries and business glossaries
Hands-on implementation experience using tools like Collibra, Alation, Informatica, Purview, etc.
Familiarity with data governance frameworks and compliance standards
Data catalog and lineage implementation experience

Data Governance - DG Officer/Consultant
5+ years of experience in data governance, data management, or a related discipline like data strategy
Strong understanding of DG frameworks such as DCAM, DAMA DMBOK or similar
Experience in the design and implementation of DG frameworks, organization structures, policies and standards
Hands-on experience with DG tools like Collibra, Alation, Informatica, Unity Catalog
Knowledge of compliance and risk management (e.g., GDPR, data privacy, CCPA)
Certifications in DG (e.g., CDMP, DCAM, Collibra Ranger, IDGC)
Experience in conducting data management maturity assessments and strategy studies

Master Data Management - MDM Architect or Senior Consultant/Developer
4+ years of relevant experience in designing, implementing, rolling out, auditing, and improving MDM solutions
Implementation experience with tools like Informatica, Reltio, Talend, etc.
Experience with ERP systems (e.g., SAP, Oracle)
Strong knowledge of data quality management processes and concepts, including matching, merging, and the creation of golden records for master data entities
ETL and data modeling experience
Experience in master data governance is an added advantage

Data Quality Analysts - DQ Consultant/Analyst
4+ years of experience in designing, implementing, rolling out, auditing, and improving Data Quality solutions
Strong understanding of and exposure to DQ principles and DQM processes, including data lifecycle, data profiling, and data quality remediation (cleansing, parsing, standardization, etc.)
Experience with DQ tools and automation (e.g., IDQ, Alteryx, Unity Catalog, Python, SQL)
Experience in creating metrics and scorecards
Please share your resume directly at poonampal@kpmg.com
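The matching, merging, and golden-record concepts named in the MDM role above can be illustrated with a toy example: group candidate records on a match key, then merge each group by letting the most recently updated non-empty value win per field. A hypothetical Python sketch; matching on normalized email is a deliberate simplification of the probabilistic matching real MDM tools perform:

```python
from collections import defaultdict

def match_key(record):
    """Naive deterministic match rule: normalized email address."""
    return record["email"].strip().lower()

def merge(group):
    """Build a golden record: latest non-empty value wins per field."""
    golden = {}
    for rec in sorted(group, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value not in (None, ""):
                golden[field] = value
    return golden

def golden_records(records):
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    return [merge(g) for g in groups.values()]

# Two source records describing the same customer.
crm = [
    {"email": "Jane@X.com ", "name": "Jane D.", "phone": "", "updated": 1},
    {"email": "jane@x.com", "name": "Jane Doe", "phone": "555-1234", "updated": 2},
]
golden = golden_records(crm)
```

Note how the empty phone from the first record is not allowed to overwrite the populated value, one of the survivorship rules a golden-record design must make explicit.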
Posted 3 weeks ago
4.0 years
0 Lacs
hyderabad, telangana, india
Remote
About Tide
At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 employees. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.

About The Role
As part of the team, you will be responsible for building and running the data pipelines and services that are required to support business functions, reports, and dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.

As a Data Engineer You’ll Be:
Developing end-to-end ETL/ELT pipelines, working with the Data Analysts of business functions.
Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
Mentoring other junior engineers in the team
Being a “go-to” expert for data technologies and solutions
Providing on-the-ground troubleshooting and diagnosis for architecture and design challenges
Troubleshooting and resolving technical issues as they arise
Looking for ways of improving both what and how data pipelines are delivered by the department
Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests and reports
Owning the delivery of data models and reports end to end
Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future
Working with Data Analysts to ensure that all data feeds are optimised and available at the required times. This can include Change Capture, Change Data Control and other “delta loading” approaches
Discovering, transforming, testing, deploying and documenting data sources
Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
Building Looker dashboards for use cases if required

What We Are Looking For:
You have 4+ years of extensive development experience using Snowflake or similar data warehouse technology
You have working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, git, Looker
You have experience in agile processes, such as SCRUM
You have extensive experience in writing advanced SQL statements and performance-tuning them
You have experience in data ingestion techniques using custom or SaaS tools like Fivetran
You have experience in data modelling and can optimise existing/new data models
You have experience in data mining, data warehouse solutions, and
ETL, and using databases in a business environment with large-scale, complex datasets
You have experience architecting analytical databases (in a Data Mesh architecture is an added advantage)
You have experience working in an agile cross-functional delivery team
You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment
You have strong technical documentation skills and the ability to be clear and precise with business users
You have a business level of English and good communication skills
You have a basic understanding of various systems across the AWS platform (good to have)
Preferably, you have worked in a digitally native company, ideally fintech
Experience with Python, a governance tool (e.g. Atlan, Alation, Collibra) or a data quality tool (e.g. Great Expectations, Monte Carlo, Soda) will be an added advantage

Our Tech Stack:
dbt
Snowflake
Airflow
Fivetran
SQL
Looker

What You’ll Get In Return:
Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you’ll get:
Competitive salary
Self & Family Health Insurance
Term & Life Insurance
OPD Benefits
Mental wellbeing through Plumm
Learning & Development Budget
WFH Setup allowance
15 days of Privilege leaves
12 days of Casual leaves
12 days of Sick leaves
3 paid days off for volunteering or L&D activities
Stock Options

Tidean Ways Of Working
At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration.
Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community.

TIDE IS A PLACE FOR EVERYONE
At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members’ diverse needs and lives.
We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard.
At Tide, we thrive on diversity, embracing various backgrounds and experiences. We welcome all individuals regardless of ethnicity, religion, sexual orientation, gender identity, or disability. Our inclusive culture is key to our success, helping us build products that meet our members' diverse needs. We are One Team, committed to transparency and ensuring everyone’s voice is heard.
Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.
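The “delta loading” approaches named in the Tide role (Change Capture and similar) reduce, at their simplest, to tracking a high-water mark and extracting only rows changed since the last run. A minimal Python sketch under invented assumptions (the table shape and `updated_at` column are illustrative, not Tide's schema):

```python
def delta_load(source_rows, state, ts_field="updated_at"):
    """Incrementally pull rows newer than the stored watermark.

    `state` holds the last high-water mark; only rows with a strictly
    greater timestamp are extracted, then the watermark advances.
    """
    watermark = state.get("watermark", 0)
    new_rows = [r for r in source_rows if r[ts_field] > watermark]
    if new_rows:
        state["watermark"] = max(r[ts_field] for r in new_rows)
    return new_rows

source = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 200},
]
state = {}
first = delta_load(source, state)    # initial run: full load
source.append({"id": 3, "updated_at": 300})
second = delta_load(source, state)   # next run: only the new row
```

Tools like Fivetran and incremental dbt models apply the same idea at scale, persisting the watermark between scheduled runs instead of in a dict.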
Posted 3 weeks ago
5.0 - 10.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Job Title: Senior Data Architect
Years of Experience: 5 - 10 years

Job Description:
The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured/unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments.

Key Responsibilities:
· Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns
· Design logical and physical data models, semantic layers, and metadata frameworks
· Establish data quality, lineage, governance, and security policies
· Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks
· Integrate AI and analytics solutions with operational data platforms
· Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake
· Lead architecture reviews, design sessions, and CoE reference architecture development

Technical Skills:
· Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift
· Data Modeling: ERwin, dbt, PowerDesigner
· Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark
· Integration: Azure Data Factory, Kafka, Event Grid, SSIS
· Metadata/Lineage: Purview, Collibra, Informatica
· BI Platforms: Power BI, Tableau, Looker
· Security & Compliance: RBAC, encryption at rest/in transit, NIST/FISMA

Qualifications:
· Bachelor’s or Master’s in Computer Science, Information Systems, or Data Engineering
· Microsoft Certified: Azure Data Engineer / Azure Solutions Architect
· Strong experience building cloud-native data architectures
· Demonstrated ability to create data blueprints aligned with business strategy and compliance.
Posted 3 weeks ago
5.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Job Description: Collibra Techno-Functional Consultant
Company: Guardian Life Insurance Company
Location: Chennai
Job Type: Full-Time

About Guardian Life
Guardian Life is a leading provider of life insurance, disability income, and employee benefits. We are dedicated to delivering exceptional products and services to meet our clients' needs. Our commitment to innovation and excellence makes us a great place to grow your career.

Role Overview
We are seeking a highly skilled Data Stewardship professional with techno-functional expertise in Collibra implementation to join our Data & AI team. This role focuses on ensuring data quality, integrity, and compliance while driving the implementation and optimization of Collibra solutions within our organization. You will manage business relationships, partnering closely with Technology, Enterprise Data, Enterprise Analytics, and others to ensure an integrated approach.

Key Responsibilities:
Technical Skills
Experience in Data Management, including Business Analytics, Data Analytics, Data Governance, Data Privacy, Data Lineage, Data Stewardship, Data Dictionary, and Data Quality.
Configure workflows, metadata management, and data catalog functionalities within Collibra.
Data Quality Logic Development: Proficiency in designing and implementing technical rules, validation checks, and business logic to ensure data accuracy, completeness, and consistency.
Workflow Automation: Expertise in developing automated workflows using tools like Collibra, Informatica, or Alteryx to streamline data quality processes and monitoring.
User Interface (UI) Design: Experience in creating intuitive and user-friendly interfaces for Collibra or similar data management tools to enhance usability and adoption by business users.
Create and maintain dashboards, reports, and visualizations to support business decisions.
Data Management Platform Configuration: Advanced skills in configuring Collibra's metadata and data governance modules, including customization of policies, data lineage, and collaboration features.
Analytical Skills
Root Cause Analysis: Strong ability to use data analysis techniques to identify root causes of data quality issues and recommend actionable solutions.
Business Impact Assessment: Experience in quantifying the financial and operational impact of poor data quality and demonstrating the benefits of high-quality data products and subscriptions.
Cost-Benefit Analysis: Skilled in evaluating the return on investment (ROI) of curated data products and data subscriptions, including their impact on decision-making and efficiency.
Data and Technical Debt Reduction: Analytical expertise in identifying and mitigating redundant, outdated, or unused datasets and technologies to optimize data environments and reduce costs.
Work closely with data engineering and business intelligence teams to address data challenges.
Data Governance and Management
Collaborate with stakeholders to define data standards, policies, and best practices.
Monitor and ensure adherence to data governance and stewardship principles.
Establish and maintain a data glossary and data lineage documentation.
Facilitate communication between business and technical teams for data-related initiatives.
Qualifications:
Education: Bachelor's degree in Computer Science, Information Systems, Data Management, or a related field. Master's degree preferred.
Experience: 5+ years of work experience in data strategy, management, and governance, preferably in an insurance or other data-intensive industry.
Hands-on experience with Collibra Data Governance and Collibra Data Quality tools (certifications are a plus).
Proven track record of implementing and managing data governance frameworks.
Skills: Strong understanding of data management, metadata, data lineage, and data quality principles.
Hands-on experience in writing and developing SQL queries.
Working knowledge of Databricks is desirable.
Experience in data analysis and visualization.
Ability to work collaboratively across segments and cultures.
Effective and structured communication skills.
Location: This position can be based in any of the following locations: Chennai.
Current Guardian colleagues: Please apply through the internal Jobs Hub in Workday.
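The role above combines hands-on SQL with designing data quality validation checks. A minimal sketch of what such a rule looks like, using Python's built-in sqlite3 as the SQL engine: a completeness check that flags columns whose null rate exceeds a threshold. The table, columns, and threshold are all hypothetical.

```python
import sqlite3

# Hypothetical policyholder table used only to illustrate the rule.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE policyholder (id INTEGER, email TEXT, dob TEXT)")
cur.executemany("INSERT INTO policyholder VALUES (?, ?, ?)", [
    (1, "a@example.com", "1980-01-01"),
    (2, None,            "1975-06-30"),
    (3, "c@example.com", None),
    (4, None,            "1990-12-12"),
])

def null_rate(column):
    # Fraction of rows where the column is NULL.
    cur.execute(f"SELECT AVG(CASE WHEN {column} IS NULL "
                f"THEN 1.0 ELSE 0.0 END) FROM policyholder")
    return cur.fetchone()[0]

# Completeness rule: fail any column with more than 25% nulls.
THRESHOLD = 0.25
failures = [c for c in ("email", "dob") if null_rate(c) > THRESHOLD]
print(failures)  # ['email']  (email is 50% null, dob exactly 25%)
```

In a Collibra Data Quality deployment the same logic would typically live as a rule definition against the governed source rather than inline Python, but the pass/fail structure is the same.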
Posted 3 weeks ago
5.0 years
0 Lacs
hyderabad, telangana, india
On-site
We are looking for a Data Governance Collibra Specialist to join our clients' data governance initiatives. This role is vital for leveraging the Collibra platform to enhance data management practices, ensuring data quality, compliance, and effective governance. You will be responsible for implementing and optimizing Collibra solutions, enabling businesses to manage their data stewardship and governance processes effectively.
Requirements
Key Responsibilities:
Implement and configure the Collibra platform to support data governance frameworks
Develop workflows, cataloging, and data lineage within Collibra to facilitate better data management
Collaborate with business stakeholders to gather requirements and align Collibra features with data governance objectives
Provide training and support to teams on leveraging Collibra for data governance activities
Monitor enhancements and upgrades in Collibra and assess their impact on existing governance processes
Establish data quality metrics within Collibra and generate reports on data governance effectiveness
Engage with cross-functional teams to promote understanding and adherence to data governance policies
Stay abreast of trends and best practices in data governance and Collibra capabilities
Required Qualifications:
Bachelor's degree in Information Technology, Information Management, or a related field
5+ years of experience in data governance, data management, or related fields, with a specific focus on Collibra
Proven experience in implementing and managing Collibra as a data governance tool
Strong understanding of data governance principles, practices, and related technologies
Excellent analytical skills and proficiency in data modeling and data quality concepts
Experience working with cross-functional teams and engaging stakeholders effectively
Strong communication and presentation skills to convey technical concepts to non-technical audiences
Ability to work independently and in a team environment
Posted 3 weeks ago
7.0 years
0 Lacs
india
Remote
Job Title: ETL Developer – DataStage, AWS, Snowflake
Experience: 5–7 Years
Location: Remote
Job Type: Full-time
About the Role
We are looking for a talented and motivated ETL Developer / Senior Developer to join our data engineering team. You will work on building scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS Glue, and Snowflake. You will collaborate with architects, business analysts, and data modelers to ensure timely and accurate delivery of critical data assets supporting analytics and AI/ML use cases.
Key Responsibilities
Design, develop, and maintain ETL pipelines using IBM DataStage (CP4D) and AWS Glue/Lambda for ingestion from varied sources such as flat files, APIs, Oracle, and DB2.
Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic.
Participate in code reviews, performance tuning, and defect triage sessions.
Work closely with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
Troubleshoot and resolve issues in QA/UAT/Production environments as needed.
Adhere to agile delivery practices, sprint planning, and documentation requirements.
Required Skills and Experience
4+ years of experience in ETL development, with at least 1–2 years in IBM DataStage (preferably the CP4D version).
Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-based processing.
Experience working with Snowflake: loading strategies, streams and tasks, zero-copy cloning, and performance tuning.
Proficiency in SQL, Unix scripting, and basic Python for data handling or automation.
Familiarity with S3, version control systems (Git), and job orchestration tools.
Experience with data profiling, cleansing, and quality validation routines.
Understanding of data lake/data warehouse architectures and DevOps practices.
Good to Have
Experience with Collibra, BigID, or other metadata/governance tools
Exposure to Data Mesh/Data Domain models
Experience with agile/Scrum delivery and Jira/Confluence tools
AWS or Snowflake certification is a plus
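The pipeline work this posting describes follows the extract-transform-load pattern: ingest from flat files or APIs, apply cleansing and quality validation, then load curated rows into the warehouse. A minimal stdlib-only sketch of that shape, where a CSV string stands in for a flat-file source, sqlite3 stands in for the Snowflake target, and all field names are invented:

```python
import csv, io, sqlite3

# Flat-file source with one bad row (missing amount). Illustrative only.
RAW = """order_id,amount,currency
1001,250.00,USD
1002,,USD
1003,99.50,EUR
"""

def extract(text):
    # Extract: parse the flat file into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cleansing rule drops rows with a missing amount,
    # then casts fields to their target types.
    good = [r for r in rows if r["amount"]]
    return [(int(r["order_id"]), float(r["amount"]), r["currency"])
            for r in good]

def load(rows):
    # Load: write curated rows to the target table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE curated_orders "
                 "(order_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO curated_orders VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW)))
loaded = conn.execute("SELECT COUNT(*) FROM curated_orders").fetchone()[0]
print(loaded)  # 2 rows survive the not-null rule
```

In DataStage or Glue each function would be a pipeline stage, but the structure, including embedding the quality check inside the transform, carries over directly.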
Posted 3 weeks ago
10.0 - 20.0 years
15 - 30 Lacs
hyderabad, chennai, bengaluru
Work from Office
10+ years of overall technical experience; any knowledge of technology around Python/GenAI is a plus.
5+ years in Data Governance tool solutions/implementations; can be Alation, Collibra, Atlan, Purview, Unity Catalog, or others.
Hands-on experience with Data Governance solutions, with a good understanding of the below:
Data Catalog
Business Glossary
Business metadata, technical metadata, operational metadata
Data Quality (IDQ/Anomalo/iceDQ/custom DQ APIs)
Data Profiling
Data Lineage
Automation on any of the above
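Data profiling, listed above, typically means computing summary statistics per column before governance rules are written. A minimal sketch, with an invented dataset, reporting two of the most common profiling metrics (null count and distinct count) for each column:

```python
# Illustrative records; in practice these would come from a catalog scan
# or a sampled query against the governed source.
rows = [
    {"name": "Asha",  "city": "Hyderabad", "grade": "A"},
    {"name": "Ravi",  "city": None,        "grade": "B"},
    {"name": "Meena", "city": "Chennai",   "grade": "A"},
]

def profile(rows):
    # For each column: count nulls and distinct non-null values.
    report = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {"nulls": len(values) - len(non_null),
                       "distinct": len(set(non_null))}
    return report

print(profile(rows))
# {'name': {'nulls': 0, 'distinct': 3},
#  'city': {'nulls': 1, 'distinct': 2},
#  'grade': {'nulls': 0, 'distinct': 2}}
```

Tools like IDQ or Anomalo produce far richer profiles (patterns, ranges, drift), but this null/distinct pair is the usual starting point for deciding which columns need quality rules.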
Posted 3 weeks ago
7.0 - 12.0 years
0 - 0 Lacs
bengaluru
Hybrid
Greetings, Dear Candidate,
Urgent requirement for a Data Governance Lead with a CMM Level 5 client - Permanent position
Experience Required: 7 to 12 years overall
Location: Bangalore, Hybrid Mode
Notice Period: Immediate to 30 days max
8+ years of experience in the Data Governance domain using Collibra & RDM
Strong on Data Quality, SQL & Unix
Strong knowledge of support processes & ITIL
If interested, kindly share your CV to manasa@skyonn.com; please refer your friends and colleagues also.
Thank you.
Posted 3 weeks ago
10.0 years
0 Lacs
pune, maharashtra, india
On-site
Job Title: Data Architect
Experience Level: 10+ Years
Job Overview
We are seeking a highly experienced and versatile Data Architect with over 10 years of experience. This role requires deep expertise in data architecture and data engineering, along with exceptional client management and team leadership skills. The ideal candidate will be both strategic and hands-on, able to lead complex data initiatives, mentor teams, manage client relationships, and directly contribute to the design and implementation of scalable data solutions.
Key Responsibilities
Enterprise Data Strategy & Client Engagement:
Develop and maintain a comprehensive data architecture strategy that aligns with organizational and client business objectives.
Serve as a key technical advisor for clients, translating business requirements into innovative data solutions.
Build and maintain strong client relationships by providing expert guidance and managing expectations throughout project lifecycles.
Data Modeling, Design & Engineering:
Design and optimize both logical and physical data models to support enterprise-wide systems.
Architect data warehousing solutions, overseeing the integration of data from multiple sources to enable robust business intelligence and analytics.
Directly develop, test, and implement ETL processes and data pipelines, ensuring data quality, consistency, and performance.
Technology Evaluation & Implementation:
Evaluate emerging data technologies and tools to determine their fit within the existing architecture and potential for future scalability.
Oversee the integration of new technologies into the enterprise data architecture, balancing innovation with risk management.
Team Leadership & Hands-On Management:
Lead cross-functional teams, providing mentorship and technical guidance to junior data engineers and architects.
Maintain a hands-on approach by actively participating in coding, design sessions, and troubleshooting complex data issues.
Ensure project milestones are met through effective resource management and team coordination.
Performance, Security & Best Practices:
Optimize data storage, retrieval, and processing performance across various systems.
Collaborate with security teams to enforce data governance, compliance, and privacy standards.
Establish and promote best practices in data management, data engineering, and architecture design.
Documentation & Reporting:
Develop and maintain comprehensive documentation covering data architecture designs, data flows, integration processes, and project status.
Provide regular updates and reports to both internal stakeholders and clients on project progress and system performance.
Required Qualifications
Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
Experience: 10+ years of experience in data architecture, data engineering, or related roles. Proven experience in designing and implementing enterprise-level data solutions with a hands-on technical approach. Demonstrated track record of managing client relationships and leading technical teams.
Technical Skills:
Expertise in data modeling, data warehousing, and database design (both relational and NoSQL).
Strong proficiency in data engineering, including experience with ETL tools, data integration frameworks, and big data technologies.
Hands-on experience with cloud data platforms (e.g., Azure, Google Cloud) and modern data processing frameworks.
Familiarity with scripting and programming languages (e.g., Python, SQL) to support hands-on development and troubleshooting.
Experience with data governance frameworks and solutions (e.g., Informatica, Collibra, Purview).
Soft Skills:
Exceptional client management and communication skills, with the ability to interact confidently with both technical and non-technical stakeholders.
Proven team management and leadership abilities, including mentoring, coaching, and project management.
Strong analytical and problem-solving skills with a proactive, detail-oriented approach.
Ability to work collaboratively in a fast-paced, dynamic environment while driving multiple projects to successful completion.
Certifications (Preferred): Relevant certifications such as Azure Solutions Architect, Certified Data Management Professional (CDMP), or similar credentials.
Posted 3 weeks ago
5.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Greetings from HCLTECH!
Role: Data Governance - Collibra
Location: Pan India
Experience: 5 Years to 10 Years
Relevant Years in Collibra: 3 Years
Notice Period: Immediate to 30 Days
Collibra Data Quality: Mandatory
Collibra Data Quality (DQ), including rule creation, monitoring, and troubleshooting.
Experience in application support, DevOps, or data engineering.
Understanding of Kubernetes concepts including pods, services, deployments, config maps, and secrets.
Experience with Docker, Helm, and CI/CD tools (e.g., Jenkins, GitHub Actions).
Proficiency in SQL and basic scripting (e.g., Shell, Python).
Knowledge of JDBC/ODBC connectors, REST APIs, and enterprise data integration patterns.
Exposure to cloud environments (AWS, Azure, or GCP) and managing DQ components in cloud-native architectures.
Soft Skills:
Strong analytical and problem-solving skills.
Excellent verbal and written communication.
Self-driven with a continuous improvement mindset.
Should be flexible to work in shifts.
Should be open to learning new tools and technologies.
Interested candidates, please share your resume to amrin.a@hcltech.com, along with the below details:
Candidate Name
Contact Number
Email ID
Total Years of Experience
Relevant Years
Notice Period
Current Company
Current Location
Preferred Location
CCTC
ECTC
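The mandatory skill above is rule creation and monitoring in Collibra Data Quality. As a hedged, tool-agnostic sketch of that workflow: each rule is a name, a row-level predicate, and a minimum pass rate, and monitoring means computing pass rates and flagging breaches. The rule names, predicates, and thresholds are all hypothetical, not Collibra's actual rule syntax.

```python
# Each rule: (name, row-level predicate, minimum acceptable pass rate).
rules = [
    ("id_not_null",     lambda r: r["id"] is not None,  1.00),
    ("amount_positive",
     lambda r: r["amount"] is not None and r["amount"] > 0, 0.90),
]

# Illustrative batch of rows to monitor.
rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},
    {"id": 3, "amount": 7.5},
    {"id": 4, "amount": 3.0},
]

def monitor(rows, rules):
    # Evaluate every rule over the batch and flag threshold breaches.
    results = {}
    for name, pred, min_rate in rules:
        rate = sum(1 for r in rows if pred(r)) / len(rows)
        results[name] = {"pass_rate": rate, "breached": rate < min_rate}
    return results

report = monitor(rows, rules)
print(report)
# id_not_null passes at 100%; amount_positive passes at 75% and breaches
# its 90% threshold, which is what would raise an alert.
```

In Collibra DQ the predicates would be expressed as SQL-based rules against a connected source and the breach would trigger the platform's alerting, but the pass-rate-versus-threshold shape is the core of the monitoring task.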
Posted 3 weeks ago
7.0 years
0 Lacs
india
On-site
About Company
Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently.
About The Role
We are seeking an experienced Senior Informatica MDM Engineer (Administrator) — Multidomain / Customer 360 to lead and support an enterprise-level data transformation program focused on master data management, data governance, and analytics enablement. This is a hands-on role that combines advanced Informatica MDM development (match/merge, survivorship, data modeling) with platform administration (installation, upgrades, performance tuning, security, and cloud integration). The successful candidate will help define and drive MDM best practices, contribute to platform architecture and operational runbooks, and collaborate closely with data engineering, governance, and business teams to deliver stable, scalable, and secure MDM solutions integrated with modern cloud platforms and governance tools.
Key Responsibilities
Development
Design and implement MDM solutions using Informatica MDM (Multidomain/Customer 360).
Develop match/merge rules, cleansing strategies, hierarchy management, and survivorship logic.
Build data integrations with downstream apps, APIs, ETL (PowerCenter, IDMC), and cloud platforms (Azure, Databricks).
Support metadata management, lineage tracking, and integration with governance tools (Collibra, Purview).
Contribute to enterprise data catalog and marketplace initiatives.
Administration
Install, configure, and upgrade Informatica MDM/IDMC environments.
Manage user provisioning, security roles, RBAC, and access policies.
Monitor system performance, logs, and job executions; ensure uptime and SLA adherence.
Troubleshoot platform, connectivity, and performance issues.
Implement backup, recovery, and patch management strategies.
Collaborate with infrastructure/cloud teams on environment scaling, optimization, and DevOps automation.
Required Skills & Experience
7+ years of hands-on experience in Informatica MDM (Multidomain/Customer 360).
Strong experience with IDMC and PowerCenter for MDM and ETL operations.
Proven expertise in MDM administration (installation, upgrades, patching, role-based access, monitoring).
Strong data modeling, quality, match/merge, and hierarchy management experience.
Working knowledge of Azure Data Services, Databricks, and Azure SQL.
Familiarity with Collibra, Purview, or other metadata/catalog tools.
Strong SQL and troubleshooting/performance tuning skills.
Experience working in Agile environments (Azure DevOps, JIRA).
Nice to Have
Exposure to tools like OneTrust, PoolParty, SAP BO, Power BI.
Knowledge of PII detection, policy enforcement, or policies-as-code.
Familiarity with self-service analytics and data marketplace concepts.
Soft Skills
Strong analytical and problem-solving skills.
Excellent communication and documentation ability.
Ability to balance development with admin responsibilities.
Strong understanding of data governance, compliance, and security.
Skills: API-based MDM integration, data governance frameworks, RBAC, Informatica PowerCenter, match/merge, Microsoft Purview, performance tuning, Azure Data Services, SQL, data governance, Informatica MDM, API integration, PowerCenter, ETL, IDMC, Azure DevOps, Databricks, survivorship, enterprise data marketplace, Azure SQL, Collibra, hierarchy management, troubleshooting, data integration, data quality, master data management, data modeling, Azure, administrator, Informatica administration
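Match/merge and survivorship, the core MDM development skills in this posting, can be sketched in a few lines. This is a simplified illustration, not Informatica's actual matching engine: records match on a normalized email key, and survivorship takes the most recently updated non-null value per attribute to build a "golden record". All records, fields, and rules are invented.

```python
from collections import defaultdict

# Candidate records from two hypothetical source systems.
records = [
    {"email": "J.Doe@Example.com", "phone": None,
     "city": "Pune",   "updated": "2024-01-10"},
    {"email": "j.doe@example.com", "phone": "555-0100",
     "city": None,     "updated": "2024-03-02"},
    {"email": "a.k@example.com",   "phone": "555-0101",
     "city": "Mumbai", "updated": "2023-11-20"},
]

def match_key(rec):
    # Match rule: case-insensitive email.
    return rec["email"].strip().lower()

def survive(group):
    # Survivorship rule: most recent non-null value wins per attribute.
    golden = {}
    for rec in sorted(group, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value is not None:
                golden[field] = value
    golden["email"] = match_key(golden)
    return golden

groups = defaultdict(list)
for rec in records:
    groups[match_key(rec)].append(rec)

golden_records = [survive(g) for g in groups.values()]
print(len(golden_records))  # 2 golden records from 3 source rows
```

Note how the merged j.doe record keeps the newer phone number while retaining the older city, because the newer record had no city value; that "fill gaps, prefer recent" behavior is what survivorship rules formalize.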
Posted 3 weeks ago