
4985 Data Governance Jobs - Page 25

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

18.0 - 25.0 years

45 - 60 Lacs

ahmedabad

Hybrid

Associate Director, Data Architecture – Lead enterprise data strategy, governance, cloud platforms (Snowflake/AWS), AI/ML, CPG/FMCG analytics, team & vendor management.

Posted 1 week ago

Apply

1.0 - 6.0 years

2 - 5 Lacs

hyderabad

Hybrid

Qualifications and Experience: The successful candidate will meet the following criteria:
• 1-2 years' experience working in an enterprise CRM system; working knowledge/experience of Salesforce would be an asset
• Experience with or knowledge of tools such as LinkedIn, Factiva, Hoovers, etc.
• Experience in conducting secondary research (e.g., markets, companies, industries)
• Excellent oral and written communication skills
• Attention to detail and the ability to be a self-starter
• Ability to collaborate with culturally diverse offshore teams in different time zones
• Working knowledge of data quality management, data entry improvement, and user requirements
• Demonstrated ability to work effectively in cross-functional, virtual teams
• Process oriented, able to work with a high degree of detail, and committed to high quality standards
• Ability to assist in the development and implementation of policies, standards, and procedures
• Demonstrated PC skills: Microsoft Office (Excel, Word, Access) and querying tools such as SQL
• Strong analytical, conceptual, and problem-solving abilities
• Ability to present ideas in user-friendly language
• Excellent organizational and time-management skills
• Ability to prioritize and execute tasks in a high-pressure, fast-paced environment
• Knowledge of CASL (the Canadian anti-spam legislation) and consent-related processes
• Experience with Tableau is preferred, particularly in measuring data synchronization and working with large data sets
• Experience managing marketing campaigns and handling consent-related processes is desirable
• Proficiency in French is an advantage
• Experience in project coordination will be an asset
• Experience with Generative AI (Gen AI) technologies would be an added advantage

Must-have Requirements:
• Working knowledge of Salesforce CRM for managing customer relationships and data
• Experience in conducting secondary research
• Understanding of data governance principles and best practices
• Advanced proficiency in Microsoft Excel, including complex formulas and data analysis tools
• Foundational knowledge of SQL for querying and managing relational databases
• Awareness of data quality frameworks and techniques to ensure accurate and reliable information

Value-added Requirements:
• Familiarity with CASL (Canada's Anti-Spam Legislation) and its application in business communications
• Experience creating interactive dashboards and reports using Tableau
• Proficiency in developing process flows and diagrams using Microsoft Visio
• Demonstrated ability to support project coordination activities across cross-functional teams
• Exposure to Generative AI (Gen AI) technologies and their business applications

The job description is subject to change based on business/project requirements.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

pune

Work from Office

Key Responsibilities:
- Implement and migrate tagging using Adobe Launch / AEP Web SDK with XDM schema alignment and Adobe Edge Network integration.
- Audit, validate, and maintain tagging consistency across Dev/QA/Prod.
- Set up campaign attribution (UTM/CID), visitor ID stitching (ECID/hashed IDs), and mapping to Analytics & CJA.
- Configure datasets, metrics, and segments for cross-channel analysis.
- Integrate and QA Adobe Target A/B testing and personalization, ensuring A4T reporting accuracy.
- Establish governance for naming, version control, change tracking, and performance optimization.
- Build and maintain Tableau dashboards with stakeholder-defined KPIs.
- Develop and validate SQL queries combining clickstream and non-clickstream datasets in Snowflake (see the sketch below).
- Use LogRocket for behavioral insights, heatmaps, and UX improvement.
- Provide training, client consultation, and ongoing Adobe platform enablement.

Qualifications:
- Proven expertise in Adobe Launch, AEP Web SDK, Adobe Analytics, Adobe Target, and CJA.
- Strong skills in Tableau, SQL (Snowflake), and data governance.
- Experience with personalization strategy, tag performance optimization, and cross-team collaboration.
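The Snowflake item above is concrete enough that a short illustration may help. A minimal sketch, assuming the snowflake-connector-python package and hypothetical table and column names (web_clickstream, crm_orders, hashed_customer_id); it is not taken from the posting itself:

```python
# Minimal sketch: joining clickstream and CRM (non-clickstream) data in Snowflake.
# Table and column names below are hypothetical.
import snowflake.connector

QUERY = """
SELECT c.ecid,
       c.campaign_cid,
       COUNT(DISTINCT c.visit_id) AS visits,
       SUM(o.order_amount)        AS revenue
FROM   web_clickstream c
LEFT JOIN crm_orders o
       ON o.hashed_customer_id = c.hashed_customer_id
WHERE  c.event_date >= DATEADD(day, -30, CURRENT_DATE)
GROUP BY c.ecid, c.campaign_cid
ORDER BY revenue DESC NULLS LAST
"""

def run_attribution_query(conn_params: dict):
    """Run the join and return result rows; conn_params holds account/user/etc."""
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        cur.execute(QUERY)
        return cur.fetchall()
    finally:
        conn.close()
```

The same join could just as well be materialized as a Snowflake view and surfaced in Tableau or CJA; the connector route is shown only because the posting also asks for scripting skills.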

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

bengaluru

Work from Office

At Infoblox, every breakthrough begins with a bold "what if." What if your ideas could ignite global innovation? What if your curiosity could redefine the future? We invite you to step into the next exciting chapter of your career journey. Bring your creativity, drive, and daring spirit, and feel what it's like to thrive on a team big enough to make an impact, yet small enough to make a difference. Our cloud-first networking and security solutions already protect 70% of the Fortune 500, and we're looking for creative thinkers ready to push that influence even further. Join us and discover how far your bold "what if" can take the world, your community, and your career.

Here, how we empower our people is extraordinary: Glassdoor Best Places to Work 2025, Great Place to Work-Certified in five countries, and Cigna Healthy Workforce honors three years running. What we build is world-class: recognized as CybersecAsia's Best in Critical Infrastructure 2024, evidence that when first-class technology meets empowered talent, remarkable careers take shape. So, what if the next big idea, and the next great career story, comes from you? Become the force that turns every "what if" into "what's next." In a world where you can be anything, Be Infoblox.

BI Engineer
We have an opportunity for a BI Engineer to join our IT Data & Analytics team in Bengaluru, India, reporting to the manager of Data & Analytics. In this role, you will work closely with various business functions and create insightful dashboards to help them navigate their day-to-day activities, along with assisting them in their long-term vision. You will be part of a dynamic and enthusiastic team that has implemented an in-house data lake system to cater to our reporting needs. You will also be part of a transformational journey as we embark on upgrading our systems to be AI-ready, and you will get to experience new systems and processes along this transformation journey.

Be a Contributor: What You'll Do
- Architect complex, scalable Tableau dashboards and visualizations that align with business goals and KPIs
- Design and implement robust data models, including complex joins, LOD expressions, and calculated fields for deep analytics
- Mentor junior developers, conduct code reviews, and establish best practices for Tableau development within the team
- Act as a liaison between business units and IT, translating business requirements into technical solutions
- Optimize dashboard performance through query tuning, extract strategies, and efficient data source design
- Ensure data accuracy, consistency, and compliance with organizational data governance policies
- Stay updated with Tableau's latest features and BI trends, and proactively suggest improvements to existing solutions

Be Prepared: What You Bring
- 2-5 years of experience in business intelligence
- Proficiency in Tableau or similar BI reporting tools and the ability to convert a business requirement into insightful dashboards
- Proficiency in SQL writing and knowledge of how cloud-hosted databases operate
- Good hands-on experience with data lake and data warehouse concepts
- Working experience with cloud infrastructure such as AWS, Azure, etc.
- Technical knowledge of ETL flows and marketing, sales, lead-to-order, or order-to-cash functional flows is a bonus
- Bachelor's degree in engineering preferred

Be Successful: Your Path
- First 90 Days: Immerse in our culture, connect with mentors, and map the systems and stakeholders that rely on your work.
- Six Months: Deliver a signature win: ship a feature, close a marquee deal, launch a campaign, or roll out a game-changing process.
- One Year: Own your domain, mentor the next newcomer, and steer our roadmap with data-driven ideas.

Belong: Your Community
Our culture thrives on inclusion, rewarding the bold ideas, curiosity, and creativity that move us forward. In a community where every voice counts, continuous learning is the norm. So, whether you code, create, sell, or care for customers, you'll grow and belong here.

Be Rewarded: Benefits That Help You Grow, Thrive, Belong
- Comprehensive health coverage, generous PTO, and flexible work options
- Learning opportunities, career-mobility programs, and leadership workshops
- Sixteen paid volunteer hours each year, global employee resource groups, and a No Jerks policy that keeps collaboration healthy
- Modern offices with EV charging, healthy snacks (and the occasional cupcake), plus hackathons, game nights, and culture celebrations
- Charitable Giving Program supported by Company Match
- We practice pay transparency and reward performance; offers reflect role location, internal equity, experience, skills, education, and certifications

Ready to Be the Difference?
Infoblox is an Affirmative Action and Equal Opportunity Employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, national origin, genetic information, age, disability, veteran status, or any other legally protected basis. #LI-RH1 #LI-Hybrid

Posted 1 week ago

Apply

8.0 - 10.0 years

16 - 18 Lacs

hyderabad

Work from Office

This role involves developing robust data pipelines, implementing scalable data architectures, and establishing data governance frameworks to support data-driven decision making across the organization.

Essential functions:
- Design and implement efficient, scalable data pipelines
- Optimize data storage and retrieval processes
- Ensure data quality and consistency across systems (see the sketch below)
- Collaborate with cross-functional teams to understand data requirements
- Implement data security and compliance measures

Qualifications:
- 8-10 years of experience in data engineering or a related field
- Expert knowledge of SQL
- Mid-level knowledge of Python (expert preferred)
- Experience with cloud platforms, particularly Google Cloud
- Strong understanding of data modeling and warehouse concepts
- Experience with ETL pipeline design and implementation
- Experience with big data technologies (Hadoop, Spark)

Would be a plus:
- Knowledge of data governance and security practices
- Experience with real-time data processing
- Familiarity with BI tools and reporting platforms
- Strong background in performance optimization and tuning
- Advanced debugging and troubleshooting skills

We offer:
- Opportunity to work on bleeding-edge projects
- Work with a highly motivated and dedicated team
- Competitive salary
- Flexible schedule
- Benefits package: medical insurance, sports
- Corporate social events
- Professional development opportunities
- Well-equipped office
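As one illustration of the "data quality and consistency across systems" responsibility, here is a minimal sketch of a row-count reconciliation on Google Cloud. It assumes the google-cloud-bigquery client library, application-default credentials, and hypothetical dataset/table names (raw_zone.orders, warehouse.fct_orders); it is not part of the posting:

```python
# Minimal sketch: reconciling row counts between a source and a target table in
# BigQuery, as one example of a cross-system data quality check.
from google.cloud import bigquery

def count_rows(client: bigquery.Client, table: str, date_col: str, day: str) -> int:
    sql = f"SELECT COUNT(*) AS n FROM `{table}` WHERE DATE({date_col}) = @day"
    job = client.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", day)]
        ),
    )
    return list(job.result())[0]["n"]

def reconcile(day: str = "2024-01-31") -> None:
    client = bigquery.Client()
    src = count_rows(client, "raw_zone.orders", "ingested_at", day)      # hypothetical
    tgt = count_rows(client, "warehouse.fct_orders", "order_date", day)  # hypothetical
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{day}: source={src} target={tgt} -> {status}")
```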

Posted 1 week ago

Apply

0.0 - 2.0 years

2 - 5 Lacs

chennai

Work from Office

Role Overview
We are seeking a skilled Associate Consultant - Power BI with 2 years of experience in business intelligence and data visualization. The role involves designing, developing, and maintaining Power BI dashboards and reports to support business decision-making. You will collaborate with stakeholders to gather requirements, transform data into meaningful insights, and ensure data accuracy, security, and performance.

Key Responsibilities:
- Design, develop, and maintain Power BI dashboards, reports, and datasets.
- Collaborate with business users to understand requirements and translate them into BI solutions.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources (SQL Server, Excel, cloud, APIs, etc.).
- Ensure data quality, governance, and security in Power BI environments.
- Optimize dashboard performance and improve visualization best practices.
- Provide support, troubleshoot issues, and train end users on Power BI usage.
- Stay updated with the latest features of Power BI and related Microsoft technologies.

Key Performance Indicators (KPIs):
- Timely delivery of dashboards and reports as per business needs.
- Accuracy and reliability of insights delivered.
- End-user adoption and satisfaction scores.
- Dashboard performance optimization (load speed, refresh success).
- Compliance with data governance and security policies.

Required Qualifications & Skills:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 2 years of hands-on experience with Power BI in a professional environment.
- Strong skills in DAX, Power Query (M), and data modeling.
- Experience with relational databases (SQL Server, Oracle, MySQL, etc.) and writing SQL queries.
- Knowledge of ETL processes and integration with data sources.
- Understanding of data visualization principles and storytelling with data.
- Familiarity with Azure Data Services (Data Factory, Synapse, etc.) is a plus.
- Strong communication and problem-solving skills.

Qualification: Graduate
No. of Job Positions: 1
Total Experience: 0.6-2 years
Domain Experience: Python

Posted 1 week ago

Apply

12.0 - 18.0 years

14 - 24 Lacs

bengaluru

Hybrid

Hiring a Data Governance Policy Compliance Manager with a strong policy governance background in banking. Will assess gaps, define controls, and align with audit, risk, GRC, and tech teams for policy compliance. Required candidate profile: experienced in data governance, policy compliance, and banking; strong in audits, controls, and cross-functional collaboration; excellent communication and stakeholder management skills.

Posted 1 week ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

bengaluru

Hybrid

Seeking a Data Governance Consultant (Technology) to define and implement data controls in alignment with policies. Must collaborate with EADA, GRC, and ISG teams and understand technical governance requirements. Required candidate profile: an experienced consultant with deep knowledge of data governance, policy alignment, and technical controls, and strong collaboration skills with EADA, GRC, and ISG stakeholders.

Posted 1 week ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

bengaluru

Hybrid

Looking for a skilled Data Governance Consultant to support commercial, retail, and treasury units. Will capture business data needs and implement them in real time within the organization's data framework. Required candidate profile: a data governance expert with business-facing experience in retail, commercial, and treasury, skilled in data stewardship, real-time implementation, and aligning with enterprise data frameworks.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

bengaluru

Work from Office

We are looking for a skilled Technical Support professional with 3-5 years of experience to join our team as an Analyst - L3. The ideal candidate will have a strong background in technical support and excellent problem-solving skills.

Roles and Responsibilities:
- Provide technical support to customers via phone, email, or chat.
- Troubleshoot and resolve complex technical issues efficiently.
- Collaborate with internal teams to resolve customer complaints and concerns.
- Develop and maintain technical documentation and knowledge base articles.
- Analyze and report on customer feedback and suggest process improvements.
- Participate in training and development programs to enhance technical skills.

Job Requirements:
- Strong technical skills and knowledge of IT services and consulting.
- Excellent communication and problem-solving skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong analytical and troubleshooting skills.
- Experience with technical support tools and software.
- Ability to collaborate effectively with cross-functional teams.
- Mandatory skills include technical support and title analyst.

Posted 1 week ago

Apply

10.0 - 17.0 years

25 - 40 Lacs

mumbai suburban

Work from Office

An Opportunity to Work with One of India's Leading Credit Card Tech Innovators - BOBCARD (A Bank of Baroda Subsidiary)

Education: BE/B.Tech, BCA/MCA, BSc/MSc in Computer Science, IT, or a related field.
Experience: 11 to 20 years
Location: Goregaon, Mumbai (5 days from office)
Domain: Fintech/BFSI/NBFC (mandatory)

Role & responsibilities:
- Leadership: Lead and mentor a team of data scientists and data engineers, fostering a culture of innovation, collaboration, and continuous improvement.
- Strategic Direction: Develop and execute a comprehensive data analytics strategy aligned with the organization's goals, ensuring that data-driven insights are integrated into decision-making processes.
- Data Governance: Establish and oversee data governance policies and practices to ensure data quality, integrity, and security across the organization.
- Analytical Solutions: Design and implement advanced analytical solutions, including predictive and prescriptive analytics, to drive business performance and identify new opportunities.
- Collaboration: Work closely with business leaders and stakeholders to understand their analytical needs and deliver actionable insights that support strategic initiatives.
- Technology Evaluation: Stay updated on the latest trends and technologies in data analytics and business intelligence, evaluating and recommending tools and platforms that enhance analytical capabilities.
- Performance Metrics: Define key performance indicators (KPIs) and metrics to measure the success of data analytics initiatives and report findings to senior leadership.
- Budget Management: Manage the data analytics budget, ensuring efficient allocation of resources and alignment with organizational priorities.
- Stakeholder Communication: Present analytical findings and insights to executive leadership and stakeholders, effectively communicating the value of data-driven decision-making.

Desired candidate profile:
- Proven experience in developing and executing data analytics strategies that drive business value.
- Strong knowledge of data analytics tools and technologies (e.g., Tableau, Power BI, SQL, Python, R) and data management platforms (e.g., Hadoop, Snowflake).
- Experience with data governance frameworks and best practices.
- Excellent analytical and problem-solving skills, with the ability to interpret complex data and present actionable insights.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical stakeholders.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

pune

Work from Office

Position: Team Lead - MDM
Location: Pune (Kharadi)
Company: Global MNC
Shift time: 4 pm to 1 am (hybrid), with pick-up and drop facility

Responsibilities for master data management:
- Assist with data mappings, data modeling, data profiling, query design, data flow design, data strategy, and data governance between multiple databases across multiple platforms (see the profiling sketch below)
- Analyze main business processes and requirements and translate them into IT solutions
- Analyze data quality
- Provide overall project support
- Develop a master data management strategy for the site
- Identify, support, or manage process improvement initiatives
- Audit data and facilitate resolution
- Own department master data documentation
- Provide SAP system training to new and existing staff
- Own non-conformances related to support system issues

Qualifications for master data management:
- Develop, administer, and maintain the Master Data Management environment (UOs Master Data Repository) by providing technical solutions in support of business objectives and ongoing operations
- Provide technical expertise with the Master Data Repository
- Comprehensive understanding of Material, Customer, and Vendor domains, business rules, and workflows within an MDM application
- Experience designing and evolving MDM architecture and solutions for large enterprises
- You should love working with multiple technologies and be a technology enthusiast

Interested candidates can share their resume at dhanashree.chitre@weareams.com
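For the data profiling and data quality items above, a minimal pandas sketch of the kind of profile a master data analyst might produce; the extract file and key column (material_number) are hypothetical and not taken from the posting:

```python
# Minimal data profiling sketch for a master data extract: per-column null rates,
# distinct counts, and duplicate key counts. File and column names are hypothetical.
import pandas as pd

def profile(df: pd.DataFrame, key: str) -> pd.DataFrame:
    """Return a simple profile table, one row per column."""
    report = pd.DataFrame({
        "null_rate": df.isna().mean().round(4),
        "distinct_values": df.nunique(dropna=True),
        "sample_value": df.apply(
            lambda s: s.dropna().iloc[0] if s.notna().any() else None
        ),
    })
    print(f"rows={len(df)}, duplicate '{key}' values={df[key].duplicated().sum()}")
    return report

if __name__ == "__main__":
    materials = pd.read_csv("material_master_extract.csv")  # hypothetical extract
    print(profile(materials, key="material_number"))
```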

Posted 1 week ago

Apply

3.0 - 8.0 years

25 - 35 Lacs

pune, gurugram, bengaluru

Hybrid

Job Qualifications - Data Catalog Specialist

Required Skills & Experience:
- Hands-on experience with the Collibra Data Intelligence Platform, including:
  - Metadata ingestion
  - Data lineage stitching
  - Workflow configuration and customization
- Strong understanding of metadata management and data governance principles
- Experience working with the following data sources/tools:
  - Teradata (BTEQ, MLOAD)
  - Tableau
  - QlikView
  - IBM DataStage
  - Informatica
- Ability to interpret and map technical metadata from ETL tools, BI platforms, and databases into Collibra
- Familiarity with data lineage concepts, including horizontal lineage across systems
- Proficiency in SQL and scripting for metadata extraction and transformation
- Excellent communication skills to collaborate with data stewards, engineers, and business stakeholders

Preferred Qualifications:
- Experience with Collibra APIs or connectors
- Knowledge of data governance frameworks (e.g., DAMA DMBOK)
- Prior experience in a regulated industry (e.g., finance, healthcare)

Posted 1 week ago

Apply

3.0 - 6.0 years

0 - 0 Lacs

pune

On-site

Role Overview:
We are looking for a skilled Azure Data Engineer to design, develop, and optimize data pipelines and solutions on the Azure cloud platform. The ideal candidate will have expertise in SQL, ETL, Azure, Python, PySpark, Databricks, and ADF, ensuring efficient data processing and integration.

Key Responsibilities:
- Design and implement ETL pipelines using Azure Data Factory (ADF), Databricks, and PySpark (see the sketch below).
- Develop and optimize SQL queries for data extraction, transformation, and loading.
- Work with Azure cloud services such as Azure Data Lake, Azure Synapse, and Azure Blob Storage.
- Build scalable data solutions using Python and PySpark for big data processing.
- Ensure data integrity, security, and governance across all pipelines.
- Collaborate with data analysts, scientists, and business teams to support data-driven decision-making.
- Monitor and troubleshoot data workflows for performance optimization.

Required Skills & Qualifications:
- Strong proficiency in SQL for data manipulation and querying.
- Hands-on experience with ETL tools and Azure Data Factory (ADF).
- Expertise in Azure cloud services for data storage and processing.
- Proficiency in Python and PySpark for data engineering tasks.
- Experience with Databricks for big data processing and analytics.
- Knowledge of data modeling, warehousing, and governance.
- Familiarity with CI/CD pipelines for data deployment.

Preferred Qualifications:
- Experience with Azure Synapse Analytics.
- Knowledge of Kafka or Event Hub for real-time data streaming.
- Certification in Microsoft Azure Data Engineering.

Keywords for Job Posting & Resume Screening:
- SQL, ETL, Azure, Python, PySpark, Databricks, ADF
- Azure Data Lake, Azure Synapse, Azure Blob Storage
- Data Engineering, Big Data, Data Pipelines
- Data Transformation, Data Governance, Data Security
- CI/CD, Data Warehousing, Cloud Computing
- Machine Learning, AI, Analytics, Business Intelligence
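A minimal sketch of the kind of PySpark transformation step such an ADF/Databricks pipeline might run; storage paths and column names are hypothetical, and cluster access to the storage account is assumed:

```python
# Read raw CSVs from Azure Data Lake, standardize types, deduplicate, and write
# a curated, partitioned Parquet output. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

curated = (
    raw.withColumn("order_amount", F.col("order_amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .dropDuplicates(["order_id"])
       .filter(F.col("order_amount").isNotNull())
)

(
    curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")
)
```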

Posted 1 week ago

Apply

7.0 - 12.0 years

10 - 20 Lacs

navi mumbai

Work from Office

Mizuho Global Services India Pvt. Ltd.

Mizuho Global Services Pvt Ltd (MGS) is a subsidiary of Mizuho Bank, Ltd, one of the largest banks (so-called "mega banks") of Japan. MGS was established in 2020 as part of Mizuho's long-term strategy of creating a captive global processing center for remotely handling banking and IT-related operations of Mizuho Bank's domestic and overseas offices and Mizuho's group companies across the globe. At Mizuho we are committed to a culture that is driven by ethical values and supports diversity in all its forms for its talent pool. The direction of MGS's development is paved by its three key pillars, Mutual Respect, Discipline, and Transparency, which are set as the baseline of every process and operation carried out at MGS.

What's in it for you?
- Immense exposure and learning
- Excellent career growth
- The company of highly passionate leaders and mentors
- The ability to build things from scratch

Know more about MGS: https://www.mizuhogroup.com/asia-pacific/mizuho-global-services

Position: AVP - Business Analyst - Data Governance Domain
Shift: General shift, work from office

Key Responsibilities:
- Data Source Visualization/Analysis: Visualize and analyze the data source systems, files, and field information for creating internal/external reports.
- Regulatory Reporting Compliance: Ensure accurate and timely submission of regulatory reports (e.g., RBI, MAS, HKMA) in line with local and cross-border compliance requirements.
- Data Governance Execution: Implement and maintain data governance frameworks, including data ownership, lineage, quality controls, and stewardship across corporate banking platforms.
- Business Analysis & Documentation: Gather and document business requirements for data sourcing, transformation, and reporting. Prepare BRDs, FRDs, and data dictionaries aligned with APAC regulatory standards.
- Cross-Regional Coordination: Collaborate with India operations and APAC stakeholders (e.g., Singapore, Hong Kong, Australia) to align data governance and reporting practices.
- Data Quality & Controls: Define and monitor data quality rules, perform root cause analysis on data issues, and drive remediation efforts across systems.
- Audit & Regulatory Readiness: Support internal and external audits by providing traceability, control documentation, and evidence of compliance with data policies.
- Tool Enablement & Reporting Automation: Leverage tools such as Power BI, SQL, Tableau, and Excel to support data governance and reporting automation.
- Stakeholder Engagement: Act as a liaison between compliance, finance, IT, and operations teams to ensure alignment on data and reporting objectives.

Role-Specific Duties (Assistant Vice President):
- Oversee project execution and coordinate between teams.
- Develop detailed analyses and maintain organized documentation.
- Plan and promote user tasks from a BA's perspective.

Required Skills/Experience/Personal Attributes:
1. Banking Experience: Experience working in core banking areas / back-office operations for Trade/Payments/Lending/CASA as a bank employee or consultant.
2. Experience in corporate banking, with a focus on regulatory reporting and data governance in India and APAC. (must have)
3. Strong understanding of regional regulatory frameworks (e.g., RBI, MAS 610, HKMA returns, BCBS 239).
4. Proficiency in SQL queries, MS Access, Excel macros, UiPath (RPA), and data visualization tools (e.g., Power BI, Tableau).
5. Familiarity with data governance platforms and metadata management tools. (must have)
6. Excellent communication and stakeholder management skills across geographies.
7. Business Analysis Skills: Strong experience in creating Business Requirements Documents (BRD), Functional Requirements Documents (FRD), and UAT plans/execution is mandatory; experience working in Agile and Waterfall environments. (must have)
8. An acute attention to detail and commitment to producing high-quality, precise, and extensive requirement documentation.
9. Understanding of downstream data flows for report creation (for AVP, strong expertise is mandatory).
10. Experience in the utilization of AI (expected to consider how to use AI technology and chatbots such as ChatGPT).
11. Experience in project implementation from the user side (large projects preferred).
12. Travel Flexibility: Willingness to travel within the APAC region to meet Mizuho teams and gather requirements.

Qualification: A Master's degree, preferably in Science, Finance, Business, or IT, is preferred.

Experience:
- Assistant Vice President (total experience 10-15 years): 5+ years in a business analyst role, with a minimum of 5 years' relevant experience in information systems, data governance, corporate banking, regulatory reporting, and data visualization and analysis.
- Senior Officer (total experience 3-8 years): 3+ years in a business analyst role, with a minimum of 3 years' relevant experience in information systems, data governance, corporate banking, regulatory reporting, and data visualization and analysis.

Address: Mizuho Global Services India Pvt. Ltd, 11th Floor, Q2 Building, Aurum Q Park, Gen 4/1, TTC, Thane Belapur Road, MIDC Industrial Area, Ghansoli, Navi Mumbai - 400710.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You will be responsible for data engineering, ETL, and Snowflake development in Hyderabad (work from office). Your main tasks will include SQL scripting, performance tuning, working with Matillion ETL, and utilizing cloud platforms such as AWS, Azure, or GCP. It is essential to have proficiency in Python or scripting languages, experience with API integrations, and knowledge of data governance. Snowflake certifications (SnowPro Core/Advanced) are a plus.

With a minimum of 5 years of experience in data engineering, ETL, and Snowflake development, you should possess strong expertise in Snowflake, including SQL scripting, performance tuning, and data warehousing concepts. Hands-on experience with Matillion ETL for creating and managing ETL jobs is required. You should also demonstrate a solid understanding of cloud platforms and cloud-based data architectures.

Furthermore, proficiency in SQL, Python, or other scripting languages for automation and data transformation is expected. Experience with API integrations and data ingestion frameworks, and knowledge of data governance, security policies, and access control within Snowflake environments, are important aspects of this role. Excellent communication skills are necessary for engaging with both business and technical stakeholders. As a self-motivated professional, you should be capable of working independently and delivering projects on time.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

kochi, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself and a better working world for all.

EY is seeking a highly skilled Manager to join our team, specializing in the end-to-end implementation of Informatica Master Data Management (MDM) with knowledge of Informatica Data Quality (IDQ). The ideal candidate will be responsible for the technical aspects of implementing and managing Informatica MDM and IDQ solutions to support data management and quality initiatives, with the ability to handle all required activities with minimal support.

Responsibilities:
- Lead the end-to-end implementation of Informatica MDM and IDQ, including requirement gathering, solution design, configuration, and maintenance to meet business requirements.
- Work closely with data governance teams to integrate MDM and IDQ with other data management tools and systems.
- Utilize Informatica MDM to create a centralized, consistent view of master data entities, ensuring alignment with organizational policies and standards.
- Develop and maintain documentation related to system configuration, mapping, processes, and service records.
- Define policies, develop and establish MDM processes, and document procedures for exception handling.
- Evaluate and ensure infrastructure readiness within client environments for the proposed solution.
- Evaluate and propose solutions with appropriate MDM hub designs to meet clients' requirements.
- Monitor daily job execution, schedule jobs, and collaborate with Informatica support teams for installation, upgrades, and troubleshooting.
- Perform data profiling, cleansing, and standardization activities within Informatica IDQ.
- Define and implement data quality rules, business glossaries, and data quality profiling, and publish data quality results dashboards.
- Train end users and technical staff on the effective use of Informatica MDM and IDQ tools.
- Support business development efforts, including proposal support, proposal response development, and client presentations.
- Develop comprehensive training materials and conduct training sessions for client employees on the effective use of the implemented tools.

Must-have skills:
- 8+ years of relevant experience in Informatica MDM and IDQ.
- Strong understanding of data management, data quality, and governance principles.
- Proven experience in implementing data quality rules, business glossaries, and data quality profiling, and publishing data quality results dashboards.
- Ability to articulate and propose technology solutions to meet business requirements.
- Ability to handle all required activities with minimal support.

Good to have:
- Experience in the Oil and Gas, Power and Utility, or Manufacturing industries.
- Good understanding of data management and governance frameworks and leading practices such as DAMA, CMMI, or NDMO.
- Proficiency with industry-standard data sources and systems (PI, SAP, LIMS, SPF, Maximo, SharePoint, SCADA, HYSYS).
- Familiarity with related data governance and management tools and technologies such as EDC, AXON, Microsoft Purview, and Collibra.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

The Sr. Group Manager of Emerging Technologies and Analytics position at Acentra Health is a challenging and exciting opportunity for a candidate with a strong background in data analytics, statistics, and engineering. As the Leader of Emerging Technologies and Analytics, you will play a crucial role in establishing enterprise-level AI and ML programs that support the company's growth and expansion goals.

In this role, you will be required to develop and implement innovative solutions using AI/ML and automation, collaborate with stakeholders, and create value aligned with business objectives. Your responsibilities will include overseeing the research, development, and deployment of AI/ML models, integrating AI/ML solutions into products and services, and driving the design and implementation of cloud-native solutions. You will lead a team of machine learning engineers, business intelligence analysts, and data scientists, ensuring their professional growth and development. Additionally, you will work closely with executives and cross-functional teams to understand their data needs and provide timely insights to support decision-making.

To be successful in this role, you should have 10+ years of technology leadership experience, a strong programming background in Python, and experience with cloud-based infrastructure and AI/ML platforms like AWS, Azure, or Google Cloud. A Bachelor's or advanced degree in Computer Science, Data Science, Data Analytics, Statistics, Mathematics, or Engineering is required, with a master's or Ph.D. preferred. You should also have a proven track record of developing and implementing machine learning models in real-world applications, as well as experience with natural language processing techniques and best practices.

Furthermore, you should possess strong project management and organizational skills, with the ability to prioritize and manage multiple initiatives simultaneously using Agile methodologies. Your strategic mindset should be aligned with analytics initiatives and overall business goals, and you should have exceptional problem-solving skills to drive business value using data effectively. Excellent communication and presentation skills are essential to explain technical concepts to non-technical stakeholders.

If you are looking for a role that offers significant opportunities for growth, career development, and professional advancement in the field of emerging technologies and analytics, join our team at Acentra Health to help shape the future of healthcare technology.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Technical Guidance Specialist in Master Data Management (MDM), you will play a crucial role in providing technical expertise and guidance to engineers and clients throughout the MDM implementation process. Your responsibilities will include assisting in data modelling, ensuring data quality, and establishing data governance practices to support effective MDM operations.

Collaboration is key in this role, as you will work closely with stakeholders to understand their data management requirements and deliver solutions that are in line with business objectives. Your contributions will be instrumental in driving successful MDM projects and ensuring alignment with organizational goals.

Training and support are essential aspects of this position. You will be responsible for conducting training sessions for engineers and clients, equipping them with the necessary knowledge and tools to proficiently utilize the MDM system. Ongoing support will also be provided to address any queries or challenges that may arise post-implementation.

Developing and implementing best practices for MDM processes will be a core part of your role. This includes establishing data quality standards, data governance policies, and data integration strategies to optimize MDM operations and enhance data management efficiency.

To excel in this position, you should possess a minimum of 5 years of hands-on experience in MDM implementation, with a specific focus on Profisee. Proficiency in data modelling, data quality assurance, and data governance principles is crucial. A strong understanding of Profisee MDM software is highly desirable to effectively navigate and leverage its capabilities.

Effective communication skills are essential for this role. You must be able to articulate technical concepts clearly and concisely to non-technical stakeholders. Your ability to communicate effectively will facilitate collaboration and ensure alignment between technical solutions and business requirements.

As a problem-solver, you should have strong analytical skills and the capability to troubleshoot and resolve issues promptly and efficiently. Your problem-solving abilities will be instrumental in addressing challenges that may arise during MDM implementation and operation.

Ideally, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree in a relevant discipline would be advantageous and demonstrate a deeper understanding of complex technical concepts and practices.

In summary, as a Technical Guidance Specialist in MDM, you will play a pivotal role in ensuring the successful implementation and operation of MDM solutions. Your expertise, collaboration skills, and commitment to best practices will contribute significantly to the effectiveness and efficiency of data management processes within the organization.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The position available in our Digital Marketing team is a non-supervisory role focused on implementing the data strategy for USP's Marketing Automation Platform (MAP) and ensuring data quality alignment between USP's Salesforce CRM and the MAP. Your responsibilities will include supporting data normalization, conducting marketing data audits, and providing internal training on data management practices.

In this role at USP, you will play a crucial part in advancing USP's public health mission by enhancing access to safe medicine and promoting global health through established standards and programs. As part of our commitment to the professional development of all people managers, we regularly invest in training inclusive management styles and other competencies to foster engaged and productive work environments.

Key Responsibilities:
- Data Quality: Monitor and uphold data quality integrity between CRM and MAP, ensuring accuracy and completeness. Enhance customer and prospect information through data enrichment activities.
- Marketing Data Audits: Conduct routine audits to identify and address duplicate, incomplete, or outdated records (see the sketch below).
- Seamless Integration of Data: Collaborate with marketing, sales, and IT teams to ensure smooth data integration across various platforms.
- Data Normalization: Assist in standardizing and normalizing data by establishing validation rules, filters, and workflows.
- Data Quality Reports: Provide regular updates on data quality initiatives, highlighting key metrics and areas for improvement.
- Training: Educate internal teams on effective data management practices to minimize errors and maintain data quality.

Qualifications and Experience:
- Bachelor's degree in marketing, information systems, business, or a related field
- 5-7 years of experience in marketing operations, data management, or a related field
- Proficiency with the Salesforce CRM system and Marketing Automation Platforms (preferably Marketo)
- Familiarity with data visualization tools (e.g., Tableau), SQL, and data governance
- Strong analytical skills and proficiency in Excel or similar tools
- Excellent collaboration skills with IT, marketing, and sales teams
- Strong written and verbal communication skills

Desired Preferences:
- Proficiency in SQL or other query languages
- Understanding of building marketing models and attribution models
- Interest in improving marketing processes and workflows
- Strong organizational skills to manage multiple projects effectively

This position does not have any supervisory responsibilities. USP offers comprehensive benefits to ensure your personal and financial well-being.
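For the marketing data audit responsibility above, a minimal pandas sketch that flags duplicate and incomplete contact records; the column names and export file are hypothetical and not specified by the posting:

```python
# Minimal sketch of a routine marketing data audit: flag duplicate and incomplete
# CRM/MAP contact records. Column names and the export file are hypothetical.
import pandas as pd

REQUIRED = ["email", "first_name", "last_name", "consent_status"]

def audit_contacts(df: pd.DataFrame) -> dict:
    """Return simple data quality metrics for a contact extract."""
    df = df.copy()
    df["email"] = df["email"].str.strip().str.lower()  # normalize before matching
    duplicates = df[df.duplicated(subset="email", keep=False)]
    incomplete = df[df[REQUIRED].isna().any(axis=1)]
    return {
        "total_records": len(df),
        "duplicate_records": len(duplicates),
        "incomplete_records": len(incomplete),
        "duplicate_rate": round(len(duplicates) / max(len(df), 1), 4),
    }

if __name__ == "__main__":
    contacts = pd.read_csv("map_contacts_export.csv")  # hypothetical export file
    print(audit_contacts(contacts))
```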

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The Natural Language Query (NLQ) platform is an innovative initiative designed to transform the way users interact with data. Our platform leverages advanced natural language processing (NLP) to convert user queries in plain language into executable SQL queries, enabling seamless data retrieval and analysis without the need for SQL expertise. The NLQ platform will be powered by the Enterprise Data Catalog, ensuring comprehensive and accurate metadata definitions for all table and column information. This platform empowers different business units to build external-facing conversational BI chatbots to handle customer requests, while also significantly reducing data exploration efforts for internal data analysts by more than 90%.

Key Responsibilities:
- Provide Platform-as-a-Service offerings that are easy to consume, scalable, secure, and reliable, using open source-based solutions.
- Develop and enhance the NLQ Copilot platform, ensuring it meets the evolving needs of multi-tenant environments.
- Implement context builder algorithms by leveraging different prompt engineering techniques to generate 100% accurate SQL as per customer needs (see the sketch below).
- Collaborate with downstream clients to integrate business requirements, adding robust guardrails to prevent unauthorized query generation.
- Work closely with data scientists, engineers, and product managers to optimize the performance of the NLQ platform.
- Utilize cutting-edge NLP/LLM and machine learning techniques to improve the accuracy and efficiency of query transformations.
- Ensure the platform's scalability and reliability through rigorous testing and continuous improvement.
- Champion the adoption of open infrastructure solutions that are fit for purpose while keeping technology relevant.
- Spend 80% of the time writing code in different languages, frameworks, and technology stacks.

This is a hybrid position. Expectation of days in office will be confirmed by your hiring manager.

Basic Qualifications:
- 4+ years of experience in architecture design and development of large-scale data management platforms and data applications with simple solutions
- Bachelor's or master's degree in Computer Science or a related technical discipline required
- Extensive hands-on coding and design skills in Java/Python for the backend
- MVC (model-view-controller) for end-to-end development
- SQL/NoSQL technology; familiarity with databases such as Oracle, DB2, SQL Server, etc.
- Web services (REST/SOAP/gRPC)
- React/Angular for the front end (UI front-end experience nice to have)
- Expertise in the design and management of complex data structures and data processes
- Expertise in efficiently leveraging the power of distributed big data systems, including but not limited to Hadoop Hive, Spark, Kafka streaming, etc.
- Strong service architecture and development experience with high performance and scalability
- Strong drive for results and self-motivation, a strong learning mindset, and a good understanding of related advanced and new technologies; keeps up with technology developments in related areas of the industry that could be leveraged to enhance current architectures and build durable new ones
- Strong leadership and team player
- Strong skills in mentoring and growing junior people

Preferred Qualifications:
- Deep knowledge and hands-on experience with big data and cloud computing technologies
- Experience with LLM/GenAI tools and applications and prompt engineering
- Experience with ETL/ELT tools and applications
- Experience with Apache NiFi and Apache Spark for processing large data sets
- Experience with Elasticsearch
- Knowledge of data catalog tools
- Experience in building data pipeline development tools
- Experience with data governance and data quality tools
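As an illustration of the context builder idea described above (not the platform's actual implementation), here is a minimal sketch that assembles an NLQ prompt from catalog metadata so an LLM can draft SQL; the metadata classes and the llm_complete() call are assumptions introduced for the example:

```python
# Illustrative context builder: embed data catalog definitions plus guardrails
# into a single prompt for text-to-SQL generation. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class ColumnMeta:
    name: str
    dtype: str
    description: str

@dataclass
class TableMeta:
    name: str
    description: str
    columns: list

def build_prompt(question: str, tables: list) -> str:
    """Combine catalog metadata with instructions that constrain the model."""
    schema_lines = []
    for t in tables:
        cols = ", ".join(f"{c.name} {c.dtype} -- {c.description}" for c in t.columns)
        schema_lines.append(f"TABLE {t.name} ({t.description}): {cols}")
    schema = "\n".join(schema_lines)
    return (
        "You translate business questions into SQL.\n"
        "Use only the tables and columns listed below; if the question cannot be\n"
        "answered from them, reply UNSUPPORTED instead of guessing.\n\n"
        f"{schema}\n\nQuestion: {question}\nSQL:"
    )

# Usage (llm_complete is a placeholder for whatever model endpoint is in use):
# sql = llm_complete(build_prompt("monthly active users by region", catalog_tables))
```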

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Engineer specializing in SAP BW and Azure Data Factory (ADF), you will be responsible for leading the migration of SAP BW data to Azure. Your expertise in data integration, ETL, and cloud data platforms will be crucial in designing, implementing, and optimizing SAP BW-to-Azure migration projects. Your key responsibilities will include ensuring data integrity, scalability, and efficiency during the migration process.

You will design and implement ETL/ELT pipelines using Azure Data Factory (ADF), Synapse, and other Azure services. Additionally, you will develop and optimize data ingestion, transformation, and orchestration workflows between SAP BW and Azure. Collaborating with business and technical stakeholders, you will analyze data models, define migration strategies, and ensure compliance with data governance policies.

Troubleshooting and optimizing data movement, processing, and storage across SAP BW, Azure Data Lake, and Synapse Analytics will be part of your daily tasks. You will implement best practices for performance tuning, security, and cost optimization in Azure-based data solutions. Your role will also involve providing technical leadership in modernizing legacy SAP BW reporting and analytics by leveraging cloud-native Azure solutions. Working closely with cross-functional teams, including SAP functional teams, data architects, and DevOps engineers, you will ensure seamless integration of data solutions.

Your expertise in SAP BW data modeling, ETL, reporting, Azure Data Factory (ADF), Azure Synapse Analytics, and other Azure data services will be essential in this role. Proficiency in SQL, Python, or Spark for data processing and transformation is required. Experience with Azure Data Lake, Azure Blob Storage, and Synapse Analytics for enterprise-scale data warehousing is a must.

Preferred qualifications include experience with SAP BW/4HANA and its integration with Azure, knowledge of Databricks, Power BI, or other Azure analytics tools, and certification as an Azure Data Engineer Associate (DP-203) or in SAP BW. Experience in metadata management, data governance, and compliance in cloud environments is a plus. Your strong analytical and problem-solving skills, along with the ability to work in an agile environment, will contribute to the success of the migration projects.
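A hedged sketch of one orchestration step such a migration might automate: triggering and polling an ADF pipeline run from Python. Resource and pipeline names are hypothetical, and it assumes the azure-identity and azure-mgmt-datafactory packages with an identity that has Data Factory permissions:

```python
# Trigger an ADF pipeline run and poll until it finishes. All names are
# hypothetical placeholders; this is an illustration, not the role's actual code.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-data-platform"     # hypothetical
FACTORY_NAME = "adf-bw-migration"       # hypothetical
PIPELINE_NAME = "pl_load_bw_extracts"   # hypothetical

def run_pipeline() -> str:
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    run = client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
        parameters={"load_date": "2024-01-31"},
    )
    while True:
        status = client.pipeline_runs.get(
            RESOURCE_GROUP, FACTORY_NAME, run.run_id
        ).status
        if status not in ("Queued", "InProgress"):
            return status  # Succeeded, Failed, or Cancelled
        time.sleep(30)
```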

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

Candidates should be available to join immediately or within a maximum of 15 days.

In this role, as a Finance Data Governance consultant supporting Regulatory Reporting Automation, you will be responsible for deploying the Governance Policy and ensuring the appropriate accountability model and processes for data asset management, metadata management, data quality, and issue resolution. You will provide insight into the root-cause analysis of data quality issues and assist in the remediation of audit and regulatory feedback. Additionally, you will recommend strategic improvements to the data assessment process and make necessary enhancements to data analytical and data quality tools. Your responsibilities will also include supporting current regulatory reporting needs via existing platforms by collaborating with upstream data providers, downstream business partners, and technology teams.

To excel in this role, you should have hands-on experience within regulatory reporting frameworks, especially around liquidity reporting (FR 2052A, LCR, NSFR) and capital reporting (FR Y-14, Y-9C). Knowledge of 2052A is highly preferred as it is the MVP and is live in Oct 2024. Strong relationship and communication skills are essential, as you will partner extensively with business process owners, technology, and Enterprise Data Governance. Being self-motivated and proactive, with the ability to manage multiple assignments and projects concurrently, is crucial.

SQL skills are a strong plus, along with strong analytical and problem-solving abilities. You should also have demonstrated experience in collecting business requirements and using them to guide the implementation of technical solutions. Prior experience in defining and implementing data governance and data quality programs is preferred. Additionally, familiarity with data governance, metadata, and data lineage tools such as Collibra and MANTA is strongly preferred, although not required if you have strong SQL skills. Knowledge of Agile or SAFe project methodologies would be beneficial for this role.

Posted 1 week ago

Apply

11.0 - 15.0 years

0 Lacs

karnataka

On-site

As an AI Architect Senior Consultant in Infosys Global Consulting Practice, you will play a crucial role in leading the design and implementation of cutting-edge AI solutions for clients across various industries. Your responsibilities will include designing AI solutions such as Generative AI and Conversational AI, ensuring scalability, security, compliance, and adherence to Responsible AI guidelines. You will also architect and deploy robust AI/ML models on both cloud-based and on-premises infrastructure, optimizing for performance and efficiency in real-world applications.

Your expertise in NLP, LLMs, and Computer Vision will be instrumental in guiding the team to identify and select the most optimized solutions for client needs. Building strong relationships with key decision-makers and technical teams at major hyperscalers like AWS, Azure, and GCP will be essential. Managing multiple projects concurrently, collaborating with cross-functional teams, and staying updated on the latest advancements in AI/ML research and technologies will be key aspects of your role.

To qualify for this position, you must have a Master's degree in Computer Science, Artificial Intelligence, or a related field, along with 11+ years of experience in designing, developing, and deploying AI/ML solutions. You should have a proven track record of leading complex AI projects and expertise in traditional machine learning, NLP, and cloud platforms such as AWS, GCP, and Azure. Strong problem-solving skills, excellent communication abilities, proficiency in Python, and knowledge of probability and statistics, optimization techniques, and containerization technologies are highly valued.

Joining Infosys Consulting means becoming part of a global management consulting firm that helps renowned brands transform and innovate using disruptive technology. Our consultants are industry experts driving change agendas to create lasting competitive advantage. If you are passionate about AI architecture and ready to drive innovation in a dynamic environment, we invite you to explore this exciting opportunity with us.

For more information about Infosys Consulting and to explore our transformative solutions across the Business Transformation, Enterprise Technology Strategy, and Strategy and Design domains, visit www.InfosysConsultingInsights.com.

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Domain Architect Lead in our data management team, you will be part of a high-performing team focused on delivering innovative data analytics solutions. Your main responsibilities will include developing strategies for effective data analysis and reporting and selecting, configuring, and implementing analytics solutions. You will have the opportunity to promote a commitment focused on the delivery of data management capabilities within the KYC Analytics Strategic Platform.

In this role, you will collaborate with LOBs, consumers, and Technology to perform root cause analysis and develop strategic remediation paths for data quality issues in Reference Data and vendor applications. You will coordinate with LOB Business Leads, Reference Data Domain Leads, and KYC Solutions to address data quality issues related to KYC and Reference Data sourced via the KYC Analytics Platform. Additionally, you will act as a subject matter expert on KYC data, assisting various teams with ad hoc requests.

Your responsibilities will also include documenting processes and procedures for resolution and coordinating handover to the KYC & LOB Operations team when necessary. You will develop strategies for effective data analysis and reporting, defining company-wide metrics and relevant data sources. Moreover, you will select, configure, and implement analytics solutions while leading a team of data analysts to ensure quality and correct discrepancies. You will build systems to transform raw data into actionable business insights, monitor project progress, and provide regular status updates to management through presentations and materials.

To qualify for this role, you must have a bachelor's degree in Computer Science, Statistics, Data Management, or related fields. You should have over 15 years of experience, particularly as a Data Management Lead with a focus on Data Analytics and Business Analysis. Demonstrated expertise in data management deliverables such as Data Mapping, Data Lineage, Data Taxonomy, Data Quality, and Data Governance is essential, along with experience in implementing Data Quality Frameworks and managing data quality issues.

Strong analytical, critical thinking, and problem-solving skills are required, with the ability to develop and present conclusions and solutions while considering inherent risks. Proficiency in data analysis, reporting, and developing automated data solutions is essential. You should also possess excellent communication and management skills and be capable of leading meetings, multitasking, and adapting to changing priorities. Proficiency in MS Office and data interrogation tools such as SQL, Alteryx, Qlik Sense, Tableau, and Python is required, along with subject matter expertise in KYC, AML, and Client Reference Data.

Posted 1 week ago

Apply