
139 Data Lineage Jobs - Page 2

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

4CRisk is an AI start-up uniquely positioned to identify and solve the annual $300 billion Risk and Compliance problem for banks, non-bank financial institutions, and FinTech companies. The company's mission is to help customers protect brand value and strengthen revenues by reducing risk and the cost of compliance. At 4CRisk, technology, data, UI, and products have all been conceived with a customer-centric approach, believing that culture trumps aptitude. Our Engineering center (4CRisk.ai Software Private Ltd.) in Bangalore, India is seeking bright and passionate candidates who share our vision and wish to be part of a team of talented professionals. We are looking for a Data Quality Analyst to utilize regulatory data to drive product decisions. Collaborating with cross-functional teams comprising product managers, designers, and engineers, you will apply your expertise to deliver customer insights and help shape the products we offer. Leveraging rich user data through cutting-edge technology, you will see your insights transformed into real products.

Key Responsibilities:
- Performing statistical tests on large datasets to determine data quality and integrity.
- Evaluating system performance and design and their impact on data quality.
- Collaborating with AI and Data Engineers to enhance data collection and storage processes.
- Running data queries to identify quality issues and data exceptions, and cleaning data.
- Gathering data from primary or secondary sources to identify and interpret trends.
- Reporting data analysis findings to management to inform business decisions and prioritize information system needs.
- Documenting processes and maintaining data records.
- Adhering to best practices in data analysis and collection.
- Staying updated on developments and trends in data quality analysis.

Required Experience/Skills:
- Data quality analysis experience is a must, including root-cause analysis and data slicing.
- Designing, building, and executing data quality plans for complex data management solutions on modern data processing frameworks.
- Understanding data lineage and preparing validation cases to verify data at each stage of the data processing journey.
- Planning, designing, and conducting validations of data-related implementations to achieve acceptable results.
- Developing dataset creation scripts for data verification during extraction, transformation, and loading phases by validating data mapping and transformation rules.
- Supporting AI and Product Management teams by contributing to the development of a data validation strategy focused on building the regression suite.
- Documenting issues and collaborating with data engineers to resolve them and ensure quality standards.
- Efficiently capturing business requirements and translating them into functional, non-functional, and semantic specifications.
- Data Profiling, Data Modeling, and Data Validation Testing experience is a plus.
- 1 to 3+ years of proven experience.
- Excellent presentation, communication (oral and written) in English, and relationship-building skills across all management levels and customer interactions.
- Ability to collaborate with team members globally and across departments.

Location: Bangalore, India.
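For illustration only, below is a minimal Python sketch of the kind of routine data quality checks described above (completeness, uniqueness, validity), using pandas; the file name and column names are hypothetical and not taken from the posting.

```python
import pandas as pd

# Load a hypothetical extract of regulatory records (file and column names are assumptions).
df = pd.read_csv("regulatory_records.csv")

# Completeness: share of missing values per critical data element.
completeness = df[["record_id", "regulator", "effective_date"]].isna().mean()

# Uniqueness: duplicate business keys usually point to an ingestion or mapping problem.
duplicate_keys = df[df.duplicated(subset=["record_id"], keep=False)]

# Validity: effective_date should parse as a date and not lie in the future.
dates = pd.to_datetime(df["effective_date"], errors="coerce")
invalid_dates = df[dates.isna() | (dates > pd.Timestamp.today())]

print("Null rate per column:\n", completeness)
print(f"Duplicate keys: {len(duplicate_keys)}, invalid dates: {len(invalid_dates)}")
```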

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

You will be responsible for supporting the implementation and maintenance of data governance policies, procedures, and standards specific to the banking industry. Your role will involve hands-on experience in creating and maintaining activities associated with data life cycle management and various data governance activities. You will develop, update, and maintain the data dictionary for critical banking data assets, ensuring accurate definitions, attributes, and classifications. Collaboration with business units and IT teams will be essential to standardize terminology across systems for consistency and clarity. It will also be your responsibility to document end-to-end data lineage for key banking data processes such as customer data, transaction data, and risk management data. Additionally, you will create and maintain documentation of metadata, data dictionaries, and lineage for ongoing governance processes. Experience in preparing reports and dashboards for data quality scores and lineage status will be beneficial. Qualifications: - Bachelor's degree in Information Systems or a relevant field such as B. Tech, BCA, BSc (IT), etc. Experience: - Preferred 4 years of experience in Data management life cycle and Data governance activities. This is a full-time position that requires working in person.,

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Pune

Work from Office

Master Data Management role at Pune. Any Grad/PG with 2 years of experience in Data Validation, Data Manipulation, Data Cleaning, and Data Analysis. Excellent communication required. ONLY IMMEDIATE JOINERS. Salary up to 5 LPA. Call Rukhsar: 9899875055 or Roshan: 9899078782.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Maharashtra

On-site

As a Data Services Senior Analyst at Citi, your role will involve ensuring that the data sourced and provisioned by Data Services meets all required quality standards. You will be responsible for assessing, evaluating, and analyzing data challenges, as well as providing recommendations for their resolution. Tracking the identified resolutions until closure and providing regular updates to Senior Management will be part of your responsibilities. Collaboration with various teams and groups will help you develop subject matter expertise and knowledge of industry practices and standards. Key Responsibilities: - Perform data quality analysis and identify data challenges - Lead measurement and engagement improvement initiatives - Drive data quality resolution and improvement initiatives - Work with data quality partners and Technology teams to implement data quality tools - Optimize metrics reporting process and lead project management activities - Support senior management strategic vision and mentor lower-level analysts - Influence decisions through advice, counsel, and facilitating services in your area of specialization - Define strategies to drive data quality measurement, produce data quality dashboards and reports, and implement data quality strategies for effective data governance and improvement Qualifications: - Bachelor's or Master's degree - 10+ years of relevant experience - Strong functional knowledge of Data reconciliation and root causing of issues - Knowledge of Tools like PowerBI or Knime will be an added advantage Critical Competencies: - Professionalism/Work Ethic - Leadership skill - Root cause analysis - Creative thinking - Problem solving - Self-awareness - Teamwork/Collaboration - Oral/Written communications - Leverage diversity - Career management If you are an innovative problem solver with a passion for delivering results and seeking a challenging opportunity in data analysis and management, we invite you to join our team at Citi. We value diversity and respect for individuals, promoting merit-based growth opportunities and personal development for all employees. Your authentic self and well-rounded background will complement our culture of excellence and pride in achieving success together. Come be a part of our team where growth and progress are enabled hand in hand.,

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 24 Lacs

Kochi

Work from Office

Responsibilities:
* Ensure data accuracy & compliance with regulatory standards.
* Develop data strategy, governance & quality plans.
* Manage metadata, stewardship & lineage.
* Collaborate on enterprise-wide data initiatives.
Remote work & Saudi Annual bonus Health insurance

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Kanpur

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
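As a rough illustration of the data integrity monitoring mentioned in this posting, the sketch below compares row counts between a hypothetical staging table and a DBT-built target table in Snowflake, using the snowflake-connector-python package; the connection parameters and table names are placeholders, not details from the role.

```python
import os
import snowflake.connector

# Connection details are placeholders supplied via environment variables.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
)

def row_count(table: str) -> int:
    # Run a simple COUNT(*) against the given fully qualified table name.
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()

# Compare the raw staging table against the transformed model built by DBT.
source_rows = row_count("STAGING.ORDERS_RAW")
target_rows = row_count("MARTS.FCT_ORDERS")

if source_rows != target_rows:
    print(f"Reconciliation gap: source={source_rows}, target={target_rows}")
else:
    print(f"Row counts reconcile at {target_rows} rows")

conn.close()
```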

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Coimbatore

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Chandigarh

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

5.0 - 12.0 years

0 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

- Familiarity with Data Management Standards.
- Ability to work with high volumes of detailed technical & business metadata.
- Experience with documenting Data Element metadata (Business Elements vs. Technical Data Elements).
- Experience with understanding how data transformations materialize and determining the appropriate controls required to ensure a high level of data quality.
- Ability to understand and document application and/or data element level flows (i.e., lineage).
- Ability to analyze both processes and datasets to identify meaningful, actionable outcomes.
- Understand and implement changes to business processes.
- Develop and influence business processes necessary to support data governance related outcomes.
- Manage and influence across vertical organizations to achieve common objectives.
- Intermediate to expert level knowledge of MS products such as Excel, PowerPoint, Word, Skype, & Outlook.
- Working knowledge of metadata tools such as Collibra or equivalent.
- Familiarity with Data Analytics / BI tools such as Tableau, MicroStrategy, etc.

Communication Skills:
- Create both visually and verbally engaging, informative materials for departmental leadership, business partners, executives, and stakeholders.
- Ability to tailor communication of topics to various levels of the organization (e.g., technical audiences vs. business stakeholders).

Desired Skills (nice-to-have):
- General knowledge of the Banking industry.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

We are looking for an experienced Data Governance Architect with deep expertise in Alation and Azure cloud platforms. This role involves partnering with senior stakeholders to define and champion an enterprise data catalog and dictionary strategy, oversee the entire lifecycle of the data catalog from establishing metadata standards and initial MVPs to executing full-scale enterprise rollouts. You should have at least 10 years of experience in data governance and proven expertise in the Alation tool on the Azure platform. Understanding of the Snowflake platform is also required. Additionally, you should have proven expertise in at least two areas such as Data Governance Operating Models, Metadata Management, Data Cataloging, Data Lineage, or Data Quality. A deep understanding of governance frameworks such as DAMA or DCAM, with practical implementation experience, is essential. In this role, you will be responsible for assessing current cataloging and dictionary capabilities, identifying gaps, and developing actionable roadmaps to enrich metadata quality, accelerate catalog population, and drive adoption. You will also need to identify different data personas using the data catalog and design persona-specific playbooks to promote adoption. Your responsibilities will include designing, deploying, and managing scalable data catalog and dictionary solutions using platforms like Alation. Understanding of leading Data Governance tools like Collibra and Purview will be beneficial. You will oversee the entire lifecycle of the data catalog, from establishing metadata standards and initial MVPs to executing full-scale enterprise rollouts. Furthermore, you will define architecture and best practices for metadata management to ensure consistency, scalability, and sustainability of the catalog and dictionary. You will identify and catalog critical data elements by capturing clear business terms, glossaries, KPIs, lineage, and persona-specific guides to build a trusted, comprehensive data dictionary. Developing and enforcing policies to maintain metadata quality, manage access, and protect sensitive information within the catalog will be part of your responsibilities. You will need to implement robust processes for catalog population, including automated metadata ingestion, leveraging APIs, glossary management, lineage tracking, and data classification. Moreover, you will develop a workflow management approach to notify changes to certified catalog content to stewards. Creating reusable frameworks and templates for data definitions and best practices to streamline catalog adoption across teams will also be expected from you.,

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

You will be joining KPMG in India, a professional services firm affiliated with KPMG International Limited. Established in August 1993, our network of professionals across India, including Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, is well-versed in global practices and local regulations. As part of KPMG in India, you will have the opportunity to work with national and international clients across various sectors. Our focus is on delivering efficient, industry-specific, and technology-driven services that draw from our deep understanding of global and local markets, as well as our extensive experience in the Indian business landscape. We are looking for an individual with expertise in Liquidity Reporting, Regulatory Reporting, Data Lineage, MS Visio, and strong verbal communication skills. At KPMG in India, we are committed to providing equal employment opportunities to all qualified candidates.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Maharashtra

On-site

As a Senior Business Analyst at BNP Paribas, you will play a crucial role in bridging the gap between business objectives and technical solutions. Your primary responsibility will be to drive business analysis, requirements gathering, and process improvement initiatives for the Know Your Customer (KYC) and due diligence functions within the organization. You will be translating business requirements into functional specifications that can be understood by both technical teams and non-technical stakeholders. Your role will ensure that KYC processes are efficient, compliant, and aligned with regulatory standards by leveraging Agile methodologies, Behavior-Driven Development (BDD), and automation testing strategies. This position will require you to work in a globally distributed organization. Your responsibilities will include leading the collection and documentation of business requirements for KYC, due diligence processes, and Tax and Regulations. You will collaborate with stakeholders from Operations, Front office, Compliance, or IT, conduct interviews/workshops to understand their needs, and translate them into Business Requirement Documents (BRD) and functional specifications. You will develop a deep understanding of business needs through data analysis, market trends, and conduct gap analysis and process mapping to identify areas for improvement. Additionally, you will take ownership of feasibility studies, design solutions in line with requirements and architecture best practices, conduct demos, proposal development, and represent in architectural committees. Your role will involve building delivery plans, defining EPICs, breaking them down into user stories, and writing acceptance criteria using tools like Gherkin. You will act as the primary liaison between business users, IT teams, and external vendors. It will be essential to ensure that Agile principles and practices are adhered to within the project team. You will lead automation efforts and guide teams to align with a shift-left and shift-right strategy by encouraging a mindset for automation first to reduce manual efforts. Collaboration with QA teams to ensure comprehensive test coverage using automation tools will be crucial. Overseeing User Acceptance Testing (UAT) processes to ensure solutions meet business requirements and quality standards will also be part of your responsibilities. Your role will involve managing the change process to minimize disruption and ensure successful adoption of new features. Defining Key Performance Indicators (KPIs) and using insights driven by KPI analysis to drive continuous improvement will be key. Additionally, you will be responsible for the level-up of team members on KYC functional skills, IT best practices, and assisting junior or new joiners in their growth. Mandatory skills for this role include proven experience as a Senior Business Analyst in designing and implementing complex systems with workflow and data flows between multiple modules. You should have strong analytical and process management skills, along with a good understanding of technical infrastructure and data governance. Excellent communication and interpersonal skills are essential, along with the ability to articulate complex processes into a simplified manner for diverse audiences. Hands-on experience with Automated Testing Tools like Cucumber, BDD tools like Gherkin, and API definition tools like Swagger is required. 
Proficiency in SQL, PL/SQL, and creating Functional Specification Documents (FSD) and BRDs specific to APIs is also necessary. Nice to have skills include experience in KYC Applications, specifically Fenergo application, and knowledge of BPMN tools like Camunda. This position requires a Bachelor's Degree or equivalent with at least 10 years of experience in a similar role. If you are looking to join a dynamic team in a globally recognized organization, this role at BNP Paribas could be the right fit for you.,

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Maharashtra

On-site

The primary responsibility of the Lead Data Governance role is to oversee data accuracy and identify data gaps within critical data repositories such as the Data Lake, BIU Data warehouse, and Regulatory data marts. This includes developing control structures to ensure data quality and lineage, conceptualizing and reviewing Data Quality Functional and Technical Rules, and collaborating with various teams to address data gaps and ensure metadata management. Furthermore, the Lead Data Governance is tasked with performing detailed data profiling, developing guidelines for data handling, implementing data strategies, and designing solution blueprints to meet current and future business needs. The role also involves utilizing contemporary techniques and dynamic visual displays to present analytical insights effectively and fostering innovation in problem-solving. In addition to primary responsibilities, the Lead Data Governance role involves stakeholder management, creating data remediation projects, understanding business requirements, leading by example, developing a tableau reporting team, upskilling the team, and ensuring team productivity and quality deliverables. Key success metrics for this role include maintaining accurate and consistent data, conducting timely data quality checks, and ensuring no data quality issues in BIU Datamarts. The ideal candidate for this position should hold a Bachelor's degree in relevant fields such as Bachelor of Science (B.Sc), Bachelor of Technology (B.Tech), or Bachelor of Computer Applications (BCA), along with a Master's degree such as Master of Science (M.Sc), Master of Technology (M.Tech), or Master of Computer Applications (MCA). Additionally, a minimum of 10 years of experience in data governance is required to excel in this role.,

Posted 1 week ago

Apply

12.0 - 17.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Overview:
As Data Modelling Assoc Manager, you will be the key technical expert overseeing data modeling and will drive a strong vision for how data modelling can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data modelers who create data models for deployment in the Data Foundation layer, ingesting data from various source systems, resting data on the PepsiCo Data Lake, and enabling exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data modelling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will independently analyze project data needs, identify data storage and integration needs/issues, and drive opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be a key technical expert performing all aspects of Data Modelling, working closely with the Data Governance, Data Engineering, and Data Architects teams. You will provide technical guidance to junior members of the team as and when needed. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities:
- Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools.
- Advocate existing Enterprise Data Design standards; assist in establishing and documenting new standards.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create Source To Target Mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in-transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the data science team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications:
- 12+ years of overall technology experience that includes at least 6+ years of data modelling and systems architecture.
- 6+ years of experience with Data Lake Infrastructure, Data Warehousing, and Data Analytics tools.
- 6+ years of experience developing enterprise data models.
- 6+ years of cloud data engineering experience in at least one cloud (Azure, AWS, GCP).
- 6+ years of experience with building solutions in the retail or supply chain space.
- Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models).
- Fluent with Azure cloud services; Azure Certification is a plus.
- Experience scaling and managing a team of 5+ data modelers.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as PowerBI).
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior level management.
- Proven track record of leading, mentoring, hiring and scaling data teams.
- Strong change manager; comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
- Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs.
- Foster a team culture of accountability, communication, and self-management.
- Proactively drives impact and engagement while bringing others along.
- Consistently attains/exceeds individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.

Differentiating Competencies Required:
- Ability to work with virtual teams (remote work locations); lead a team of technical resources (employees and contractors) based in multiple locations across geographies.
- Lead technical discussions, driving clarity of complex issues/requirements to build robust solutions.
- Strong communication skills to meet with business, understand sometimes ambiguous needs, and translate them into clear, aligned requirements.
- Able to work independently with business partners to understand requirements quickly, perform analysis and lead design review sessions.
- Highly influential, with the ability to educate challenging stakeholders on the role of data and its purpose in the business.
- Places the user in the centre of decision making.
- Teams up and collaborates for speed, agility, and innovation.
- Experience with and embraces agile methodologies.
- Strong negotiation and decision-making skills.
- Experience managing and working with globally distributed teams.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Chennai, Bengaluru

Work from Office

Job Description: Job Title: Data Engineer. Experience: 5-8 Years. Location: Chennai, Bangalore. Employment Type: Full Time. Job Type: Work from Office (Mon-Fri). Shift Timing: 12:30 PM to 9:30 PM. Required Skills: Strong Financial Services (preferably Banking) experience; translate financial and accounting concepts into business and systems requirements; data analysis; identify data anomalies and provide remediation options; data mapping; strong database design concepts; good familiarity with SQL; assist in the creation of metadata, data lineage and data flow diagrams; support UAT planning and execution functions.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

About us: Analytics Information Management (AIM) is a global community driving data-driven transformation across Citi in multiple functions to create actionable intelligence for business leaders. We are a fast-growing organization collaborating with Citi businesses and functions worldwide. What do we offer: The Data Management team oversees the implementation of best-in-class data quality measurement programs globally in the retail consumer bank. Key areas of support include: - Regulatory Support: Executing business data quality measurements in alignment with regulatory programs like CCAR, AML, etc. - Metrics Design: Identifying critical data elements in different systems, designing data quality rules, and testing and validating these rules. - Data Governance: Standardizing data definitions and ensuring measurement consistency per definitions across systems, products, and regions. - DQ Scorecards: Publishing monthly/quarterly scorecards at the country level and preparing executive summary reports for senior management. - Issue Management: Identifying defects, investigating root causes for issues, and following up with stakeholders for resolution within SLAs. - Audit Support: Identifying cases on control gaps, policy breaches, and providing data evidence for audit completion. Expertise Required: - Analytical Skills - Data analysis and visualization - Proficiency in formulating analytical methodology, identifying trends, and patterns in data - Generating actionable business insights (Preferred) - Tools and Platforms: - Proficiency in SAS, SQL, Python (Added advantage) - Proficiency in MS Excel, PowerPoint, and VBA Preferred - Domain Skills: - Good understanding of data definitions and data discovery - Data Lineage - Data quality framework - Process improvement experience related to compliance and data quality initiatives - Hands-on experience in KPI design, issue resolution, and remediation activities - Identifying control gaps and providing recommendations per data strategy (Preferred) - Knowledge of Banking products and Finance Regulations Soft Skills: - Ability to identify, articulate, and solve complex business problems and present them to management in a structured and simplified form - Excellent communication and interpersonal skills - Strong process/project management skills - Ability to collaborate effectively across multiple functional areas - Thrives in a dynamic and fast-paced environment Educational and Experience Requirements: - MBA / Master's degree in Economics / Statistics / Mathematics / Information Technology / Computer Applications / Engineering from a premier institute. BTech / B.E in Information Technology / Information Systems / Computer Applications (Preferred) Post Graduate in Computer Science, Mathematics, Operations Research, Econometrics, Management Science, and related fields - 5 to 8 years of hands-on experience in delivering data quality solutions, with a minimum of 2 years of experience in the Banking Industry,
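As an illustration of the metrics design and scorecard work described above, here is a simplified Python sketch that evaluates a few example data quality rules and rolls them into a pass-rate scorecard; the rules, thresholds, file name, and column names are purely illustrative assumptions, not part of the posting.

```python
import pandas as pd

# Hypothetical account extract; column names are illustrative only.
accounts = pd.read_csv("accounts_extract.csv")

# Each rule maps a critical data element to a boolean "record passes" check.
rules = {
    "customer_id_not_null": accounts["customer_id"].notna(),
    "balance_non_negative": accounts["balance"] >= 0,
    "country_code_is_iso2": accounts["country_code"].str.len() == 2,
}

# Scorecard: pass rate per rule, of the kind published monthly at country level.
scorecard = pd.DataFrame(
    {"pass_rate_pct": {name: round(check.mean() * 100, 2) for name, check in rules.items()}}
)
scorecard["status"] = scorecard["pass_rate_pct"].apply(lambda p: "Green" if p >= 99 else "Red")
print(scorecard)
```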

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

At EY, you will have the opportunity to develop a career tailored to your unique strengths, supported by a global network, inclusive culture, and cutting-edge technology to maximize your potential. Your individual voice and perspective will play a crucial role in shaping EY's future success. By joining us, you will not only create a rewarding experience for yourself but also contribute to building a better working world for all. As an OFSAA Senior at EY, your primary responsibility will be to lead and oversee OFSAA implementation and consulting projects. You will manage engagements at the practice level, drive business growth, and ensure the successful achievement of business objectives, budgets, strategic direction, and delivery quality by consultants under your supervision. Client Responsibilities: - Utilize effective communication and presentation skills to engage with clients at various stages of the implementation lifecycle. - Deliver multiple OFSAA implementation and consulting projects to meet client needs. - Identify innovative approaches and business opportunities to expand the practice's reach within the client ecosystem. - Direct business operations and consulting resources to support clients in implementing OFSAA solutions. - Assess and mitigate business risks while pursuing overall practice goals. - Maintain strategic direction, drive profitable practice growth, ensure high-quality consulting delivery, and uphold customer reference ability. People Responsibilities: - Demonstrate expertise in OFSAA implementations and/or a background in Financial Services with a focus on implementing similar solutions. - Lead large teams to deliver exceptional client services. - Manage ETL tools (e.g., ODI, INFORMATICA) and Reporting applications (e.g., OBIEE, POWERBI). - Oversee people management, portfolio/delivery management, and sales enablement within the practice. - Be accountable for operational, financial, and people metrics, as well as overall business outcomes. - Possess in-depth knowledge of solutions like OFSAA EPM, ERM, FCCM, and IFRS within the OFSAA suite. - Proficient in products, technologies, frameworks, business metadata management, and relevant architectural components. - Strong command of SQL-PL/SQL with the ability to design transformations. - Well-versed in OFSAA staging and reporting data models. - Experienced in data model enhancements and working as a data model architect. - Demonstrate business acumen by developing innovative approaches and focusing on automation. Additional Skills Requirements: - Lead large/medium OFSAA programs and demonstrate expert consulting skills with advanced OFSAA knowledge and industry expertise. - Play a role in business development through presales, practice development, and internal engagement. - Manage consultancy assignments and demonstrate leadership capabilities. - Proficient in data lineage and building load utility tools such as OFSAA Excel File Upload, File to Table (F2T), and Table to Table (T2T). - Ensure end-to-end accountability for customer satisfaction and delivery excellence. - Prioritize deliveries in collaboration with the implementation team. - Approach problem resolution proactively, logically, and systematically. - Clearly articulate problems and proposed solutions. - Display a willingness to learn and adapt quickly to evolving requirements. 
Join EY in building a better working world by leveraging data, technology, and the expertise of diverse teams across 150 countries to create long-term value for clients, people, and society. EY's global presence spans assurance, consulting, law, strategy, tax, and transactions, enabling teams to address complex challenges with innovative solutions.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Finance Data Services Support Intermediate Analyst at Citi, you will play a crucial role in ensuring that the data sourced and provisioned by Finance Data Management meets all required data quality standards. Your responsibilities will include assessing, evaluating, and analyzing Contracts Match Exceptions for Genesis, providing recommendations, and driving the remediation of these exceptions. You will take ownership and accountability in recording data concerns and ensuring they are addressed promptly. Collaboration with various teams and groups will be essential in developing subject matter expertise and knowledge of industry practices and standards. Your key responsibilities will include: - Conducting data quality analysis and identifying data challenges for Genesis Contracts Match Exceptions (Stubs records) - Participating in data quality resolution and data improvement initiatives - Collaborating with other areas of the firm to understand data challenges and solutions - Prioritizing, supporting, escalating, and following through on data quality exceptions impacting regulatory commitments - Performing variance analysis on data quality improvement - Delivering metrics reporting on data quality issues and resolution - Participating in project management activities - Preparing meeting materials and updates for management and consumers To qualify for this role, you should have a Bachelor's degree and 5-8 years of relevant experience. Critical competencies required for this position include professionalism/work ethic, creative thinking, problem-solving, exposure to reporting tools like Tableau and PowerBI (an added advantage), self-awareness, teamwork/collaboration, and strong oral/written communication skills. Career management skills are also crucial for success in this role. This position falls under the Data Governance job family group and specifically within the Data Quality & Data Quality Analytics and Reporting job family. It is a full-time role that requires individuals with skills in change management, data analysis, data governance, data lineage, data management, data quality, internal controls, management reporting, program management, and risk management. Citi is committed to fostering diversity and creating an inclusive environment where individuals from all backgrounds can thrive. If you are an innovative problem solver who is passionate about your work and values personal development opportunities, we invite you to join us on our journey of growth and progress. For candidates with disabilities requiring accommodations for the application process, please review the Accessibility at Citi policy. To learn more about Citis EEO Policy Statement and your rights, please refer to the provided resources.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Product Manager at UBS, you will play a crucial role in supporting the Product Owner to develop a clear product vision, strategy, and roadmap covering the full product lifecycle. Your responsibilities will include performing analysis of the Current and To-Be states, collaborating with stakeholders to elicit, document, validate, and manage requirements, translating requirements into Features, User Stories, Acceptance Criteria, and Scenarios that are clear to the team members in the pod. Additionally, you will assist the Product Owner in creating, managing, and prioritizing the pod's backlog in alignment with quarterly Objectives and Key Results, as well as managing issues and dependencies within and outside the pod. In the agile operating model at UBS, you will be part of the IB Subledger - Account Posting Crew in London under the Business Division Control Stream. Your expertise should include a bachelor's degree preferably focusing on finance, accounting, economics, engineering, or computer science, along with experience in business analysis, product management, and delivery of complex products to clients within Finance. You should have a good understanding of Investment Banking, Treasury businesses, and front to back processes, as well as functional knowledge in financial accounting within the financial services industry. Proficiency in solution design and configuration of SAP FPSL for banking, deep knowledge of SAP S4 HANA, hands-on experience in SAP FSDM, FPSL, data sourcing, ETL, data lineage, data governance framework, change management, and Agile framework experience using GITAB are also required. Being curious to learn new technologies and practices, a strong communicator, self-starter with excellent analytical skills, meticulous attention to detail, innovative, and able to manage conflicting stakeholder needs effectively are essential qualities for this role. UBS, as the world's largest and only truly global wealth manager, operates through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in all major financial centers in more than 50 countries, UBS offers a diverse and inclusive work environment where individual empowerment and diverse perspectives are valued. Please note that UBS is an Equal Opportunity Employer that respects and seeks to empower each individual, supporting diverse cultures, perspectives, skills, and experiences within its workforce. Furthermore, UBS welcomes applications from career returners and offers a Career Comeback program for interested individuals.,

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

As part of Risk Management and Compliance, you play a crucial role in maintaining the strength and resilience of JPMorgan Chase. Your responsibilities involve facilitating the responsible growth of the firm by proactively identifying new and emerging risks. Your expert judgement is essential in addressing real-world challenges that affect the company, its customers, and communities. In the Risk Management and Compliance culture, innovation and challenging the norm are highly valued, with a constant drive for excellence. Your primary focus will be on supporting data analytics and reporting for Risk Decision Engines and Third-party services within Consumer & Community Banking. You are expected to possess a comprehensive understanding of systems, data, and business requirements, along with the ability to establish data quality and lineage controls. Monitoring and reporting on data, as well as conducting post-implementation validations during releases to ensure decisioning accuracy and support root cause analysis, are also key aspects of your role. Success in this position requires a blend of initiative, leadership, influence, and matrixed management skills. The ideal candidate will be adept at working both independently and collaboratively in small project teams. Strong analytical skills, confidence, and effective communication abilities are crucial traits for this role. Your responsibilities include providing execution support and leadership for large, complex technology-dependent programs that span across various business areas. Collaboration with Business/Stakeholders to gather requirements, understand business logic, and define Data Quality rules/validation checks is essential. You will also engage with key business stakeholders to ensure clear specifications for vendors, analyze and interpret complex data for reconciliation purposes, and lead root cause/outlier analysis for production issues or defects. Furthermore, you will build Data/Detective Controls and data monitoring reports to mitigate risks resulting from changes affecting Risk Decision Engines & Third-party services. Utilizing analytical, technical, and statistical applications such as SAS, SQL, Python, and PySpark to analyze trends, data lineage, and statistical data quality will be part of your responsibilities. Automation of reporting processes, enhancement of current reports through interactive reporting tools like Tableau, Alteryx, Python, and PySpark, and identifying opportunities for process improvements are also key components of your role. Additionally, you will be responsible for data visualization, maintaining tracking and documentation for consumption engagements, processes, flows, and functional documentation. Minimum qualifications for this role include a Bachelors/Masters degree in Engineering or Computer Science, 8-10 years of experience in data analytics & reporting, strong leadership skills, excellent communication abilities, proficiency in database knowledge and analytical skills, experience in Agile framework, Unix, SAS, SQL, Python, PySpark, BI/data visualization tools, and cloud platforms like AWS/GCP. If you are excited about joining our organization and meet the minimum requirements mentioned above, we encourage you to apply for consideration for this critical role.,

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an experienced Business Analyst in Regulatory domain, you will play a crucial role in leading the techno-functional aspects within our organization. Your expertise in credit risk, market risk, regulatory reporting, data lineage, as well as a solid understanding of governance, risk, and compliance processes are essential for this role. Your responsibilities will include: - Having a minimum of 5+ years of prior experience with regulatory reporting - Ability to comprehend applications and create information/data workflow diagrams - Proficiency in working in a large project environment - Capability to design mock-up dashboards/charts for user reviews during requirements finalization - Experience with Agile/JIRA and being SCRUM trained - Previous experience in GSIB is preferred - Participating in Scrum Calls for assigned EPIC - Formatting EPIC summaries - Creating EPIC Feature Lists and Product Designs - Analyzing, writing, and grooming stories - Classifying and maintaining stories - Conducting data analysis - Writing Application Services User Guides - Providing production support upon code delivery - Supporting Test Analysis for SIT/UAT - Possessing excellent communication skills You will also be responsible for: - Collaborating with stakeholders across business lines for transformation projects to understand their business and processes - Understanding current business processes and providing functional design inputs for proposed technology solutions - Creating high-quality documentation for Business and Functional Requirements - Managing traceability of requirements from BRD to Test Plan/Results - Analyzing large data sets, creating flow diagrams, preparing high-level summaries and workflows - Working closely with development leads on enhancements and defects, and assisting with troubleshooting/resolution of application defects - Successfully engaging with software developers and testers to ensure quality delivery on time - Planning, estimating, managing risks and issues, project reporting, managing stakeholders, and building strong relationships with the business - Assisting in project execution through JIRA, providing tracking to technical teams, and giving status updates to internal and business stakeholders,

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

We are seeking an experienced Data Governance Architect with specialized knowledge in Alation and Azure cloud platforms. In this role, you will collaborate with senior stakeholders to establish and advocate for an enterprise data catalog and dictionary strategy. Your responsibilities will encompass overseeing the complete data catalog lifecycle, from defining metadata standards and initial MVPs to executing large-scale enterprise rollouts. To qualify for this position, you should have over 10 years of experience in data governance and demonstrate proficiency in Alation tool on Azure platform. Additionally, familiarity with the Snowflake platform is required. Expertise in at least two of the following areas is essential: Data Governance Operating Models, Metadata Management, Data Cataloging, Data Lineage, or Data Quality. A deep understanding of governance frameworks like DAMA or DCAM, along with practical implementation experience, is crucial. As a Data Governance Architect, you must possess strong capabilities in conducting maturity assessments, gap analyses, and delivering strategic roadmaps. Excellent communication skills are necessary for articulating complex topics clearly and producing precise documentation. Key Responsibilities: - Evaluate existing cataloging and dictionary capabilities, identify gaps, and create roadmaps to enhance metadata quality, speed up catalog population, and foster adoption. - Recognize various data personas utilizing the data catalog and develop persona-specific playbooks to encourage adoption. - Plan, implement, and supervise scalable data catalog and dictionary solutions using platforms such as Alation. - Comprehend leading Data Governance tools like Collibra and Purview. - Supervise the entire data catalog lifecycle, including setting metadata standards, developing initial MVPs, and executing large-scale enterprise rollouts. - Define architecture and best practices for metadata management to ensure catalog and dictionary consistency, scalability, and sustainability. - Identify and categorize critical data elements by documenting clear business terms, glossaries, KPIs, lineage, and persona-specific guides to construct a reliable data dictionary. - Establish and enforce policies to uphold metadata quality, regulate access, and safeguard sensitive information within the catalog. - Implement robust processes for catalog population through automated metadata ingestion, API utilization, glossary management, lineage tracking, and data classification. - Create a workflow management approach to notify stewards of changes to certified catalog content. - Develop reusable frameworks and templates for data definitions and best practices to streamline catalog adoption across teams.,

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

The Data Quality Lead Analyst contributes to efforts to ensure data that is sourced and provisioned meet all required data quality standards. As the Data Quality Lead Analyst, your role involves leading the continuous Data Quality process by assessing, evaluating, and analyzing data. You will be responsible for setting controls and guidelines for measurement, evaluation, adoption, and communication of Data Quality and Data Quality risk. Collaboration with other team members is essential to monitor and remediate data concerns effectively. Your responsibilities include supporting activities to drive Data Quality measurement, producing Data Quality dashboards and reports, and implementing Data Quality strategies to govern data effectively and enhance Data Quality. Leading data improvement initiatives, processes, and creation of tools in line with requirements will be a key aspect of your role. You will also liaise with other areas of the firm to understand data challenges and solutions, as well as run data consumption demand and requirements. Reviewing quality analysis results and addressing data challenges through Citigroup's corresponding Data/Issue management process will be part of your routine tasks. Leading day-to-day activities to support data quality resolution and optimize metrics reporting process is crucial. You will report Data Quality issues through Citigroup's corresponding Data/Issue management process and support senior management in strategic vision. When making business decisions, it is important to appropriately assess risk and demonstrate consideration for the firm's reputation. Safeguarding Citigroup, its clients, and assets by ensuring compliance with applicable rules, laws, and regulations is essential. Applying sound ethical judgment regarding personal behavior, conduct, and business practices, as well as escalating, managing, and reporting control issues with transparency are important aspects of your role. Qualifications: - 6-10 years of experience with defining and implementing Data Quality programs; Banking or Finance industry experience preferred - Comprehensive understanding of how own area and others collectively integrate to contribute towards achieving business objectives - Ability to manage tight deadlines or unexpected requirement changes and balance needs of multiple stakeholders - Effective communication skills to develop and deliver multi-mode communications for different audiences - Collaboration skills to build partnerships and work effectively with others to meet shared objectives - Ability to work under pressure and facilitate discussions Education: Bachelor's/University degree, Master's degree preferred Please note that this job description provides a high-level review of the types of work performed, and other job-related duties may be assigned as required.,

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

As a member of the Sanctions team within the Global Financial Crimes (GFC) program at FIS, you will play a key role in supporting the Senior Director to establish a strong tech and data knowledge base, as well as assist in system implementations across Global Financial Crimes Compliance. Your responsibilities will include writing documentation on sanctions workflows, standards, guidelines, testing procedures, taxonomies, and operating procedures. You will also be tasked with developing and optimizing complex SQL queries to extract, manipulate, and analyze large volumes of financial data to ensure data accuracy and integrity. Additionally, you will be responsible for creating and maintaining comprehensive data lineage documentation, contributing to the development and maintenance of master data management processes, and generating regular reporting on Financial Crimes Data Governance KPIs, metrics, and activities. Your role will involve monitoring LOB compliance activities, verifying regulatory compliance deadlines are met, and tracking product data compliance deficiencies to completion. To excel in this role, you should possess a Bachelor's or Master's degree in a relevant field such as Computer Science, Statistics, or Engineering, along with 1-3 years of experience in the regulatory compliance field. Previous experience as a Data Analyst in the financial services industry, particularly with a focus on Financial Crimes Compliance, is highly desirable. Proficiency in SQL, data analysis tools, and experience with data governance practices is essential. Strong analytical, problem-solving, and communication skills are key to success in this position. If you have experience in regulatory oversight of high-risk product lines containing complex banking functions or are considered a subject matter expert in sanctions regulatory compliance, it would be considered an added bonus. At FIS, we offer a flexible and creative work environment, diverse and collaborative atmosphere, professional and personal development resources, opportunities for volunteering and supporting charities, as well as competitive salary and benefits. FIS is committed to protecting the privacy and security of all personal information processed to provide services to clients. Our recruitment model primarily relies on direct sourcing, and we do not accept resumes from recruitment agencies that are not on our preferred supplier list. We take pride in our commitment to diversity, inclusion, and professional growth, and we invite you to be part of our team to advance the world of fintech.,

Posted 2 weeks ago

Apply

1.0 - 4.0 years

2 - 4 Lacs

Hyderabad

Remote

Job description: We have a vacancy with the below details.
Role: Analyst, Data Sourcing Metadata - Cloud
Designation: Analyst
Experience: 1-4 years
Notice Period: Immediate to 60 days (currently serving)
Work Mode: WFH (Remote)
Working Days: 5 days
Mandatory Skills: Data Management, SQL, Cloud tools (AWS/Azure/GCP), ETL tools (Ab Initio, Collibra, Informatica), Data Catalog, Data Lineage, Data Integration, Data Dictionary, Maintenance, RCA, Issue Analysis

Required Skills/Knowledge:
- Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on Data Management experience, or in lieu of a degree more than 3 years of experience.
- Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure.
- Basic understanding of metadata management concepts, familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra), basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra), and understanding of data integration technologies (e.g., ETL, APIs, data pipelines).
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail.

Desired Characteristics:
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics - Specialty.
- Familiarity with hybrid cloud environments (combination of cloud and on-prem).
- Skilled in Ab Initio Metadata Hub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center, and Express IT.
- Experience with harvesting technical lineage and producing lineage diagrams.
- Familiarity with Unix, Linux, Stonebranch, and database platforms such as Oracle and Hive.
- Basic knowledge of SQL and data query languages for managing and retrieving metadata.
- Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance).
- Familiarity with Collibra.
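For context on the cataloging work this role describes, below is a minimal Python sketch that pulls table and column metadata from the AWS Glue Data Catalog with boto3 to seed a simple data dictionary; the database name and region are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Placeholder catalog database; swap in the real one for your environment.
database_name = "analytics_db"

# Page through the catalog and collect a simple data-dictionary structure.
dictionary = []
paginator = glue.get_paginator("get_tables")
for page in paginator.paginate(DatabaseName=database_name):
    for table in page["TableList"]:
        for column in table.get("StorageDescriptor", {}).get("Columns", []):
            dictionary.append(
                {
                    "table": table["Name"],
                    "column": column["Name"],
                    "type": column.get("Type", "unknown"),
                    "description": column.get("Comment", ""),
                }
            )

# Print a sample of the collected entries.
for entry in dictionary[:10]:
    print(entry)
```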

Posted 2 weeks ago

Apply