Jobs
Interviews

455 Metadata Management Jobs - Page 3

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 9.0 years

16 - 20 Lacs

Hyderabad

Work from Office

Job Description Summary
As Technical Product Manager for our Data Products, you will join our GridOS Data Fabric product management team, which delivers solutions designed to accelerate decarbonization by managing DERs at scale and proactively managing disruptions from climate change. Specifically, you will be accountable for managing technical product lifecycle activities around our core Data Products in partnership with our own and partner development teams to build trusted data products for our GridOS ADMS applications.

Roles and Responsibilities
- Technical product management: responsible for delivering Data Products in partnership with both GE Vernova and partner development teams, including all activities related to sprint planning, backlog grooming, testing, and release management.
- Collaborate with data engineers, data scientists, analysts, and business stakeholders to prioritize product epics and features.
- Ensure data products are reliable, scalable, secure, and aligned with regulatory and compliance standards.
- Advocate for data governance, data quality, and metadata management as part of product development.
- Evangelize the use of data products across the organization to deliver trusted data that fuels AI/ML predictive workflows.
- Accountable for functional, business, and broad company objectives: integrate and develop processes that meet business needs across the organization, participate in long-term planning, manage complex issues within your functional area of expertise, and contribute to the overall business strategy.
- Develop specialized knowledge of the latest commercial developments in your own area and the communication skills to influence others; contribute to strategy and policy development and ensure delivery within your area of responsibility.
- Has in-depth knowledge of best practices and how own area integrates with others; has working knowledge of the competition and the factors that differentiate them in the market.
- Brings the right balance of tactical momentum and strategic focus; uses engineering team processes such as scrums and daily stand-ups, and does not shy away from explaining deep technical requirements.
- Uses judgment to make decisions or solve moderately complex tasks or problems within projects, product lines, markets, sales processes, campaigns, or customers; takes a new perspective on existing solutions.
- Uses technical experience and data analysis expertise to support recommendations; uses multiple internal and limited external sources outside of own function to arrive at decisions.
- Acts as a resource for colleagues with less experience; may lead small projects with moderate risks and resource requirements.
- Explains difficult or sensitive information and works to build consensus; develops the persuasion skills required to influence others on topics within the field.

Required Qualifications
- Significant experience in product management as a Digital Product Manager; knowledge level comparable to a Master's degree from an accredited university or college.
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.

Desired Characteristics
- Strong oral and written communication skills; strong interpersonal and leadership skills.
- Demonstrated ability to analyze and resolve problems and to lead programs/projects.
- Ability to document, plan, market, and execute programs.
- Strong understanding of data infrastructure, data modeling, ETL pipelines, APIs, and cloud technologies (e.g., AWS, Azure).
- Experience with iterative product development and program management techniques including Agile, SAFe, Scrum, and DevOps.
- Familiarity with data privacy and security practices (e.g., GDPR, CCPA, HIPAA), and understanding of metadata, lineage, and data quality management.
- Knowledge of and experience with electric utility industry practices.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai, Pune, Chennai

Work from Office

Project description
We are looking to hire skilled professionals to help build out an enterprise-grade data quality profiling solution. A working Proof of Concept (PoC) has already been implemented over Avaloq using the existing Oracle APEX infrastructure. The next phase is focused on industrialising the solution and scaling it for broader use.

Responsibilities
- Enhance and productionise the existing data quality profiling PoC using Oracle APEX
- Collaborate with SMEs and technical stakeholders to capture and refine business and technical requirements
- Translate high-level discussions into structured documentation and user stories
- Work closely with DBAs and architects to optimise performance and scalability
- Ensure alignment with industry-standard data quality profiling frameworks and dimensions (e.g., completeness, accuracy, consistency)
- Support solution testing and deployment in both development and production environments

Must-have skills
- Expertise in Oracle APEX: proven experience developing complex, data-centric applications
- Strong Business Analysis (BA) capabilities: self-starter with the ability to independently engage SMEs and document requirements
- Oracle PL/SQL: solid coding and troubleshooting skills; performance tuning and DBA knowledge is a strong advantage
- Experience with Avaloq: familiarity with Avaloq's physical data layer is essential
- Knowledge of data quality frameworks: understanding of profiling dimensions and industry-standard approaches (e.g., DQ checks, data validation rules)

Nice-to-have skills
- Experience with Avaloq Reporting & Data Extracts (ARD, ADF, SmartView): familiarity with Avaloq-specific data export or reporting mechanisms would help accelerate development.
- Knowledge of Oracle Database tuning / performance diagnostics: in-depth understanding of execution plans, indexes, and session performance tools.
- Experience with data quality tools or frameworks: exposure to tools like Informatica DQ, Talend DQ, Collibra, or custom-built DQ engines.
- Familiarity with data governance practices: understanding of data stewardship, lineage, and regulatory reporting requirements (e.g., BCBS 239, GDPR).
- Experience in financial services / core banking systems: previous experience in banking, especially wealth or private banking, adds strong domain value.
- Exposure to Agile/Scrum delivery methodologies: ability to work in iterative delivery cycles, contribute to sprint planning, and maintain documentation in JIRA/Confluence.
- Front-end/UI experience within APEX: skills in designing intuitive dashboards or user interfaces, particularly for DQ dashboards or exception reporting.
- Data visualisation and reporting tools: experience with tools like Power BI, Oracle BI Publisher, or Tableau for representing data profiling results.
- Scripting/automation: ability to automate DQ checks using Shell, Python, or other scripting tools is a plus.
- Understanding of metadata management concepts: helps in building reusable and sustainable DQ frameworks across different domains.

Location: Pune, Mumbai, Chennai, Bengaluru
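The profiling dimensions this listing names (completeness, accuracy, consistency) can be sketched in a few lines of plain Python. This is an illustrative sketch only: the field names, records, and format rules below are hypothetical, not taken from Avaloq's data model or the project's actual PL/SQL rules.

```python
import re

def profile(records, required_fields, format_rules):
    """Return per-field pass rates for two DQ dimensions over dict records."""
    total = len(records)
    completeness = {}  # share of records where the field is populated
    accuracy = {}      # share of records where the field matches its format rule
    for field in required_fields:
        present = sum(1 for r in records if r.get(field) not in (None, ""))
        completeness[field] = present / total
    for field, pattern in format_rules.items():
        valid = sum(1 for r in records if re.fullmatch(pattern, str(r.get(field, ""))))
        accuracy[field] = valid / total
    return {"completeness": completeness, "accuracy": accuracy}

# Hypothetical sample rows: one missing IBAN, one missing account id.
rows = [
    {"account_id": "A-001", "iban": "CH9300762011623852957"},
    {"account_id": "A-002", "iban": ""},
    {"account_id": "", "iban": "CH5604835012345678009"},
]
report = profile(rows, ["account_id", "iban"], {"account_id": r"A-\d{3}"})
print(report["completeness"]["account_id"])  # 2 of 3 records populated
```

In a productionised version these pass rates would be computed in PL/SQL over Avaloq tables and surfaced through APEX dashboards; the dimension definitions are the transferable part.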

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Remote

Job Description Summary
This role will play a critical part in supporting airline customers by designing, developing, and maintaining data pipelines and analytical solutions that drive operational efficiency and fuel savings. The role is responsible for ensuring high-quality data ingestion, transformation, and validation from diverse operational sources, enabling the team to monitor, analyze, and optimize fuel consumption across the airline's fleet. The engineer will work closely with fuel analysts, operations teams, and IT to deliver accurate, timely insights that support decision-making, sustainability goals, and cost-reduction initiatives.

Role Overview
- Build and maintain scalable data pipelines to integrate fuel and operational data from multiple systems (flight data recorders, fuel receipts, aircraft telemetry, dispatch systems, etc.).
- Perform data profiling, validation, and quality checks to ensure accurate reporting and analysis of fuel usage.
- Develop analytical datasets that empower the Fuel Team to track consumption trends, identify anomalies, and measure the impact of fuel initiatives.
- Collaborate with data scientists and fuel efficiency experts to support advanced modeling, forecasting, and optimization initiatives.
- Manage metadata, data dictionaries, and documentation to ensure transparency and consistency in data usage.
- Provide post-deployment support and troubleshooting for data solutions, ensuring reliable operational performance.
- Partner with cross-functional teams (Flight Operations, Dispatch, Engineering, IT) to align fuel data initiatives with operational needs.
- Contribute to sustainability and efficiency efforts by enabling data-driven fuel conservation and emissions-reduction strategies.

The Ideal Candidate
The ideal candidate should have experience in data engineering, preferably in aviation, transportation, or other real-time operational environments.

Required Qualifications
- Bachelor's degree in Computer Science or STEM majors (Science, Technology, Engineering, and Math) with 5+ years of experience.
- Proven experience in data engineering, preferably in aviation, transportation, or other real-time operational environments.
- Proficiency in SQL, Python, PowerShell, or similar languages for data transformation and automation.
- Hands-on experience with ETL tools, data lakes/warehouses (Snowflake, Redshift, BigQuery, etc.), and cloud platforms (AWS, Azure, GCP).

Preferred Qualifications
- Familiarity with aviation data standards (FOQA, QAR, ACARS, IATA fuel standards).
- Strong problem-solving skills with attention to detail, accuracy, and operational impact.
- Ability to work collaboratively in a fast-paced, safety-critical industry.
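The "data profiling, validation, and quality checks" responsibility above can be illustrated with a minimal record-level validation sketch. Everything here is an assumption for illustration: the field names, the plausible-range threshold, and the airport codes are invented, not drawn from any airline's actual schema.

```python
def validate_fuel_record(rec):
    """Return a list of data-quality issues for one fuel uplift record."""
    issues = []
    uplift = rec.get("fuel_uplift_kg")
    if uplift is None:
        issues.append("missing fuel_uplift_kg")
    elif not (0 < uplift < 200_000):  # sanity range in kg, an assumed threshold
        issues.append("fuel_uplift_kg out of plausible range")
    if rec.get("departure") == rec.get("arrival"):
        issues.append("departure equals arrival")  # likely a data-entry error
    return issues

good = {"fuel_uplift_kg": 18_500, "departure": "BLR", "arrival": "DEL"}
bad = {"fuel_uplift_kg": -50, "departure": "BLR", "arrival": "BLR"}
print(validate_fuel_record(good))  # []
print(validate_fuel_record(bad))   # two issues flagged
```

In a real pipeline such checks would run inside the ETL layer and route failing records to a quarantine dataset for analyst review rather than dropping them.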

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

This role is for one of Weekday's clients. The ideal candidate should have a minimum of 3 years of experience and be located in Bengaluru for a full-time position.

We are seeking a skilled and detail-oriented Informatica Data Quality (IDQ) Developer to join our data engineering team. As an IDQ Developer, your main responsibility will be to develop and implement data quality solutions using Informatica tools, supporting our organization's data governance, analytics, and reporting initiatives. The ideal candidate will have a strong background in data profiling, cleansing, and standardization, along with a passion for improving data accuracy and reliability across enterprise systems.

Your key responsibilities will include designing and developing IDQ solutions by building and configuring Informatica Data Quality mappings, workflows, and rules. You will also conduct data profiling and analysis on source systems to identify anomalies and inconsistencies, translate business data quality requirements into reusable rules, and collaborate with ETL and data integration teams to embed data quality checks into workflows. Additionally, you will monitor data quality, troubleshoot technical issues, and maintain technical documentation for IDQ solutions.

Required skills and qualifications for this role include:
- At least 3 years of experience in Informatica Data Quality development in a data-intensive or enterprise environment.
- Strong hands-on experience with IDQ components such as mappings, mapplets, transformations, and data quality rules.
- Proficiency in data profiling, cleansing, parsing, standardization, de-duplication, and address validation techniques.
- Good knowledge of relational databases like Oracle, SQL Server, and PostgreSQL, and the ability to write complex SQL queries.
- Understanding of data governance, metadata management, and master data management concepts.
- Experience working with data integration tools, especially Informatica PowerCenter, is a plus.
- Strong problem-solving skills, attention to detail, and excellent communication and collaboration skills.
- A Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred qualifications include Informatica IDQ Certification, experience in regulated industries such as banking, insurance, or healthcare, and familiarity with cloud-based data platforms such as AWS, GCP, or Azure.
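The standardization and de-duplication techniques the listing asks for are normally built visually as IDQ transformations; as a rough analogy only, the same two steps can be sketched in plain Python (the `customer_name` field and sample rows are hypothetical).

```python
def standardize(name):
    """Trim, collapse internal whitespace, and upper-case a name value."""
    return " ".join(name.strip().upper().split())

def dedupe(records, key="customer_name"):
    """Keep the first record per standardized key; later duplicates are dropped."""
    seen, unique = set(), []
    for r in records:
        k = standardize(r[key])
        if k not in seen:
            seen.add(k)
            unique.append({**r, key: k})
    return unique

rows = [{"customer_name": "  acme corp "}, {"customer_name": "Acme  Corp"}]
print(dedupe(rows))  # one surviving, standardized record
```

Real IDQ de-duplication adds fuzzy matching (edit distance, phonetic keys) and survivorship rules on top of this exact-match baseline.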

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a hybrid position based in Bangalore/Hyderabad/Pune/Mumbai/NCR, you will be required to attend training on-site in Bangalore. The role calls on your in-depth knowledge of document review processes on platforms such as Veeva Vault PromoMats and/or MedComms. You should be adept at communicating and troubleshooting challenges through collaboration with various stakeholders, including cross-functional colleagues, external vendors, and customers. Prioritizing tasks and managing time efficiently to ensure the timely delivery of projects without compromising quality will be a key part of your responsibilities, and familiarity with the various deliverable types in the medical affairs and commercial space will be crucial.

You will be responsible for understanding copyright management for references, images, etc., and ensuring compliance with PMC standards. This involves ensuring that tactics are PMC approved before being routed for medical approval globally or uploaded to any repository. You will also manage the tactics migration tracker from SharePoint to AEM, maintain metadata accuracy while uploading PMC assets onto content galleries and tactics onto Veeva Vault for approvals, and ensure that HE fulfillment requests are processed within defined timeframes.

Desired Skills:
- Experience in the medical domain
- 2-6 years of relevant experience
- Familiarity with the MLR review process
- Effective communication and collaboration with internal and external stakeholders
- Proficiency in time and stakeholder management
- Good understanding of MA tactic types
- Knowledge of copyright and license agreement management (PMC)
- Adherence to processes
- Expertise in routing platforms such as AEM, SharePoint, Veeva Vault, Capacity Planner Tool, Wrike, etc.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a DAM Content Support Analyst at Kenvue, you will play a crucial role in supporting the digital asset management platform and associated workflows that facilitate Marketing Operations processes. Your responsibilities will include ensuring that all digital assets are appropriately tagged, organized, and stored in the DAM system. You will collaborate with cross-functional teams to gather and input accurate metadata for each asset and provide L0/L1 technical support and guidance to end users of the platform. You will also be accountable for optimizing asset management support processes to enhance efficiency, compliance, and speed to market for digital assets, meeting SLAs effectively.

Key Responsibilities

Asset Ingestion and Management:
- Supervise the ingestion of digital assets into the Marketing Lens Digital Asset Management (DAM) system.
- Tag assets with relevant metadata, including product information, usage rights, and version history.
- Collaborate with cross-functional teams to ensure accurate metadata input for each asset.

Content Ownership and Curation:
- Organize digital assets into logical classifications for easy retrieval through manual search or integration with other systems.
- Ensure asset standards are maintained for downstream applications like PIM and DxP.
- Recommend optimizations to the global taxonomy aligning with brand guidelines and data model needs.

Support Management & Optimization:
- Maintain DAM system integrity; troubleshoot and resolve issues within SLAs.
- Provide support and training to content creators and stakeholders on effective DAM system usage.
- Analyze enhancement-based support requests, conduct feasibility tests, and provide technology guidance for enhancements.

Continuous Improvement:
- Stay informed about industry trends and best practices in digital asset management.
- Understand segment brand standards and provide project support and test solutions.

Required Qualifications:
- 2-4 years of experience in digital asset management or a related field.
- Strong understanding of digital asset management systems and workflows.
- Excellent organizational, project management, communication, and interpersonal skills.
- Experience in technical support and advisory roles with strong analytical and problem-solving skills.
- Ability to prioritize and manage multiple tasks and projects effectively.

Desired Qualifications:
- Experience with DAM software such as Aprimo, Widen Collective, or Bynder.
- Proficiency in Adobe InDesign, Adobe Illustrator, Adobe Photoshop, and Figma.
- Experience in configuring and integrating new functionality adhering to company standards.
- Ability to work effectively in a team environment.

Qualifications:
- Bachelor's degree

Location: Bangalore, India
Job Function: Operations (IT)

Posted 1 week ago

Apply

3.0 - 8.0 years

25 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Job Qualifications: Data Catalog Specialist

Required Skills & Experience:
- Hands-on experience with the Collibra Data Intelligence Platform, including:
  - Metadata ingestion
  - Data lineage stitching
  - Workflow configuration and customization
- Strong understanding of metadata management and data governance principles
- Experience working with the following data sources/tools:
  - Teradata (BTEQ, MLOAD)
  - Tableau
  - QlikView
  - IBM DataStage
  - Informatica
- Ability to interpret and map technical metadata from ETL tools, BI platforms, and databases into Collibra
- Familiarity with data lineage concepts, including horizontal lineage across systems
- Proficiency in SQL and scripting for metadata extraction and transformation
- Excellent communication skills to collaborate with data stewards, engineers, and business stakeholders

Preferred Qualifications:
- Experience with Collibra APIs or Connectors
- Knowledge of data governance frameworks (e.g., DAMA DMBOK)
- Prior experience in a regulated industry (e.g., finance, healthcare)
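"SQL and scripting for metadata extraction" typically means reading a database's own catalog and reshaping the result for ingestion into the governance tool. A minimal sketch, with SQLite standing in for Teradata; the payload keys (`asset`, `dataType`) are illustrative assumptions, not Collibra's actual import schema.

```python
import sqlite3

# In-memory database with one table, standing in for a source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER, amount REAL, booked_at TEXT)")

# Walk the catalog: one payload entry per (table, column) pair.
payload = []
for (table,) in conn.execute("SELECT name FROM sqlite_master WHERE type='table'"):
    # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
    for cid, col, ctype, *_ in conn.execute(f"PRAGMA table_info({table})"):
        payload.append({"asset": f"{table}.{col}", "dataType": ctype})

print(payload[0])  # {'asset': 'trades.trade_id', 'dataType': 'INTEGER'}
```

Against Teradata the catalog query would target `DBC.ColumnsV` instead, and the payload would be posted to the catalog's import API or connector.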

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer specializing in SAP BW and Azure Data Factory (ADF), you will be responsible for leading the migration of SAP BW data to Azure. Your expertise in data integration, ETL, and cloud data platforms will be crucial in designing, implementing, and optimizing SAP BW-to-Azure migration projects.

Your key responsibilities will include ensuring data integrity, scalability, and efficiency during the migration process. You will design and implement ETL/ELT pipelines using Azure Data Factory (ADF), Synapse, and other Azure services, and develop and optimize data ingestion, transformation, and orchestration workflows between SAP BW and Azure. Collaborating with business and technical stakeholders, you will analyze data models, define migration strategies, and ensure compliance with data governance policies.

Troubleshooting and optimizing data movement, processing, and storage across SAP BW, Azure Data Lake, and Synapse Analytics will be part of your daily tasks. You will implement best practices for performance tuning, security, and cost optimization in Azure-based data solutions. Your role will also involve providing technical leadership in modernizing legacy SAP BW reporting and analytics by leveraging cloud-native Azure solutions. Working closely with cross-functional teams, including SAP functional teams, data architects, and DevOps engineers, you will ensure seamless integration of data solutions.

Expertise in SAP BW data modeling, ETL, reporting, Azure Data Factory (ADF), Azure Synapse Analytics, and other Azure data services is essential in this role. Proficiency in SQL, Python, or Spark for data processing and transformation is required, and experience with Azure Data Lake, Azure Blob Storage, and Synapse Analytics for enterprise-scale data warehousing is a must.
Preferred qualifications include experience with SAP BW/4HANA and its integration with Azure; knowledge of Databricks, Power BI, or other Azure analytics tools; and certification as an Azure Data Engineer Associate (DP-203) or in SAP BW. Experience in metadata management, data governance, and compliance in cloud environments is a plus. Your strong analytical and problem-solving skills, along with the ability to work in an agile environment, will contribute to the success of the migration projects.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

You are looking for candidates who are available to join immediately or within a maximum of 15 days.

In this role, as a Finance Data Governance consultant supporting Regulatory Reporting Automation, you will be responsible for deploying Governance Policy and ensuring the appropriate accountability model and processes for data asset management, metadata management, data quality, and issue resolution. You will provide insight into the root-cause analysis of data quality issues and assist in the remediation of audit and regulatory feedback. Additionally, you will recommend strategic improvements to the data assessment process and make necessary enhancements to data analytical and data quality tools. Your responsibilities will also include supporting current Regulatory Reporting needs via existing platforms by collaborating with upstream data providers, downstream business partners, and technology teams.

To excel in this role, you should have hands-on experience with Regulatory Reporting frameworks, especially around liquidity reporting (FR 2052a, LCR, NSFR) and capital reporting (FR Y-14, FR Y-9C). Knowledge of FR 2052a is highly preferred, as it is the MVP and goes live in October 2024. Strong relationship and communication skills are essential, as you will partner extensively with business process owners, technology teams, and Enterprise Data Governance. Being self-motivated and proactive, with the ability to manage multiple assignments and projects concurrently, is crucial. SQL skills are a strong plus, along with strong analytical and problem-solving abilities. You should also have demonstrated experience in collecting business requirements and using them to guide the implementation of technical solutions. Prior experience in defining and implementing data governance and data quality programs is preferred.
Additionally, familiarity with data governance, metadata, and data lineage tools such as Collibra and MANTA is strongly preferred, although not required if you have strong SQL skills. Knowledge of Agile or SAFe project methodologies would be beneficial for this role.

Posted 1 week ago

Apply

10.0 - 14.0 years

18 - 22 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, along with the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

1. Define and develop a Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure that implementation is based on every change request
e. Ensure that data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and least manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios, for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards/references/papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
   i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives
   ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
   iii. Provide technical leadership to the implementation of custom solutions through thoughtful use of modern technology
   iv. Define and understand current issues and problems and identify improvements
   v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
   vi. Understand the root-cause problem in integrating business and product units
   vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
   viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
   ix. Track industry and application trends and relate these to planning current and future IT needs

2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost effectiveness
c. Collect all structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all databases based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration team achieve better efficiency and client experience, including ease of use through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
   i. Support the pre-sales team in presenting the entire solution design and its principles to the client
   ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
   iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: Network Operations - Utilities
Experience: 10 years

Posted 1 week ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

Mumbai

Work from Office

- Development experience on OAS and OAC (DVCS and BICS); OBIA or FAW knowledge will be an added advantage
- Experience in lift & shift of OBIEE to OAC
- Excellent debugging and troubleshooting skills
- Experience in metadata management (RPD) and Analytics
- Good knowledge of OAC/OBIEE security
- Experience in customization and configuration of OBIA (preferably with Fusion SaaS Cloud), OBIEE, Dashboards, and Administration
- Experience interacting with business users to analyze business processes and gather requirements
- Experience sourcing data from Oracle EBS
- Experience in basic admin activities of OAC and OAS in Unix and Windows environments, such as server restarts
- Experience in configuration, troubleshooting, and tuning of OAC reports

Posted 1 week ago

Apply

5.0 - 10.0 years

16 - 27 Lacs

Gurugram

Remote

Role & responsibilities

As a Senior Data Engineer, you will be responsible for designing, building, and optimizing data pipelines and lakehouse architectures on AWS. You will ensure data availability, quality, lineage, and governance across analytical and operational platforms. Your expertise will enable scalable, secure, and cost-effective data solutions that power advanced analytics and business intelligence.

Responsibilities:
- Implement and manage S3 (raw, staging, curated zones), Glue Catalog, Lake Formation, and Iceberg/Hudi/Delta Lake for schema evolution and versioning.
- Develop PySpark jobs on Glue/EMR; enforce schema validation, partitioning, and scalable transformations.
- Build workflows using Step Functions, EventBridge, or Airflow (MWAA), with CI/CD deployments via CodePipeline and CodeBuild.
- Apply schema contracts and validations (Glue Schema Registry, Deequ, Great Expectations), and maintain lineage/metadata using the Glue Catalog or third-party tools (Atlan, OpenMetadata, Collibra).
- Enable Athena and Redshift Spectrum queries, manage operational stores (DynamoDB/Aurora), and integrate with OpenSearch for observability.
- Design efficient partitioning/bucketing strategies, adopt columnar formats (Parquet/ORC), and use spot instances and job bookmarking to control cost.
- Enforce IAM-based access policies; apply KMS encryption, private endpoints, and GDPR/PII data masking.
- Prepare Gold-layer KPIs for dashboards, forecasting, and customer insights with QuickSight, Superset, or Metabase.
- Partner with analysts, data scientists, and DevOps to enable seamless data consumption and delivery.

Preferred candidate profile:
- Hands-on expertise with the AWS data stack (S3, Glue, Lake Formation, Athena, Redshift, EMR, Lambda).
- Strong programming skills in PySpark and Python for ETL, scripting, and automation.
- Proficiency in SQL (CTEs, window functions, complex aggregations).
- Experience with data governance and quality frameworks (Deequ, Great Expectations).
- Knowledge of data modeling, partitioning strategies, and schema enforcement.
- Familiarity with BI integration (QuickSight, Superset, Metabase).

Benefits: This role offers the flexibility of working remotely in India.
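The SQL proficiency this role calls for (CTEs, window functions, complex aggregations) can be sketched with a small self-contained example. The table and column names here are illustrative only, not from any employer's schema; the example uses SQLite, which supports window functions from version 3.25 onward.

```python
import sqlite3

# Toy sales table to demonstrate the CTE + window-function pattern.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('north', '2024-01', 100), ('north', '2024-02', 150),
  ('south', '2024-01', 80),  ('south', '2024-02', 60);
""")

query = """
WITH monthly AS (                              -- CTE: aggregate first
    SELECT region, month, SUM(amount) AS total
    FROM sales
    GROUP BY region, month
)
SELECT region, month, total,
       -- window function: running total within each region
       SUM(total) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM monthly
ORDER BY region, month;
"""
for row in conn.execute(query):
    print(row)
```

The same CTE-then-window structure carries over directly to Athena or Redshift SQL, only the dialect details differ.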

Posted 1 week ago

Apply

0.0 years

0 Lacs

chennai

Work from Office

We are seeking a highly knowledgeable, detail-oriented, and passionate Music Data Analyst to join our dynamic content team. This pivotal role is responsible for ensuring the accuracy, richness, and consistency of our vast music metadata catalog, with a particular focus on Western music. The ideal candidate will possess an encyclopedic knowledge of artists, albums, tracks, and genres, coupled with exceptional web research capabilities and outstanding communication skills. You will be instrumental in maintaining the integrity of our music data, directly impacting user experience, content discoverability, and internal data-driven initiatives.

Key Responsibilities:
- Data Research & Enrichment: Conduct in-depth web research using authoritative sources to gather, verify, and enrich music-related data, including artist biographies, album details, track information, release dates, credits (composers, producers, engineers), lyrical content, and more.
- Metadata Quality Assurance: Meticulously review, audit, and correct existing music metadata to ensure accuracy, completeness, and adherence to internal style guides and industry standards.
- Content Categorization: Apply expert knowledge to accurately classify and tag music content by genre, sub-genre, mood, theme, instrumentation, lyrical content, and other relevant attributes.
- Discrepancy Resolution: Identify, investigate, and resolve discrepancies, inconsistencies, or gaps in music data, collaborating with internal teams as needed.
- Trend Monitoring: Stay abreast of new music releases, emerging artists, industry trends, and evolving music data standards to proactively enhance our catalog.
- Documentation & Guidelines: Contribute to the development and refinement of internal data guidelines, best practices, and research methodologies.

Required Qualifications:
- Exceptional Western Music Knowledge: A demonstrable, deep, and broad understanding of Western music across genres (e.g., Pop, Rock, Hip-Hop, R&B, Country, Electronic, Classical, Jazz, Folk), eras, artists, and albums, and their historical context. This is paramount.
- Superior Web Research Skills: Proven ability to efficiently and effectively find, evaluate, and synthesize information from diverse online sources (e.g., official artist websites, reputable music databases, industry news, academic resources).
- Excellent Language & Communication Skills: Written: impeccable grammar, spelling, and punctuation, with the ability to write clear, concise, and accurate data entries, reports, and internal communications. Verbal: strong ability to articulate complex information, ask clarifying questions, and collaborate effectively with team members.
- General Skills: Data accuracy, attention to detail, communication and language skills, problem-solving, analytical thinking, information management, research skills, and digital literacy.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Intermediate Programmer Analyst position involves actively participating, in collaboration with the Technology team, in the establishment and implementation of new or updated application systems and programs. Your role will focus on applications systems analysis and programming activities.

Your responsibilities will include:
- Developing and maintaining applications for complex enterprise data lineage, and optimizing industry-standard tools to simplify enterprise-level data complexity via data lineage.
- Debugging and resolving graph-related issues, and collaborating on the design and implementation of new features that simplify complex problems.
- Conducting code reviews for quality assurance, writing and maintaining documentation for functionality and APIs, and integrating and validating third-party libraries and tools.
- Managing source code with version control systems, implementing algorithms for code generation and optimization, and refactoring code for better maintainability and efficiency.
- Developing automated testing and verification of the code base, profiling and benchmarking performance on various platforms, and using static and dynamic analysis tools to enhance code quality.
- Staying current with advancements in data lineage technology, providing technical support to teams, analyzing performance metrics to identify areas for improvement, participating in design and architecture discussions, collaborating with cross-functional teams, researching new techniques and methodologies, and contributing to and engaging with open-source compiler projects.

Qualifications:
- Strong understanding of data lineage, metadata management, reference data development, and data analytics.
- Good knowledge of relational databases such as Oracle, and of SQL/PLSQL.
- Hands-on coding experience in one or more of Java, Python, or Ab Initio mHub, including prior tool-based configuration experience.
- Full software development lifecycle experience.
- Pragmatic problem-solving skills and the ability to work independently or as part of a team.
- 2+ years of non-internship professional software development experience.
- A passion for development, a strong work ethic, and continuous learning.
- Experience with code optimization techniques for different hardware architectures.

Education:
- Bachelor's degree/University degree or equivalent experience.

Preferred Qualifications:
- Bachelor's in Computer Science or a related field.
- Experience with relational databases (SQL/PLSQL, Oracle, etc.).
- Experience with code development, metadata management, reference data, and lineage tools, including developing data lineage using a tool or custom code.
- At least 4 years of Ab Initio Metadata Hub development experience.
- Engagement with open-source compiler projects.

Please note that this job description offers a high-level overview of the work performed. Other job-related duties may be assigned as required. If you are a person with a disability and require a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
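The data lineage work described above boils down to maintaining a directed graph of which datasets feed which. Below is a toy sketch of that idea in plain Python; table names are invented for illustration, and real lineage platforms (such as Ab Initio Metadata Hub) harvest these edges from job metadata rather than hand-coding them.

```python
from collections import deque

# Toy lineage graph: downstream table -> its direct upstream tables.
# Table names are hypothetical examples.
lineage = {
    "report_revenue": ["curated_sales"],
    "curated_sales": ["staging_sales", "staging_fx"],
    "staging_sales": ["raw_sales"],
    "staging_fx": ["raw_fx"],
}

def upstream_sources(table):
    """Return every table feeding `table`, directly or transitively (BFS)."""
    seen, queue = set(), deque(lineage.get(table, []))
    while queue:
        t = queue.popleft()
        if t not in seen:
            seen.add(t)
            queue.extend(lineage.get(t, []))
    return seen

print(sorted(upstream_sources("report_revenue")))
```

Impact analysis ("what breaks downstream if raw_sales changes?") is the same traversal over the reversed edges.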

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Enterprise Data Architect at Ramboll Tech, you will play a vital role in transforming data into a strategic asset, ensuring it is well-structured, governed, and effectively leveraged for business growth. Your responsibilities will include identifying, analyzing, and recommending how information assets drive business outcomes, as well as sharing consistent data throughout Ramboll. By joining our Technology & Data Architecture team, you will collaborate with Domain Enterprise Architects, Data Strategy, and Data Platform teams to shape the enterprise data layer. Additionally, you will partner with Innovation and Digital Transformation Directors to drive digitalization, innovation, and scaling of digital solutions across various business domains. Your focus will be on delivering value by developing data strategies, roadmaps, and solutions that directly address the challenges and opportunities within our business areas. You will design and implement modern data architectures using cutting-edge technologies and ensure alignment with business objectives. Furthermore, you will work on integrating disparate business systems and data sources to facilitate seamless data flow across the organization. In this role, you will play a crucial part in designing and developing data models that support business processes, analytics, and reporting requirements. Additionally, you will collaborate with cross-functional teams, including business stakeholders, data scientists, and data engineers, to understand data requirements and deliver solutions that meet business needs. Your expertise in data architecture, cloud platforms, data integration, and data modeling will be essential in driving our digital transformation journey. We are looking for a candidate with a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with at least 5 years of professional experience in data architecture. 
Experience with cloud platforms such as Microsoft Azure, GCP, or AWS, as well as a deep understanding of modern data stack components, is required. Strong skills in data modeling, ETL processes, and data integration are essential, along with experience in data governance practices. Your exceptional analytical and problem-solving skills, combined with your ability to design innovative solutions to complex data challenges, will be key to your success in this role. Effective communication and interpersonal skills will enable you to convey technical concepts to non-technical stakeholders and to build influence within a matrixed organization. By continuously evaluating and recommending new tools and technologies, you will help improve the efficiency and effectiveness of data engineering processes within Ramboll Tech. Join us in shaping a more sustainable future through data-centric principles and innovative solutions. Apply now to be part of our dynamic team at Ramboll Tech and make a meaningful impact on our digital transformation journey.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

maharashtra

On-site

As a Data Engineer with 7-10 years of experience, you will be responsible for architecting, creating, and maintaining data pipelines and ETL processes in AWS. Your role will involve supporting and optimizing the current desktop data tool set and Excel analysis pipeline as they migrate to a transformative, highly scalable cloud-based architecture. You will work in an agile environment within a collaborative, cross-functional product team using Scrum and Kanban methodologies.

Collaboration is key in this role: you will work closely with data science teams and business analysts to refine data requirements for various initiatives and data consumption needs. You will also educate and train colleagues such as data scientists, analysts, and stakeholders in data pipelining and preparation techniques, making it easier for them to integrate and consume data for their use cases.

Your expertise in programming languages such as Python, Spark, and SQL will be essential, along with prior experience with AWS services such as Lambda, Glue, Step Functions, CloudFormation, and the CDK. Knowledge of building bespoke ETL solutions, data modeling, and T-SQL for managing business data and reporting is also crucial. You should be capable of conducting technical deep-dives into code and architecture, and of designing, building, and managing data pipelines encompassing data transformation, data models, schemas, metadata, and workload management.

Furthermore, you will work with data science teams to refine and optimize data science and machine learning models and algorithms. Effective communication skills are essential to collaborate effectively across departments and to ensure compliance and governance during data use. You will be expected to work within, and promote, a DevOps culture and Continuous Delivery process to enhance efficiency and productivity.

This position offers the opportunity to be part of a dynamic team that aims to drive positive change through technology and innovation. Please note that this role is based in Mumbai, with the flexibility to work remotely from anywhere in India.
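The extract-transform-load pattern this role revolves around can be sketched minimally in plain Python. A production AWS pipeline would express the same steps as a Glue/PySpark job with a dead-letter path for rejects; the field names and sample records below are purely illustrative.

```python
# Minimal ETL "transform" step: validate and cast raw string records,
# routing failures to a reject list instead of failing the whole batch.
RAW_ROWS = [
    {"order_id": "1", "amount": "19.99", "region": "north"},
    {"order_id": "2", "amount": "oops", "region": "south"},   # bad record
    {"order_id": "3", "amount": "5.00", "region": "north"},
]

def transform(rows):
    """Cast raw rows to typed records; collect unparseable rows separately."""
    good, rejects = [], []
    for row in rows:
        try:
            good.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "region": row["region"],
            })
        except (KeyError, ValueError):
            rejects.append(row)
    return good, rejects

good, rejects = transform(RAW_ROWS)
print(len(good), "loaded,", len(rejects), "rejected")
```

Separating rejects from the happy path is what lets frameworks like Deequ or Great Expectations report data-quality metrics without blocking the load.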

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Architect at Astellas, your role involves close collaboration with study leads, research scientists, statisticians, clinicians, regulatory experts, and DigitalX professionals to establish and uphold robust data architecture frameworks that align with business objectives, regulatory mandates, and industry standards. Your expertise in Data Engineering, Data Modeling, and governance processes is pivotal in maintaining data integrity, security, and accessibility. By leveraging data, you play a strategic role in advancing the company's mission, driving scientific progress, improving patient outcomes, and introducing innovative therapies securely to the market. Your key responsibilities include managing data operations on AWS architecture, overseeing ETL processes, ensuring data quality, and automating data loads. You will collaborate with internal and external teams for data curation, cataloging, and metadata management. Proficiency in RWD assets and data sources like IQVIA, SYMPHONY, and OMICS sources is crucial. Enforcing data governance, information management policies, and maintaining data security are essential aspects of your role. Your proficiency in Python, Django framework, web technologies, RESTful APIs, and database management systems such as PostgreSQL or MySQL will be highly valuable. Moreover, you will identify areas for improvement in analytics tools, participate in concurrent projects, ensure optimal RWD Analytics environments, and drive process improvements initiatives. Collaborating with Advanced Analytics Solution teams, you will design and implement impactful solutions aligned with long-term strategies. Your problem-solving skills, Agile methodology experience, and excellent communication abilities will be key in successful collaboration with cross-functional teams and stakeholders. 
Required Qualifications:
- Bachelor of Science degree in Computer Science, Information Systems, Data Science, or a related field
- 5+ years of relevant experience in data architecture or engineering roles within the healthcare industry

Preferred Qualifications:
- Master of Science degree in Computer Science, Information Systems, Data Science, or a related field
- 3+ years of experience in the Life Sciences industry
- Expertise in ETL, data modeling, and data integration techniques
- Proficiency in programming languages such as Python, plus Django, web technologies, RESTful APIs, and database management
- Strong understanding of life sciences business processes and Agile methodology experience
- Excellent communication, interpersonal skills, and project management capabilities
- Relevant certifications in cloud computing and data engineering tools/platforms are advantageous

Category: Bold X

Astellas is dedicated to promoting equality of opportunity in all aspects of employment, including Disability/Protected Veterans.

Posted 1 week ago

Apply

4.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Oracle DBA with 4 to 9 years of experience, your primary responsibilities will include the installation and configuration of Oracle 10g and 11g. You should possess strong procedural skills to aid in the design, debugging, implementation, and maintenance of stored procedures, triggers, and user-defined functions within the DBMS.

Your role will also draw on your system analysis and design skills, data modeling expertise, and database design knowledge to contribute to the overall performance management and tuning of the database system. Additionally, you will be expected to conduct SQL code reviews and walk-throughs, and to engage in capacity planning, metadata management, and repository usage. It is essential that you have experience in data security practices and storage management techniques, and an understanding of related technologies such as Java (JDeveloper, J2EE, Apache) and Oracle Application Server. Strong communication skills are necessary in this role.

While not mandatory, Oracle DBA certifications would be beneficial. If you meet these requirements and are interested in this position based in Chennai, please send your resume to ram.prabhakaran@valuewingcoe.com.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You will be responsible for assessing data availability, data quality, and data readiness for new business initiatives in R&D, clinical trials, supply chain, and regulatory compliance within a leading European pharmaceutical company. As a Data Extraction & Analysis Specialist, you will work on multiple innovation initiatives and collaborate with business and IT teams to ensure data integrity, compliance, and quality standards are met.

Your key responsibilities will include assessing and extracting data from multiple systems, analyzing data quality and availability for various projects, ensuring pharma GxP compliance, performing data cleansing and transformation, and collaborating with stakeholders in regulatory, IT, and business units. You will be expected to have expertise in pharma GxP; strong hands-on experience with SQL, ETL tools, and cloud-based data lakes; and the ability to translate business requirements into data needs and assess data availability.

The ideal candidate will have at least 5 years of experience in data analysis, data governance, and pharma GxP compliance, with a focus on GMP, GCP, and GLP data management to support regulatory and compliance standards. Knowledge of regulatory frameworks such as EMA, FDA, HIPAA, and GDPR, as well as experience with AI/ML in pharma data analytics, will be advantageous. Strong collaboration, communication, and decision-support skills are essential for working with cross-functional teams and advising stakeholders on project feasibility.

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

gurugram

Work from Office

IMEA (India, Middle East, Africa), India, LIXIL INDIA PVT LTD, Employee Assignment, Not remote, Full Time, 31 August 2025

Lixil India is hiring a Leader, Accounting and Treasury, to be based at the Gurgaon location. This position will be responsible for managing the end-to-end processes of accounting, reporting, and audits.

Main Activities and Responsibilities:
- Handling banking operations and managing employees' travel claims; knowledge transfer and creation of back-ups.
- Ensure timely and accurate reporting of financial results to India management, with BSAR review on a monthly/quarterly basis.
- Ensure proper recording of AP invoices in the SAP system, in a timely manner and in compliance with GST/TDS, etc.
- Ensure proper accounting of the AR function, i.e., customer collections, credit notes, etc.
- MIS reporting at monthly and quarterly intervals in accordance with IFRS.
- Coordination with the bank for forward contracts, demand loans, and other day-to-day banking requirements.
- Ensure timely processing of travel claims and approvals in the Concur system.
- Successful resolution of all internal audit observations, and timely closure of external audits with no major significant observations.
- Knowledge transfer and effective delegation to the team to keep accounting and taxation processes running seamlessly; creating backups for different positions within the business unit.

Education and Experience: CA with 7-10 years of relevant experience.

Key Competencies: Creativity/innovation, problem solving/analysis, managing conflict, critical thinking, effective communication, decision making/judgment.

General Knowledge and Technical Skills: Good MS Excel knowledge; SAP system knowledge.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a proactive Senior Associate at Dashverse, you will play a crucial role in supporting content programming and operations. Your responsibilities will include executing programming strategies, ensuring timely content delivery, and optimizing performance to engage our audience effectively. Dashverse is an early-stage startup supported by esteemed investors like Z47, Peak XV, and Stellaris. Our mission is to create a new creative ecosystem with innovative products such as Frameo.ai, Dashreels, Dashtoon Studio, and Dashtoon, all driven by generative AI technology.

In this role, you will collaborate closely with content, marketing, and product teams to align programming strategies. You will be responsible for managing content metadata, rights, and contracts to ensure compliance. Tracking content performance and providing actionable insights will be a key part of your responsibilities, as will optimizing content schedules based on performance data and audience insights, and assisting with content scheduling to ensure timely delivery across the platform.

The ideal candidate will have proven experience in content programming or operations within OTT platforms or the digital content space. Familiarity with content management systems (CMS) and content workflows is essential. Strong organizational skills with keen attention to detail, an analytical mindset to interpret data and optimize content, knowledge of content rights management and compliance, and effective communication and collaboration skills across teams are all mandatory for this role.

Desirable skills include experience with content analytics and audience tracking, knowledge of metadata management and content delivery, an understanding of programming trends and audience behavior, and the ability to manage relationships with external content providers. A passion for entertainment and content curation will help you excel in this role.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

The role of a Data Governance Specialist at Hitachi Energy involves being a key enabler in shaping and operationalizing the enterprise-wide data governance framework. Your focus will be on the implementation and evolution of the Data Catalog, Metadata Management, and Data Compliance initiatives, ensuring that data assets are trusted, discoverable, and aligned with business value.

You will play a critical role in defining and maintaining the roadmap for the Enterprise Data Catalog and Data Supermarket. This includes configuring and executing the deployment of cataloging tools (metadata management, lineage, and glossary) while ensuring alignment with DAMA-DMBOK principles. Collaboration with Data Owners, Stewards, and Custodians will be essential to defining and enforcing data policies, standards, and the RACI model. Additionally, you will support the Data Governance Council and contribute to the development of governance artifacts such as roles, regulations, and KPIs.

Partnering with domain experts, you will drive data profiling, cleansing, and validation initiatives to ensure data quality and support remediation efforts across domains. Providing training and support to business users on catalog usage and governance practices will be part of your responsibilities, acting as a liaison between business and IT to ensure data needs are met and governance is embedded in operations. Staying current with industry trends and tool capabilities, such as Databricks and SAP MDG, you will propose enhancements to governance processes and tooling based on user feedback and analytics.

To qualify for this role, you should have a Bachelor's degree in information systems, data science, business informatics, or a related field, along with 1-3 years of experience in data governance, data management, or analytics roles. Familiarity with the DAMA-DMBOK2 framework and data governance tools is required, as are strong communication and collaboration skills for working across business and technical teams. Being proactive, solution-oriented, and eager to learn is important, and the ability to work autonomously and manage ambiguity is a distinct advantage. Preference will be given to candidates with CDMP certification.

Joining Hitachi Energy offers a purpose-driven role in a global energy leader committed to sustainability and digital transformation. You can expect mentorship and development opportunities within a diverse and inclusive team, work with cutting-edge technologies, and a culture that values integrity, curiosity, and collaboration, in line with Hitachi Energy's Leadership Pillars. Individuals with disabilities requiring accessibility assistance or accommodations in the job application process can request reasonable accommodations through the Hitachi Energy career site.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

As a Metadata Manager in the Data Management team at Crisil, you will be at the forefront of driving metadata management and data catalog implementation. Your role will be crucial in growing the OpenMetadata application, enhancing catalog content, and promoting user adoption. Your success will be gauged by the continuous expansion of catalog content, the enhancement of features and services, and the adoption rate among Crisil data users.

Your responsibilities will include designing, implementing, and maintaining a data catalog that offers a comprehensive view of the organization's data assets. This will involve close collaboration with data owners, data stewards, business users, and IT teams to ensure the accuracy, completeness, and ease of accessibility of the data catalog. Additionally, you will oversee metadata creation, maintenance, and governance, and conduct training sessions and create user guides to drive broad stakeholder adoption.

Key skills and attributes for success in this role include a strong curiosity and passion for data; the ability to build relationships and influence stakeholders; proficiency in data cataloging and metadata management; technical expertise in data cataloging tools, database technologies, SQL, and data modeling; effective communication skills to explain complex data concepts; experience in training team members; and a proactive approach to resolving data management challenges.

To qualify for this position, you should have a minimum of 8 years of experience in data management roles, including 3-5 years focused on data cataloging and metadata management. Familiarity with leading data catalog and metadata management platforms and tools, as well as industry standards and best practices in data management, will be beneficial.

Joining Crisil in this role will provide you with a unique opportunity to engage deeply with the company's data landscape and support business functions in managing their data effectively. You will work with diverse stakeholders and play a pioneering role in establishing data management processes within the organization.
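At its core, a data catalog entry of the kind this role maintains is structured metadata: a dataset's name, owner, description, and governance tags. The sketch below illustrates that shape in plain Python; the dataset names and tags are invented, and real platforms such as OpenMetadata model far richer metadata, lineage, and glossary terms than this.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """A toy catalog record: the minimum metadata a catalog tracks."""
    name: str
    owner: str
    description: str
    tags: set = field(default_factory=set)

# Hypothetical catalog contents for illustration.
catalog = [
    CatalogEntry("sales.orders", "finance", "One row per order", {"pii", "gold"}),
    CatalogEntry("hr.employees", "hr", "Employee master data", {"pii"}),
    CatalogEntry("ops.inventory", "ops", "Stock levels by site", {"silver"}),
]

def find_by_tag(entries, tag):
    """Return names of entries carrying a governance tag, e.g. 'pii'."""
    return [e.name for e in entries if tag in e.tags]

print(find_by_tag(catalog, "pii"))
```

Tag-based discovery like this is what lets stewards answer governance questions ("which datasets hold PII?") without inspecting each source system.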

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

gandhinagar, gujarat

On-site

As a Power BI Analyst, you will be responsible for designing and facilitating data analytics, data interpretation, and data visualization for the relevant division. Your role will involve integrating key MI data points into SAS/Power BI to ensure accurate and timely MI for business performance monitoring and customer reporting. This is an exciting opportunity to join a team that is currently designing and implementing its MI suite.

Your key responsibilities will include collaborating with IT and the wider business to evangelize effective data management practices and promote a better understanding of data and analytics. You will work closely with the Senior Data Steward, key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal analytics and data science solutions.

You will be trained to analyze and verify all financial data against new entities during the ingestion process, and to perform regular checks on financial reports to ensure their accuracy. Additionally, you will produce reporting and analysis of the current book to help the business review, identify, and monitor insurance policy placement for revenue optimization. Automation through effective metadata management will be a key focus: you will use innovative tools and techniques to automate data preparation and integration tasks. Collaboration across departments is essential, requiring strong collaboration skills to work with stakeholders across the organization to fulfil business MI objectives. Compliance with the FCA Conduct Rules, including acting with integrity and with due skill, care, and diligence, and observing proper standards of market conduct, is mandatory. You will also be assessed regularly to maintain fitness and propriety in line with PIB's requirements.

To qualify for this role, you should have expertise in computer science, statistics, analytics, applied mathematics, data management, or a related quantitative field. A strong background in analytics and interpretation, experience working in cross-functional teams, and previous experience in data analytics within a commercial, fast-paced environment are essential. Experience with data discovery, analytics, and BI software tools such as SAS, Power BI, Tableau, and QlikView is required, along with database programming skills in SQL. Strong Excel skills and knowledge of designing, building, and managing data pipelines using Azure Data Factory, Azure Data Lake Storage, and Azure Databricks are also necessary.

The ideal candidate is creative and collaborative, with strong interpersonal skills. You should be able to work effectively with both business and IT teams, and demonstrate good judgment, a sense of urgency, and a commitment to high ethical standards, regulatory compliance, customer service, and business integrity.

This is a full-time, permanent position with benefits including health insurance, paid sick time, paid time off, and performance bonuses. The work schedule is during the day shift from Monday to Friday with weekend availability, and the work location is in person.

Posted 2 weeks ago

Apply

4.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

We are seeking an experienced and strategic leader to join the Business Intelligence & Reporting organization as Deputy Director - BI Governance. In this role, you will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centers. Your deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement will be crucial to ensuring a standardized, scalable, and value-driven BI ecosystem across the enterprise.

Your key responsibilities will include defining and implementing the enterprise BI governance strategy, policies, and operating model. You will drive consistent governance processes across sectors and global capability centers, setting standards for the BI solution lifecycle, metadata management, report rationalization, and data access controls. Additionally, you will serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities.

Leading governance councils, working groups, and decision forums to drive adoption and compliance will be part of your role. You will establish and enforce policies on report publishing rights, tool usage, naming conventions, and version control, and implement approval and exception processes for BI development outside the COE. Moreover, you will lead the governance of BI demand intake and prioritization, ensuring transparency and traceability of BI requests and outcomes across business units. Defining KPIs and dashboards to monitor BI governance maturity and compliance, identifying areas for process optimization, and leading continuous improvement efforts will be essential.

Your qualifications should include 12+ years of experience in Business Intelligence, Data Governance, or related roles, with at least 4 years in a leadership capacity. Domain expertise in BI platforms, data management practices, and governance frameworks is crucial. A strategic mindset, operational excellence, and a Bachelor's degree are required; an MBA or a Master's in Data/Analytics is preferred.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies