455 Metadata Management Jobs - Page 10

JobPe aggregates listings for easy browsing, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As an AEM Developer with over 6 years of experience, you will work with AEM Assets, the digital asset management system integrated into Adobe Experience Manager (AEM). Your role involves storing, organizing, and managing digital assets such as images, videos, documents, and multimedia files within a centralized repository. AEM Assets streamlines content creation and delivery by offering robust asset management tools and enhancing collaboration among teams.

Key features of AEM Assets:
- Asset Management: Store, organize, and retrieve digital assets in a centralized repository that supports a wide range of file formats and integrates with Adobe Creative Cloud tools.
- Metadata Management: Attach metadata to assets, including descriptions, tags, and custom fields, for improved searchability and categorization.
- Version Control: Track changes made to assets over time, with the ability to restore previous versions if required.
- Dynamic Media: Create and deliver media in different formats and resolutions, optimized for various devices without manual edits.
- Smart Tagging: Use Adobe Sensei to automatically generate tags based on asset content, improving the accuracy and speed of asset discovery.
- Asset Collections and Folders: Group related assets into collections or folders for better organization and collaboration.
- Workflow Automation: Automate approval workflows, asset updates, and content delivery processes to improve efficiency and reduce manual tasks.
- Adobe Creative Cloud Integration: Manage and upload assets seamlessly from tools such as Photoshop, Illustrator, and InDesign.
- Role-based Access Control: Apply granular permissions governing who can view, edit, and publish assets.
- Content Delivery Network (CDN) Integration: Deliver digital assets globally, quickly and reliably, through a CDN.

Benefits of AEM Assets:
- Improved Collaboration: A single source of truth for digital assets reduces duplication and improves communication across teams.
- Enhanced Brand Consistency: Centralized asset management ensures only approved asset versions are used across channels.
- Faster Time to Market: Streamlined asset workflows and fewer manual processes get content to market faster.
- Scalability: AEM Assets can handle large volumes of digital assets, making it suitable for organizations with growing content needs.
- Better User Experience: Metadata, smart tagging, and search capabilities make assets quick to discover and use.

AEM Assets is used primarily by marketing teams, content creators, and developers who need to manage extensive libraries of digital assets and keep them organized, accessible, and optimized for diverse marketing and content delivery channels. Join Programming.com, a software solution and digital transformation partner with over 13 years of growth and a global presence. With a 100% successful delivery rate and a dedicated team of 2000+ employees, we offer Agile development services and collaborate with diverse clients across industries.
As an AEM Developer at Programming.com in Bengaluru, you will manage and maintain Adobe Experience Manager platforms, create and optimize digital content, implement web solutions, and collaborate with cross-functional teams to deliver exceptional user experiences. The role requires analytical and project management skills, a customer-service orientation, experience with AEM or a similar CMS, a strong understanding of web technologies, and the ability to work collaboratively in a team environment. A Bachelor's degree in Computer Science, Information Technology, or a related field is preferred.
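
For illustration, metadata updates of the kind described above can be scripted against the AEM Assets HTTP API. The sketch below is minimal and assumes a local author instance, default development credentials, and a simple properties payload; the exact payload shape and property names (e.g., dc:subject for tags) vary by AEM version and metadata schema, so treat them as assumptions.

```python
import requests

AEM_HOST = "http://localhost:4502"   # assumption: local AEM author instance
AUTH = ("admin", "admin")            # assumption: default dev credentials

def update_asset_metadata(asset_path: str, title: str,
                          description: str, tags: list[str]) -> None:
    """Attach descriptive metadata to an AEM asset via the Assets HTTP API."""
    payload = {
        "class": "asset",
        "properties": {
            "dc:title": title,
            "dc:description": description,
            "dc:subject": tags,   # assumption: tags stored as dc:subject
        },
    }
    resp = requests.put(
        f"{AEM_HOST}/api/assets/{asset_path.lstrip('/')}",
        json=payload,
        auth=AUTH,
    )
    resp.raise_for_status()

# Hypothetical asset and metadata values, for illustration only.
update_asset_metadata("marketing/banners/spring.png",
                      "Spring Campaign Banner",
                      "Hero banner for the spring launch",
                      ["campaign:spring", "format:banner"])
```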

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be joining HMH, a leading provider of drilling solutions known for a wide range of products and services designed to be the safest and most efficient in the industry. With a global presence and offices in 16 countries across five continents, HMH is continuously expanding its expertise into industries including subsea mining, geothermal, onshore and offshore construction, and offshore wind.

As an Oracle Financial Consolidation and Close Cloud Service (FCCS) Analyst at HMH, you will play a crucial role in supporting our enterprise financial systems, collaborating with Finance, Accounting, and IT teams to ensure efficient system performance and meet evolving business needs. The ideal candidate has strong expertise in Oracle FCCS and related EPM cloud tools, along with a solid understanding of financial close processes, intercompany eliminations, consolidation, reporting, and compliance requirements.

Key responsibilities include maintaining, supporting, and enhancing the Oracle FCCS application, covering metadata management, data integration, consolidation logic, and reporting. You will also handle user provisioning, participate in testing cycles for system upgrades and new features, and troubleshoot integrations using Data Management, Data Exchange, or Oracle Data Integration tools.

To excel in this role, you should have a strong understanding of EPM tools and methodologies, knowledge of metadata, mappings, and logic from ERP to EPM, and familiarity with foreign currencies and their impact on financials. Prior experience with FCCS, ARCS, and S/4HANA is a plus. Strong communication and interpersonal skills are essential for effective collaboration with cross-functional teams and stakeholders.

Qualifications include a Bachelor's or Master's degree from an accredited university or college in Information Technology, Finance, Business Management, or a related field, along with strong analytical and problem-solving skills, the ability to multitask effectively, and knowledge of ITSM, PMLC, and SDLC concepts. If you are looking for a challenging opportunity to innovate and contribute to the future of drilling solutions, HMH welcomes you to join our team.
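
As a worked illustration of why foreign currencies matter in consolidation (the "impact on financials" the posting mentions), the toy calculation below shows how translating balance-sheet, equity, and income items at different rates produces a cumulative translation adjustment (CTA). All figures and rates are hypothetical; real FCCS translation logic is configured in the application, not hand-coded.

```python
# Toy example (not FCCS internals): why currency translation creates a
# balancing difference (CTA). All figures and rates are hypothetical.
closing_rate = 83.2     # INR per USD at period end
historical_rate = 78.5  # INR per USD when equity was contributed
average_rate = 81.0     # average INR per USD over the period

assets_usd, liabilities_usd = 500.0, 300.0  # translated at closing rate
equity_usd = 150.0                          # translated at historical rate
net_income_usd = 50.0                       # translated at average rate

translated_net_assets = (assets_usd - liabilities_usd) * closing_rate
translated_equity = equity_usd * historical_rate + net_income_usd * average_rate

# The gap between the two is the CTA a consolidation system must book
# to keep the translated balance sheet in balance.
cta = translated_net_assets - translated_equity
print(f"CTA to record: {cta:,.1f} INR")   # 16,640 - 15,825 = 815.0 INR
```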

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Vadodara

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
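
For a flavor of the Snowflake performance work listed above, here is a minimal monitoring sketch using the snowflake-connector-python package. The account, credentials, and table names are placeholders; SYSTEM$CLUSTERING_INFORMATION is the built-in function Snowflake provides for inspecting how well a table's micro-partitions align with its clustering keys.

```python
# A minimal sketch of a Snowflake health check a data engineer might
# automate. Account, credentials, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # placeholder
    user="etl_user",         # placeholder
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)
cur = conn.cursor()

# Report clustering quality for a large fact table -- useful input
# when deciding whether re-clustering or a new key is worthwhile.
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('FCT_SALES', '(SALE_DATE)')")
print(cur.fetchone()[0])

# A simple freshness check on a staging table.
cur.execute("SELECT MAX(LOADED_AT) FROM STG_ORDERS")
print("Last load:", cur.fetchone()[0])

cur.close()
conn.close()
```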

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Agra

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Nagpur

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Jaipur

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Faridabad

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

12.0 - 17.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Overview
As Data Modelling Assoc Manager, you will be the key technical expert overseeing data modelling, driving a strong vision for how data modelling can proactively create a positive impact on the business. You'll be empowered to build and lead a strong team of data modelers who create data models for deployment in the Data Foundation layer, ingest data from various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data modelling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will independently analyze project data needs, identify data storage and integration needs and issues, and drive opportunities for data model reuse while satisfying project requirements. The role advocates Enterprise Architecture, Data Design, and D&A standards and best practices. You will be a key technical expert performing all aspects of data modelling, working closely with the Data Governance, Data Engineering, and Data Architecture teams, and providing technical guidance to junior members of the team as needed.

The primary responsibility of this role is to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy that supports future, unknown use cases with minimal rework. You'll work in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems, and establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Independently complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modelling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Advocate existing Enterprise Data Design standards; assist in establishing and documenting new standards.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security features implemented by data engineers to advance data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the data science team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- 12+ years of overall technology experience, including at least 6 years of data modelling and systems architecture.
- 6+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 6+ years of experience developing enterprise data models.
- 6+ years of cloud data engineering experience in at least one cloud (Azure, AWS, GCP).
- 6+ years of experience building solutions in the retail or supply chain space.
- Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models).
- Fluent with Azure cloud services; Azure certification is a plus.
- Experience scaling and managing a team of 5+ data modelers.
- Experience integrating multi-cloud services with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment and CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
- Excellent communication skills, both verbal and written, with the ability to influence and communicate confidently with senior-level management.
- Proven track record of leading, mentoring, hiring, and scaling data teams.
- Strong change manager: comfortable with change, especially change arising through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple competing projects and priorities simultaneously.
- Positive and flexible attitude, adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
- Fosters a team culture of accountability, communication, and self-management.
- Proactively drives impact and engagement while bringing others along.
- Consistently attains or exceeds individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.

Differentiating Competencies Required
- Ability to work with virtual teams (remote work locations); lead a team of technical resources (employees and contractors) based in multiple locations across geographies.
- Lead technical discussions, driving clarity on complex issues and requirements to build robust solutions.
- Strong communication skills to meet with the business, understand sometimes ambiguous needs, and translate them into clear, aligned requirements.
- Able to work independently with business partners to understand requirements quickly, perform analysis, and lead design review sessions.
- Highly influential, with the ability to educate challenging stakeholders on the role of data and its purpose in the business.
- Places the user at the centre of decision making.
- Teams up and collaborates for speed, agility, and innovation.
- Experience with, and embrace of, agile methodologies.
- Strong negotiation and decision-making skills.
- Experience managing and working with globally distributed teams.
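
As a small illustration of the data profiling this role's qualifications mention (the discipline that tools like Deequ and Great Expectations formalize), here is a plain-pandas sketch; the file name, column names, and rules are hypothetical.

```python
# A bare-bones profiling pass of the kind data quality tools formalize,
# shown in plain pandas for illustration. Names are hypothetical.
import pandas as pd

df = pd.read_csv("customer_master.csv")   # hypothetical source extract

# Per-column profile: type, null rate, cardinality.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": df.isna().mean().round(3),
    "distinct": df.nunique(),
})
print(profile)

# Example rule a data modeler might enforce before accepting a source:
# the natural key must be unique and non-null.
assert df["customer_id"].notna().all(), "customer_id contains nulls"
assert df["customer_id"].is_unique, "customer_id is not unique"
```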

Posted 1 month ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

Mumbai

Work from Office

Development experience on OAS and OAC (DVCS and BICS); OBIA or FAW knowledge will be an added advantage.
- Experience with lift-and-shift migration of OBIEE to OAC
- Excellent debugging and troubleshooting skills
- Experience in metadata management (RPD) and Analytics
- Good knowledge of OAC/OBIEE security
- Experience in customization and configuration of OBIA (preferably with Fusion SaaS Cloud), OBIEE, Dashboards, and Administration
- Experience interacting with business users to analyze business processes and gather requirements
- Experience sourcing data from Oracle EBS
- Experience with basic admin activities of OAC and OAS in Unix and Windows environments, such as server restarts
- Experience in configuration, troubleshooting, and tuning of OAC reports

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Quality Specialist, you will be responsible for monitoring and supporting the execution of Ataccama Data Quality rules and profiling jobs. Your role will include troubleshooting issues with data validation, anomaly detection, and scorecard generation. Additionally, you will perform patching and software upgrades and ensure compliance with the latest platform updates. Collaborating with business teams to resolve data integrity and governance-related incidents will be a key aspect of your responsibilities, and maintaining SLA commitments for incident resolution and data accuracy will be part of your day-to-day tasks. The ideal candidate will have 2-5 years of experience in data quality, governance, and metadata management; proficiency with the Ataccama ONE platform and knowledge of SQL for data validation are essential for this position.
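
To illustrate the SQL-for-data-validation skill the posting asks for: Ataccama rules are configured inside the platform itself, but the underlying checks often reduce to queries like the ones below, shown here against an in-memory SQLite table with made-up claim data.

```python
# Illustrative SQL validations of the kind data quality rules encode,
# run against an in-memory SQLite table with hypothetical claim data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id TEXT, amount REAL, service_date TEXT);
    INSERT INTO claims VALUES
        ('C1', 120.0, '2024-05-01'),
        ('C1', 120.0, '2024-05-01'),   -- duplicate record
        ('C2', NULL,  '2024-05-03');   -- missing amount
""")

checks = {
    "duplicate_claim_ids":
        "SELECT COUNT(*) FROM (SELECT claim_id FROM claims "
        "GROUP BY claim_id HAVING COUNT(*) > 1)",
    "null_amounts":
        "SELECT COUNT(*) FROM claims WHERE amount IS NULL",
}
for name, sql in checks.items():
    failures = conn.execute(sql).fetchone()[0]
    print(f"{name}: {failures} failing record(s)")
```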

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You are a seasoned Confluent & Oracle EBS Cloud Engineer with over 10 years of experience, responsible for leading the design and implementation of scalable, cloud-native data solutions. Your role involves modernizing enterprise data infrastructure, driving real-time data streaming initiatives, and migrating legacy ERP systems to AWS-based platforms.

Your key responsibilities include architecting and implementing cloud-based data platforms using AWS services such as Redshift, Glue, DMS, and Data Lake solutions. You will lead the migration of Oracle E-Business Suite or similar ERP systems to AWS while ensuring data integrity and performance, and design and drive the implementation of Confluent Kafka for real-time data streaming across enterprise systems. It is essential that you define and enforce data architecture standards, governance policies, and best practices. Collaborating with engineering, data, and business teams to align architecture with strategic goals is also a crucial aspect of your role, as is optimizing data pipelines and storage for scalability, reliability, and cost-efficiency.

To excel in this role, you must have 10+ years of experience in data architecture, cloud engineering, or enterprise systems design; deep expertise in AWS services, including Redshift, Glue, DMS, and Data Lake architectures; proven experience with Confluent Kafka for real-time data streaming and event-driven architectures; and hands-on experience migrating large-scale ERP systems (e.g., Oracle EBS) to cloud platforms. A strong understanding of data governance, security, and compliance in cloud environments, along with proficiency in designing scalable, fault-tolerant data systems, is also necessary. Preferred qualifications include experience with data modeling, metadata management, and lineage tracking; familiarity with infrastructure-as-code and CI/CD practices; and strong communication and leadership skills to guide cross-functional teams.
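
For illustration, real-time streaming of the kind described typically starts with a producer like the minimal confluent-kafka sketch below; the broker address, topic name, and event payload are placeholders.

```python
# A minimal Confluent Kafka producer sketch for real-time streaming of
# ERP events. Broker, topic, and payload are placeholders.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker1:9092"})  # placeholder

def delivery_report(err, msg):
    """Called once per message to report the delivery outcome."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

event = {"order_id": "SO-1001", "status": "BOOKED"}  # hypothetical ERP event
producer.produce(
    topic="erp.order.events",       # placeholder topic
    key=event["order_id"],
    value=json.dumps(event),
    callback=delivery_report,
)
producer.flush()   # block until outstanding messages are delivered
```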

Posted 1 month ago

Apply

8.0 - 12.0 years

10 - 20 Lacs

Gurugram

Work from Office

Job Summary: We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement scalable Snowflake-based data architectures.
- Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts.
- Optimize Snowflake performance through clustering, partitioning, and caching strategies.
- Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions.
- Ensure data quality, governance, integrity, and security across all platforms.
- Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake.
- Automate data workflows and support CI/CD deployment practices.
- Implement data modeling techniques including dimensional modeling, star/snowflake schemas, and normalization/denormalization.
- Support and promote metadata management and data governance best practices.

Technical Skills (Hard Skills):
- Expertise in Snowflake: architecture design, performance tuning, cost optimization.
- Strong proficiency in SQL and Python scripting for data engineering tasks.
- Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar.
- Proficient in data modeling (dimensional, relational, star/snowflake schema).
- Good knowledge of cloud platforms: AWS, Azure, or GCP.
- Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks.
- Experience with CI/CD tools and version control systems (e.g., Git).
- Knowledge of BI tools such as Tableau, Power BI, or Looker.

Certifications (Preferred/Required):
- Snowflake SnowPro Core Certification - required or highly preferred.
- SnowPro Advanced Architect Certification - preferred.
- Cloud certifications (e.g., AWS Certified Data Analytics - Specialty, Azure Data Engineer Associate) - preferred.
- ETL tool certifications (e.g., Talend, Matillion) - optional but a plus.

Soft Skills:
- Strong analytical and problem-solving capabilities.
- Excellent communication and collaboration skills.
- Ability to translate technical concepts into business-friendly language.
- Proactive, detail-oriented, and highly organized.
- Capable of multitasking in a fast-paced, dynamic environment.
- Passionate about continuous learning and adopting new technologies.

Why Join Us?
- Work on cutting-edge data platforms and cloud technologies.
- Collaborate with industry leaders in analytics and digital transformation.
- Be part of a data-first organization focused on innovation and impact.
- Enjoy a flexible, inclusive, and collaborative work culture.
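
Where the posting mentions orchestration tools such as Apache Airflow and dbt, a skeletal daily ELT DAG might look like the sketch below. It is written against Airflow 2.x; the task bodies are stubs and all IDs, commands, and targets are placeholders.

```python
# A skeletal Airflow DAG of the ELT shape the posting describes:
# ingest sources, then run dbt models. IDs and commands are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

with DAG(
    dag_id="snowflake_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:

    def ingest_sources():
        # Stub: pull from source systems into Snowflake staging tables.
        print("ingesting sources ...")

    ingest = PythonOperator(task_id="ingest", python_callable=ingest_sources)

    # dbt handles transformation; 'dbt build' runs models and tests together.
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --target prod",   # placeholder target
    )

    ingest >> transform
```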

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

At PwC, our managed services team focuses on providing outsourced solutions and support to clients across various functions. We help organizations streamline operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services.

As an Associate at PwC, you will work as part of a team of problem solvers, assisting in solving complex business issues from strategy to execution. Professional skills and responsibilities at this level include using feedback and reflection to develop self-awareness, demonstrating critical thinking, and bringing order to unstructured problems. You will be involved in ticket quality review, status reporting for projects, adherence to SLAs, incident management, change management, and problem management. Additionally, you will seek opportunities for exposure to different situations, environments, and perspectives, uphold the firm's code of ethics, demonstrate leadership capabilities, and work in a team environment that includes client interactions and cross-team collaboration.

Required Skills:
- AWS Cloud Engineer
- Minimum 2 years of hands-on experience in building advanced data warehousing solutions on leading cloud platforms
- Minimum 1-3 years of Operate/Managed Services/Production Support experience
- Extensive experience in developing scalable, repeatable, and secure data structures and pipelines
- Designing and implementing data pipelines for data ingestion, processing, and transformation in AWS
- Building efficient ETL/ELT processes using industry-leading tools like AWS, PySpark, SQL, Python, etc.
- Implementing data validation and cleansing procedures
- Monitoring and troubleshooting data pipelines
- Implementing and maintaining data security and privacy measures
- Strong communication, problem-solving, quantitative, and analytical abilities

Nice To Have:
- AWS certification

In our Managed Services platform, we deliver integrated services and solutions grounded in deep industry experience and powered by talent. Our team provides scalable solutions that add value to our clients' enterprise through technology and human-enabled experiences. We focus on empowering clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. As a member of our Data, Analytics & Insights Managed Service team, you will work on critical offerings, help desk support, enhancement, optimization work, and strategic roadmap and advisory-level work. Your contribution will be crucial in supporting customer engagements both technically and relationally.
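
As a small example of the ETL/ELT and data-cleansing work named in the required skills, here is a PySpark sketch; the S3 paths and column names are hypothetical.

```python
# A small PySpark cleansing step of the sort this role supports:
# de-duplicate, drop records missing keys, standardize a column.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cleanse_orders").getOrCreate()

raw = spark.read.parquet("s3://my-bucket/raw/orders/")   # placeholder path

clean = (
    raw.dropDuplicates(["order_id"])                 # remove exact re-deliveries
       .filter(F.col("order_id").isNotNull())        # enforce the natural key
       .withColumn("country",                        # standardize casing/spacing
                   F.upper(F.trim(F.col("country"))))
)

clean.write.mode("overwrite").parquet("s3://my-bucket/curated/orders/")
```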

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

We are seeking an experienced Data Governance Architect with specialized knowledge of Alation and Azure cloud platforms. In this role, you will collaborate with senior stakeholders to establish and advocate for an enterprise data catalog and dictionary strategy. Your responsibilities will encompass the complete data catalog lifecycle, from defining metadata standards and initial MVPs to executing large-scale enterprise rollouts.

To qualify for this position, you should have over 10 years of experience in data governance and demonstrate proficiency with the Alation tool on the Azure platform. Familiarity with the Snowflake platform is also required, along with expertise in at least two of the following areas: Data Governance Operating Models, Metadata Management, Data Cataloging, Data Lineage, or Data Quality. A deep understanding of governance frameworks like DAMA or DCAM, along with practical implementation experience, is crucial. You must possess strong capabilities in conducting maturity assessments and gap analyses and in delivering strategic roadmaps, as well as excellent communication skills for articulating complex topics clearly and producing precise documentation.

Key Responsibilities:
- Evaluate existing cataloging and dictionary capabilities, identify gaps, and create roadmaps to enhance metadata quality, speed up catalog population, and foster adoption.
- Recognize the various data personas using the data catalog and develop persona-specific playbooks to encourage adoption.
- Plan, implement, and supervise scalable data catalog and dictionary solutions using platforms such as Alation.
- Understand leading data governance tools such as Collibra and Purview.
- Supervise the entire data catalog lifecycle, including setting metadata standards, developing initial MVPs, and executing large-scale enterprise rollouts.
- Define architecture and best practices for metadata management to ensure catalog and dictionary consistency, scalability, and sustainability.
- Identify and categorize critical data elements by documenting clear business terms, glossaries, KPIs, lineage, and persona-specific guides to construct a reliable data dictionary.
- Establish and enforce policies to uphold metadata quality, regulate access, and safeguard sensitive information within the catalog.
- Implement robust processes for catalog population through automated metadata ingestion, API use, glossary management, lineage tracking, and data classification.
- Create a workflow management approach that notifies stewards of changes to certified catalog content.
- Develop reusable frameworks and templates for data definitions and best practices to streamline catalog adoption across teams.
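
Catalog population via "automated metadata ingestion" and APIs, as described above, usually comes down to scripts like the sketch below. The endpoint path, auth header, and payload shape here are assumptions for illustration only, not Alation's documented API contract; a real integration would follow the vendor's REST reference.

```python
# A generic, hedged sketch of pushing glossary terms into a data catalog
# over REST. Endpoint, header, and payload shape are assumptions.
import requests

CATALOG_URL = "https://catalog.example.com"   # placeholder host
HEADERS = {"TOKEN": "***"}                    # placeholder auth header

glossary_terms = [
    {"name": "Net Revenue",
     "definition": "Gross revenue less returns and discounts."},
    {"name": "Active Customer",
     "definition": "Customer with a purchase in the last 90 days."},
]

for term in glossary_terms:
    # Hypothetical endpoint; consult the catalog vendor's API docs.
    resp = requests.post(f"{CATALOG_URL}/api/terms",
                         json=term, headers=HEADERS)
    resp.raise_for_status()
    print(f"Registered term: {term['name']}")
```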

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

The role of Workday Adaptive Planning + Workday Finance requires an experienced professional with 5-8 years of experience, based anywhere in India (PAN India). As the ideal candidate, you will be responsible for managing the Workday Financials system, including maintenance, configuration of allocations, and integration testing and review with third-party systems. You will also maintain the structure of Workday Financials and Adaptive Planning, including the chart of accounts, organizational hierarchy, and calculation logic.

Your key responsibilities will include building and updating complex financial models within Adaptive Planning to support budgeting, forecasting, and scenario analysis, and training end users to use Workday Financials and Adaptive Planning features effectively. You will assist the team in maintaining metadata, business processes, security groups, and user-raised support tickets for both systems, adhering to established Service Level Agreements for support tickets and commitments. You will also work closely with the business to gather requirements, develop fit-gap analyses, provide training on new features, and make adoption recommendations for new or deprecated functionality from Workday Financials and Adaptive Planning releases and updates.

Creating customized dashboards and reports that draw on data from both systems to provide key insights to stakeholders is a vital part of the role. You will actively participate in implementations, upgrades, integration support, and enhancements of financial systems, and ensure timely submission of external auditor requests related to IT support of financial systems. Collaboration with finance teams, including FP&A and accounting, to understand their business needs and translate them into system configurations and reporting requirements is essential, as is general accounting knowledge of financial statements, system consolidation, varying ledger and reporting currencies, and complex intercompany transactions.

If this opportunity aligns with your skills and experience, please share your CV at Sneha.Gedam@ltimindtree.com.

Posted 1 month ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do

1. Define and develop a Data Architecture that aids the organization and clients in new and existing deals:
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets and protect the organization from disruptions while embracing innovation.
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy.
c. Create data strategy and road maps for the Reference Data Architecture as required by clients.
d. Engage all stakeholders to implement data governance models and ensure the implementation is carried out for every change request.
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure.
f. Develop, communicate, support, and monitor compliance with Data Modelling standards.
g. Oversee and monitor all frameworks for managing data across the organization.
h. Provide insights on database storage and platforms for ease of use and minimal manual work.
i. Collaborate with vendors to ensure integrity, objectives, and system configuration.
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization.
k. Present the data repository, objects, and source systems along with data scenarios for front-end and back-end usage.
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between current and future state, typically in sync with IT budgeting or other capital planning processes.
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view.
n. Oversee all data standards/references/papers for proper governance.
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata.
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control.
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction for managing the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives.
ii. Analyse the technology environment, enterprise specifics, and client requirements to set up a collaboration solution for big/small data.
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology.
iv. Define and understand current issues and problems and identify improvements.
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout.
vi. Understand the root-cause problems in integrating business and product units.
vii. Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view.
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements.
ix. Track industry and application trends and relate these to planning current and future IT needs.

2. Build the enterprise technology environment for data architecture management:
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes.
b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness.
c. Collect structural and non-structural data from different places and integrate it into one database form.
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports.
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices.
f. Implement security best practices across all databases based on accessibility and technology.
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG).
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration.

3. Enable delivery teams by providing optimal delivery solutions/frameworks:
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor.
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications.
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
d. Monitor system capabilities and performance by performing tests and configurations.
e. Integrate new solutions and troubleshoot previously occurring errors.
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to delivery teams.
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times.
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods.
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
l. Ensure architecture principles and standards are consistently applied to all projects.
m. Ensure optimal client engagement:
i. Support the pre-sales team when presenting the entire solution design and its principles to the client.
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met.
iii. Demonstrate thought leadership and strong technical capability in front of the client to win confidence and act as a trusted advisor.

Mandatory Skills: Data Governance.
Experience: 8-10 years.

Posted 1 month ago

Apply

2.0 - 4.0 years

6 - 12 Lacs

Hyderabad

Work from Office

We are seeking experienced Data Analysts / Data Engineers with strong expertise in U.S. pharmaceutical commercial datasets to support critical Data Operations initiatives. This role focuses on onboarding third-party data, ensuring data quality, and implementing outlier detection techniques. Familiarity with ML/AI approaches for anomaly detection is highly desirable.

Key Responsibilities:
- Pharma data integration: work extensively with U.S. pharmaceutical commercial datasets; ingest and onboard third-party data sources such as IQVIA, Symphony Health, and Komodo Health; ensure alignment of data schemas, dictionary mapping, and metadata integrity.
- Data quality & governance: design and implement QC protocols for data integrity and completeness; track data lineage and maintain proper documentation of data flows and transformations.
- Outlier detection & analytics: apply statistical or algorithmic techniques to identify anomalies in data related to sales, claims, or patient-level records; utilize ML/AI tools (if applicable) for automated outlier detection and trend analysis.
- Collaboration & reporting: work cross-functionally with business teams, data scientists, and IT to ensure timely delivery of reliable data; provide detailed reports and insights for stakeholders to support commercial decision-making.

Required Skills & Qualifications:
- 3+ years of experience in pharmaceutical Data Operations, preferably with U.S. market data.
- Strong hands-on experience with third-party commercial healthcare data sources (IQVIA, Symphony, Komodo, etc.).
- Solid understanding of ETL pipelines, data ingestion frameworks, and metadata management.
- Proficient in SQL, Python, or R for data processing and quality checks.
- Experience in outlier detection techniques, both statistical (Z-score, IQR, etc.) and ML-based (Isolation Forest, Autoencoders, etc.).
- Familiarity with Snowflake, Databricks, AWS, or similar cloud platforms is a plus.
- Excellent problem-solving, documentation, and communication skills.
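
For a concrete picture of the two outlier-detection families the posting names, here is a short sketch applying a statistical IQR rule and an ML-based Isolation Forest to a hypothetical weekly-sales series.

```python
# The two families of outlier checks named in the posting, side by side
# on a hypothetical weekly-sales series.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
sales = rng.normal(10_000, 1_500, size=104)   # two years of weekly sales
sales[[20, 75]] = [45_000, 150]               # injected anomalies

# Statistical: flag points outside 1.5 * IQR beyond the quartiles.
q1, q3 = np.percentile(sales, [25, 75])
iqr = q3 - q1
iqr_outliers = np.where((sales < q1 - 1.5 * iqr) |
                        (sales > q3 + 1.5 * iqr))[0]

# ML-based: Isolation Forest scores points by how easily they isolate;
# -1 marks predicted anomalies.
model = IsolationForest(contamination=0.02, random_state=0)
ml_outliers = np.where(model.fit_predict(sales.reshape(-1, 1)) == -1)[0]

print("IQR flags:", iqr_outliers)
print("IsolationForest flags:", ml_outliers)
```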

Posted 1 month ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do

1. Define and develop a Data Architecture that aids the organization and clients in new and existing deals:
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets and protect the organization from disruptions while embracing innovation.
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy.
c. Create data strategy and road maps for the Reference Data Architecture as required by clients.
d. Engage all stakeholders to implement data governance models and ensure the implementation is carried out for every change request.
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure.
f. Develop, communicate, support, and monitor compliance with Data Modelling standards.
g. Oversee and monitor all frameworks for managing data across the organization.
h. Provide insights on database storage and platforms for ease of use and minimal manual work.
i. Collaborate with vendors to ensure integrity, objectives, and system configuration.
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization.
k. Present the data repository, objects, and source systems along with data scenarios for front-end and back-end usage.
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between current and future state, typically in sync with IT budgeting or other capital planning processes.
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view.
n. Oversee all data standards/references/papers for proper governance.
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata.
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control.
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction for managing the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives.
ii. Analyse the technology environment, enterprise specifics, and client requirements to set up a collaboration solution for big/small data.
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology.
iv. Define and understand current issues and problems and identify improvements.
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout.
vi. Understand the root-cause problems in integrating business and product units.
vii. Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view.
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements.
ix. Track industry and application trends and relate these to planning current and future IT needs.

2. Build the enterprise technology environment for data architecture management:
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes.
b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness.
c. Collect structural and non-structural data from different places and integrate it into one database form.
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports.
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices.
f. Implement security best practices across all databases based on accessibility and technology.
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG).
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration.

3. Enable delivery teams by providing optimal delivery solutions/frameworks:
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor.
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications.
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
d. Monitor system capabilities and performance by performing tests and configurations.
e. Integrate new solutions and troubleshoot previously occurring errors.
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to delivery teams.
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times.
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods.
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
l. Ensure architecture principles and standards are consistently applied to all projects.
m. Ensure optimal client engagement:
i. Support the pre-sales team when presenting the entire solution design and its principles to the client.
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met.
iii. Demonstrate thought leadership and strong technical capability in front of the client to win confidence and act as a trusted advisor.

Mandatory Skills: AI Application Integration.
Experience: 10 years.

Posted 1 month ago

Apply

1.0 - 4.0 years

9 - 13 Lacs

Pune

Work from Office

Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers reference, market, and other critical datapoints to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation and is committed to providing self-serve tools to our internal customers.

Responsibilities
- Implement and maintain data catalogs: deploy and manage the Collibra data catalog tool to improve data discoverability and governance.
- Metadata and lineage management: automate metadata collection, establish data lineage, and maintain consistent data definitions across systems.
- Enable data governance: collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
- Support self-service and adoption: promote catalog usage across teams through training, documentation, and continuous support.
- Cross-team collaboration: work closely with data engineers, analysts, and stewards to align catalog content with business needs.
- Tooling and automation: build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health; leverage AI tools to automate cataloging activities.
- Reporting and documentation: maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.

Qualifications
- Self-motivated, collaborative individual with a passion for excellence.
- B.E. in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies.
- Good working knowledge of source control applications like git, with prior experience building deployment workflows using this tool.
- Good working knowledge of Snowflake, YAML, and Python.
- Tools: experience with data catalog platforms (e.g., Collibra, Alation, DataHub).
- Metadata and lineage: understanding of metadata management and data lineage.
- Scripting: proficient in SQL and Python for automation and integration.
- APIs and integration: ability to connect catalog tools with data sources using APIs.
- Cloud knowledge: familiar with cloud data services (Azure, GCP).
- Data governance: basic knowledge of data stewardship, classification, and compliance.
- Collaboration: strong communication skills to work across data and business teams.

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An actively nurtured environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: we are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Responsibilities: Development of workflows and connectors for the Collibra platform; administration and configuration of the Collibra platform.

Duties:
Collibra DGC administration and configuration
Collibra Connect administration and configuration
Development of Collibra workflows and MuleSoft connectors
Ingesting metadata from external sources into Collibra (sketched below)
Installation, upgrading, and administration of Collibra components
Setup, support, deployment, and migration of Collibra components
Implementing application changes: reviewing and deploying code packages, performing post-implementation verifications
Participating in group meetings (including with business partners) for problem solving, decision making, and implementation planning

Senior Collibra Developer - Mandatory Skills
Must-have skills:
Collibra Connect
Collibra DGC
Java
Advanced hands-on working knowledge of Unix/Linux
Advanced hands-on experience with UNIX scripting
SQL Server
Groovy

Nice to have:
Knowledge of and interest in data governance and/or metadata management
Working knowledge of Jira would be an asset
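To give a flavor of the metadata-ingestion duty above, here is a minimal sketch that registers a table as an asset through Collibra's REST API 2.0. The instance URL, credentials, and UUIDs are placeholders, and the exact request shape should be verified against the API documentation for your Collibra version.

    # Sketch: create a catalog asset in Collibra via its REST API 2.0.
    # The endpoint path follows Collibra's documented /rest/2.0/assets resource;
    # the URL, credentials, and UUIDs below are placeholders.
    import requests

    BASE_URL = "https://your-instance.collibra.com/rest/2.0"  # placeholder instance
    AUTH = ("svc_catalog", "***")                             # placeholder basic-auth user

    payload = {
        "name": "dbo.CUSTOMER",                               # asset name from the source system
        "domainId": "00000000-0000-0000-0000-000000000001",   # placeholder domain UUID
        "typeId": "00000000-0000-0000-0000-000000000031",     # placeholder asset-type UUID (e.g., Table)
    }

    resp = requests.post(f"{BASE_URL}/assets", json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    print("Created asset:", resp.json().get("id"))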

Posted 2 months ago

Apply

8.0 - 12.0 years

0 Lacs

vadodara, gujarat

On-site

The role aims to define and develop the Enterprise Data Structure, including Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and enhancing modelling standards and business information. You will be responsible for defining and developing Data Architecture to support the organization and clients in new and existing deals. This involves partnering with business leadership to provide strategic recommendations, creating data strategies and road maps, implementing data governance models, ensuring data storage technologies align with enterprise infrastructure, monitoring compliance with data modelling standards, and collaborating with various stakeholders to maximize the value of the data architecture.

Additionally, you will build the enterprise technology environment for data architecture management: developing standard patterns for data layers, data stores, data hubs and lakes, and data management processes; evaluating system implementations for cost-effectiveness; building conceptual and logical data models; implementing security best practices; and demonstrating strong experience in Master Data Management, Metadata Management, and Data Governance.

Furthermore, you will enable delivery teams by providing optimal delivery solutions and frameworks, maintaining relationships with key stakeholders, defining database physical structures and specifications, establishing relevant technical and business process metrics, monitoring system capabilities and performance, identifying and mitigating risks, ensuring quality assurance of architecture and design decisions, recommending tools for improved productivity, and supporting integration teams for better efficiency and client experience.

Wipro is seeking individuals who are inspired by reinvention and looking to evolve in their careers. The company is dedicated to digital transformation and welcomes applications from individuals with disabilities. If you are motivated by constant evolution and wish to be part of a purpose-driven organization, consider joining Wipro to realize your ambitions.
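One concrete slice of "monitoring compliance with data modelling standards" is an automated naming-convention check over physical table names. Below is a minimal sketch; the snake_case convention with layer prefixes is an assumed house rule, not a Wipro standard, and the sample names are invented.

    # Sketch: flag physical tables whose names break an assumed modelling standard
    # (snake_case with a layer prefix). The rule and the sample names are illustrative.
    import re

    NAMING_RULE = re.compile(r"^(stg|dim|fct|ref)_[a-z0-9_]+$")  # assumed house convention

    def check_names(table_names):
        """Return the table names that violate the modelling standard."""
        return [name for name in table_names if not NAMING_RULE.match(name)]

    # Usage: feed names harvested from any catalog or an INFORMATION_SCHEMA query.
    tables = ["dim_customer", "fct_sales", "CustomerTemp", "ref_country", "salesBackup2"]
    for bad in check_names(tables):
        print(f"Non-compliant table name: {bad}")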

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

You have a total of 4-6 years of development/design experience, with a minimum of 3 years in Big Data technologies on-prem and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. The role requires strong experience with data modeling and schema design, as well as extensive experience with data warehousing tools such as Snowflake, BigQuery, or Redshift and BI tools such as Tableau, QuickSight, or Power BI (hands-on experience with at least one is a must-have). You must also have experience with orchestration tools like Airflow and the transformation tool dbt. Your responsibilities will include implementing ETL/ELT processes and building data pipelines, along with workflow management, job scheduling, and monitoring. You should have a good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalogs, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required.

At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering a variety of career areas, roles, and work arrangements to help you achieve a healthy balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees. Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.
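As an illustration of the orchestration stack this listing names (Airflow scheduling a dbt transformation), here is a minimal DAG sketch. The file paths, schedule, and dbt project layout are assumptions for illustration only.

    # Sketch: an Airflow DAG that runs a dbt transformation after an extract step.
    # Paths, the schedule, and the dbt project location are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_warehouse_load",
        start_date=datetime(2024, 1, 1),
        schedule="0 2 * * *",          # nightly at 02:00
        catchup=False,
    ) as dag:
        extract = BashOperator(
            task_id="extract_raw",
            bash_command="python /opt/pipelines/extract_raw.py",       # placeholder extract script
        )
        transform = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/warehouse",   # placeholder dbt project
        )
        extract >> transform   # run transforms only after the extract succeeds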

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

At PwC, the infrastructure team focuses on designing and implementing secure IT systems that support business operations. The primary goal is to ensure the smooth functioning of networks, servers, and data centers to enhance performance and minimize downtime. In the infrastructure engineering role at PwC, you will create robust and scalable technology infrastructure solutions for clients, working on network architecture, server management, and cloud computing.

For this Data Modeler role, we are seeking candidates with a solid background in data modeling, metadata management, and data system optimization. Your responsibilities will include analyzing business requirements, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise for this role include:
- Analyzing and translating business needs into long-term data model solutions.
- Evaluating existing data systems and suggesting enhancements.
- Defining rules for translating and transforming data across various models.
- Collaborating with the development team to create conceptual data models and data flows.
- Establishing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility.
- Implementing data strategies and developing physical data models.
- Updating and optimizing local and metadata models.
- Utilizing canonical data modeling techniques to improve data system efficiency.
- Evaluating implemented data systems for variances, discrepancies, and efficiency.
- Troubleshooting and optimizing data systems for optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP); see the sketch after this listing.
- Effective use of data modeling tools such as Erwin, ER/Studio, Visio, or PowerDesigner.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Understanding of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications for this position include:
- A Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred skills:
- Experience with cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
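To make the dimensional-modeling expectation concrete, here is a minimal star-schema sketch (one fact table, two dimensions) declared with SQLAlchemy. The table and column names are invented for illustration, not a client model.

    # Sketch: a minimal star schema (dimensional model) declared with SQLAlchemy.
    # Table and column names are illustrative placeholders.
    from sqlalchemy import (
        Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table, create_engine,
    )

    metadata = MetaData()

    dim_date = Table(
        "dim_date", metadata,
        Column("date_key", Integer, primary_key=True),      # surrogate key, e.g. 20240131
        Column("calendar_date", Date, nullable=False),
        Column("fiscal_quarter", String(6)),
    )

    dim_customer = Table(
        "dim_customer", metadata,
        Column("customer_key", Integer, primary_key=True),  # surrogate key
        Column("customer_id", String(20), nullable=False),  # natural/business key
        Column("segment", String(30)),
    )

    fact_sales = Table(
        "fact_sales", metadata,
        Column("date_key", Integer, ForeignKey("dim_date.date_key"), nullable=False),
        Column("customer_key", Integer, ForeignKey("dim_customer.customer_key"), nullable=False),
        Column("quantity", Integer, nullable=False),
        Column("net_amount", Numeric(18, 2), nullable=False),
    )

    # Emit the physical DDL against an in-memory database to inspect it.
    metadata.create_all(create_engine("sqlite://"))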

Posted 2 months ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Data Analyst in the Solution Design team at Barclays, your primary responsibility will be to support the definition and design of technology and business solutions that align with organizational goals. This includes requirements gathering, data analysis, data architecture, system integration, and delivering scalable, high-quality designs that cater to both business and technical needs.

To excel in this role, you must have experience delivering large-scale changes in complex environments, leading requirements documentation, and facilitating workshops to gather, clarify, and communicate business needs effectively. Strong data analysis and data modeling skills will be crucial for performing data validations, detecting anomalies, and deriving insights from large volumes of data to support decision-making. Proficiency in advanced SQL for querying, joining, and transforming data is essential, along with experience in data visualization tools such as Tableau, Qlik, or Business Objects.

Furthermore, you should be an effective communicator capable of translating complex technical concepts into clear language for diverse audiences. Your ability to liaise between business stakeholders and technical teams, ensuring a mutual understanding of data interpretations, requirements definitions, and solution designs, will be key. Previous experience in banking and financial services, particularly in wholesale credit risk, and knowledge of implementing data governance standards will be advantageous. Additional skills that are highly valued include experience with Python data analysis and visualization tools, familiarity with external data vendors for integrating financials and third-party datasets, and exposure to wholesale credit risk IRB models and regulatory frameworks.

Your responsibilities will include investigating and analyzing data quality issues, executing data cleansing and transformation tasks, designing and building data pipelines, applying advanced analytical techniques such as machine learning and AI, and documenting data quality findings for improvement. You will also be expected to contribute to strategy, drive requirements, manage resources, and deliver continuous improvements in alignment with organizational goals.

As a Senior Data Analyst, you will be expected to demonstrate leadership behaviors that create an environment in which colleagues can thrive and deliver excellence. If the position includes leadership responsibilities, you will be required to set strategic direction, manage policies and processes, and drive continuous improvement. Additionally, you will advise key stakeholders, manage and mitigate risks, and collaborate with other areas of the organization to achieve business goals. All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.
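For a taste of the data-validation and anomaly-detection work the role describes, here is a minimal pandas sketch that flags out-of-range values with a simple interquartile-range (IQR) fence. The column names, sample data, and the 1.5x multiplier are illustrative assumptions, not a Barclays method.

    # Sketch: flag anomalous values in a numeric column with an IQR fence.
    # DataFrame contents, column names, and the 1.5x multiplier are illustrative.
    import pandas as pd

    df = pd.DataFrame({
        "account_id": ["A1", "A2", "A3", "A4", "A5"],
        "exposure":   [120.0, 135.0, 128.0, 5400.0, 131.0],   # one obvious outlier
    })

    q1, q3 = df["exposure"].quantile([0.25, 0.75])
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

    df["is_anomaly"] = ~df["exposure"].between(lower, upper)
    print(df[df["is_anomaly"]])   # rows to route into the data-quality review queue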

Posted 2 months ago

Apply

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Architect specializing in OLTP and OLAP systems, you will play a crucial role in designing, optimizing, and governing data models for both OLTP and OLAP environments. Your responsibilities will include architecting end-to-end data models across different layers, defining conceptual, logical, and physical data models, and collaborating closely with stakeholders to capture functional and performance requirements. You will need to optimize database structures for real-time and analytical workloads, enforce data governance, security, and compliance best practices, and enable schema versioning, lineage tracking, and change control. Additionally, you will review query plans and indexing strategies to enhance performance.

To excel in this role, you must possess a deep understanding of OLTP and OLAP systems architecture, along with proven experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB. Your expertise in database tuning, indexing, sharding, and normalization/denormalization will be critical, as will proficiency in data modeling tools like DBSchema, ERWin, or equivalent. Familiarity with schema evolution, partitioning, and metadata management is also required.

Experience in the BFSI or mutual fund domain, knowledge of near-real-time reporting and streaming analytics architectures, and familiarity with CI/CD for database model deployments are preferred skills that will set you apart. Strong communication, stakeholder management, strategic thinking, and the ability to mentor data modelers and engineers are essential soft skills for success in this position.

By joining our team, you will have the opportunity to own the core data architecture for a cloud-first enterprise, bridge business goals with robust data design, and work with modern data platforms and tools. If you are looking to make a significant impact in the field of data architecture, this role is perfect for you.
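As an example of the partitioning work this role calls out, here is a minimal sketch that creates a date-partitioned, clustered BigQuery table with the google-cloud-bigquery client. The project, dataset, schema, and clustering keys are placeholder choices for illustration.

    # Sketch: create a date-partitioned, clustered BigQuery table for analytical reads.
    # Project, dataset, table, and column names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-analytics-project")   # placeholder project

    table = bigquery.Table(
        "my-analytics-project.marts.fact_transactions",        # placeholder table id
        schema=[
            bigquery.SchemaField("txn_id", "STRING", mode="REQUIRED"),
            bigquery.SchemaField("account_id", "STRING", mode="REQUIRED"),
            bigquery.SchemaField("txn_date", "DATE", mode="REQUIRED"),
            bigquery.SchemaField("amount", "NUMERIC", mode="REQUIRED"),
        ],
    )
    # Partition by the business date so queries prune to only the days they touch...
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="txn_date"
    )
    # ...and cluster within each partition on the most common filter key.
    table.clustering_fields = ["account_id"]

    client.create_table(table, exists_ok=True)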

Posted 2 months ago

Apply