
455 Metadata Management Jobs - Page 5

Set up a Job Alert
JobPe aggregates job results for easy access; you apply directly on the original job portal.

2.0 - 7.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Educational Requirements: Master of Business Administration, Master of Commerce, Master of Engineering, Master of Technology, Master of Technology (Integrated), Bachelor of Business Administration, Bachelor of Commerce, Bachelor of Engineering, Bachelor of Technology, Bachelor of Technology (Integrated)

Service Line: Enterprise Package Application Services

Responsibilities: A day in the life of an Infoscion. As part of the Infosys consulting team, your primary role is to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research (literature surveys, information available in public domains, vendor evaluation information, etc.) and build POCs. You will create requirement specifications from business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products, diagnose the root cause of any issues, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives aimed at providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains.

Technical and Professional Requirements: At least 2 years of configuration and development experience implementing OFSAA solutions (such as ERM, EPM, etc.). Expertise in implementing OFSAA technical areas covering OFSAAI and its frameworks: Data Integrator, Metadata Management, and Data Modelling. Perform and understand data mapping from source systems to OFSAA staging; execute OFSAA batches and analyze result-area tables and derived entities. Perform data analysis using OFSAA metadata (i.e., Technical Metadata, Rule Metadata, Business Metadata), identify any data mapping gaps, and report them to stakeholders. Participate in requirements workshops, help implement the designed solution, perform testing (UT, SIT), and coordinate user acceptance testing. Knowledge of and experience with the full SDLC. Experience with Lean/Agile development methodologies.

Preferred Skills: Technology -> Oracle Industry Solutions -> Oracle Financial Services Analytical Applications (OFSAA)
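The source-to-staging mapping gap analysis this listing describes can be illustrated with a small sketch in plain Python. The column and staging-table names below are invented for illustration and are not actual OFSAA metadata.

```python
# Illustrative sketch: detect mapping gaps between a source extract and a
# staging layout. All names are hypothetical, not real OFSAA entities.

def find_mapping_gaps(source_columns, mapping):
    """Return source columns that have no staging target in the mapping."""
    return sorted(col for col in source_columns if col not in mapping)

# Hypothetical source-system columns and source-to-staging mapping.
source_columns = {"acct_no", "branch_cd", "open_dt", "risk_score"}
mapping = {
    "acct_no": "STG_ACCOUNT.V_ACCOUNT_NUMBER",
    "branch_cd": "STG_ACCOUNT.V_BRANCH_CODE",
    "open_dt": "STG_ACCOUNT.D_OPEN_DATE",
}

gaps = find_mapping_gaps(source_columns, mapping)
print(gaps)  # risk_score has no staging target and would be reported
```

In a real engagement the gap report would go to stakeholders for a mapping decision rather than being silently dropped.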

Posted 2 weeks ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Pune

Work from Office

Job Overview: This position is responsible for the design and implementation of scalable data management practices and stewardship protocols across core GTM and other operational data domains (accounts, contacts, opportunities, activities, etc.). The role requires both technical proficiency and strong leadership skills to drive business decisions through data.

About Us: When you join iCIMS, you join the team helping global companies transform business and the world through the power of talent. Our customers do amazing things: design rocket ships, create vaccines, deliver consumer goods globally, overnight, with a smile. As the Talent Cloud company, we empower these organizations to attract, engage, hire, and advance the right talent. We're passionate about helping companies build a diverse, winning workforce and about building our home team. We're dedicated to fostering an inclusive, purpose-driven, and innovative work environment where everyone belongs.

Responsibilities:
- Establish and Operationalize GTM Data Standards: Define and maintain critical data elements, ownership models, and classification structures aligned with operational and compliance needs. Partner with Legal, Compliance, and InfoSec to ensure alignment with overall corporate/enterprise governance programs and policies. This role also plays a key part in contributing to enterprise-wide data governance, ensuring GTM-level practices are aligned with broader corporate policies and grounded in real-world operational use.
- Execute Data Quality Strategy: Develop and manage a data quality framework to monitor completeness, consistency, timeliness, and accuracy of GTM and other operational data. Build dashboards and KPIs to track performance, surface issues, and support root cause investigations. Collaborate with RevOps, Product, and Engineering on remediation plans that reduce friction in pipeline, forecasting, and engagement flows.
- Advance Metadata and Lineage Visibility: Support development of a business glossary, metadata repository, and lineage mapping to ensure transparency across data workflows and enable better change impact analysis. Drive consistency and adoption of metadata standards across teams.
- Apply Agentic and Generative AI to Stewardship Workflows: Pilot AI solutions to automate repetitive tasks in data stewardship, including anomaly detection, issue triage, and documentation generation. Collaborate with stakeholders to identify where AI can enhance efficiency while maintaining control and compliance (e.g., with the EU AI Act).
- Support Risk Mitigation and Audit Readiness: Help maintain audit-ready documentation of data practices and control activities. Monitor risk exposure across GTM systems and SaaS tools, and support remediation with control owners. Ensure data retention, masking, and deletion policies are consistently enforced.
- Enable Cross-Functional Alignment on Data Standards: Work closely with Sales, Marketing, Product, RevOps, and Engineering teams to embed data practices into operational workflows and system designs. Support working groups by providing updates, surfacing gaps, and tracking adoption progress.
- Promote Data Literacy and Stewardship: Develop and deliver training, documentation, and support materials to help data users, analysts, and stewards understand data responsibilities and best practices. Drive a culture of transparency, accountability, and responsible data usage.
Qualifications:
- 8+ years of experience in data quality, data management, or GTM data operations
- Experience implementing structured data practices across enterprise systems
- Proficiency with SQL, data profiling, and tooling (e.g., Alation, Collibra, Informatica)
- Familiarity with privacy and compliance regulations (e.g., GDPR, CCPA, SOC 2, and ISO)
- Experience with metadata management and data lineage practices
- Exposure to agentic or generative AI in data stewardship or operations
- Strong communication skills and ability to influence across technical and business teams
- Demonstrated program management expertise, including creating project plans, organizing and leading stakeholder meetings, and providing status updates on key milestones
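The data quality framework this listing describes monitors completeness, consistency, timeliness, and accuracy. A minimal sketch of two such KPIs, with invented field names and sample GTM records, might look like:

```python
# Minimal sketch of data quality KPIs (completeness and validity) over GTM
# account records; field names, records, and rules are illustrative.

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def validity(records, field, predicate):
    """Share of records whose field value passes a validation rule."""
    valid = sum(1 for r in records if predicate(r.get(field)))
    return valid / len(records)

accounts = [
    {"name": "Acme", "country": "US"},
    {"name": "Globex", "country": ""},
    {"name": "Initech", "country": "DE"},
    {"name": "", "country": "FR"},
]

name_completeness = completeness(accounts, "name")
country_validity = validity(accounts, "country",
                            lambda v: isinstance(v, str) and len(v) == 2)
print(name_completeness, country_validity)  # 0.75 0.75
```

KPIs like these would feed the dashboards and root cause investigations mentioned above.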

Posted 2 weeks ago

Apply

8.0 - 10.0 years

10 - 15 Lacs

Mumbai, Pune, Chennai

Work from Office

- 8-10 years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, plus ETL and data ingestion protocols)
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required
- 2 to 3 years of experience doing data analysis and data testing
- Good working knowledge of SQL
- Good data analysis skills: querying, mapping, and documenting source-to-target mappings; understanding data schemas and modeling
- Good communication and documentation skills

Posted 2 weeks ago

Apply

4.0 - 12.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

We are looking for a Data Architect with a minimum of 12 years of experience, including at least 4 years in Azure data services. As a Data Architect, you will lead the design and implementation of large-scale data solutions on Microsoft Azure, leveraging your expertise in cloud data architecture, data engineering, and governance to create robust, secure, and scalable platforms.

Key Skills:
- Proficiency in Azure Data Factory, Synapse, Databricks, and Blob Storage
- Strong background in data modeling and lakehouse architecture
- Experience with SQL, Python, and Spark
- Knowledge of data governance, security, and metadata management
- Familiarity with CI/CD practices, infrastructure as code (ARM/Bicep/Terraform), and Git
- Excellent communication skills and the ability to collaborate effectively with stakeholders

Bonus Points For:
- Azure certifications such as Data Engineer or Architect
- Hands-on experience with Event Hubs, Stream Analytics, and Kafka
- Understanding of Microsoft Purview
- Industry experience in the healthcare, finance, or retail sectors

Join our team to drive innovation through data, shape architecture strategies, and work with cutting-edge Azure technologies. If you are ready to make a significant impact in the field of data architecture, apply now or reach out for more information at karthicc@nallas.com.
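As a rough illustration of the kind of lakehouse-layer transformation such an architect designs, here is a bronze-to-silver cleaning step sketched in plain Python rather than Spark; the field names, rules, and sample rows are invented for illustration.

```python
# Hedged sketch of a lakehouse-style bronze-to-silver transform (in practice
# this would run in Databricks/Spark); all field names are made up.

def to_silver(bronze_rows):
    """Drop rows missing required fields, deduplicate by id, normalize types."""
    required = ("id", "amount")
    seen, silver = set(), []
    for row in bronze_rows:
        if any(row.get(f) in (None, "") for f in required):
            continue                      # quarantine incomplete rows
        if row["id"] in seen:
            continue                      # keep first occurrence only
        seen.add(row["id"])
        silver.append({"id": row["id"], "amount": float(row["amount"])})
    return silver

bronze = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "10.5"},   # duplicate
    {"id": 2, "amount": None},     # incomplete
    {"id": 3, "amount": "7"},
]
silver = to_silver(bronze)
print(silver)
```

The same logic expressed as Spark transformations would scale out the dedup and type normalization across a cluster.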

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will play a crucial role in defining and developing the enterprise data structure, including the data warehouse, master data, integration, and transaction processing, while maintaining and strengthening modeling standards and business information. Your responsibilities will include:
- Partnering with business leadership to provide strategic recommendations for maximizing the value of data assets and protecting the organization from disruptions.
- Assessing the benefits and risks of data using business capability models to create a data-centric view aligned with the defined business strategy.
- Creating data strategies and roadmaps for the reference data architecture as per client requirements.
- Engaging stakeholders to implement data governance models and ensure compliance with data modeling standards.
- Overseeing frameworks to manage data across the organization and collaborating with vendors to maintain system integrity.
- Developing data migration plans and ensuring an end-to-end view of all data service provider platforms.
- Promoting common semantics and proper metadata use throughout the organization.
- Providing solutions for client RFPs and ensuring implementation assurance.
- Building an enterprise technology environment for data architecture management by implementing standard patterns for data layers, data stores, and data management processes.
- Developing logical data models for analytics and operational structures in accordance with industry best practices.
- Enabling delivery teams by providing optimal delivery solutions and frameworks, monitoring system capabilities, and identifying technical risks.
- Ensuring quality assurance of architecture and design decisions, recommending tools for improved productivity, and supporting integration teams for better efficiency.
- Supporting the pre-sales team in presenting solution designs to clients and demonstrating thought leadership to act as a trusted advisor.

Join Wipro to be part of an end-to-end digital transformation partner with bold ambitions and constant evolution. Realize your ambitions and design your reinvention in a purpose-driven environment. Applications from people with disabilities are explicitly welcome.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As the Data Governance Manager, reporting into the Enterprise Data Enablement function within Operations at the London Stock Exchange Group (LSEG), you will lead and manage people, processes, and technology to efficiently implement and operationalize the Enterprise Data Governance Program. Your responsibilities include developing and implementing a policy framework to enhance data management capabilities, enabling LSEG to deliver improved products and services.

Your primary focus will be to act as a subject matter expert in data governance, institutionalize and implement the data governance framework, and drive adoption across LSEG. You will build and refine data governance capabilities while ensuring alignment with strategic priorities and developing data governance as a service for businesses and corporate functions.

Key Responsibilities:
- Act as a manager and SME on data governance to improve efficiencies, reduce risks, identify business opportunities, and support the LSEG mission
- Drive the enterprise data governance adoption roadmap and develop the data governance operating model
- Serve as a business partner to promote the adoption of data governance best practices
- Define and implement policies, standards, and governance metrics while supporting related tools
- Liaise between divisions and supporting services to ensure clear communication of data-related business requirements
- Chair relevant forums and implement data governance principles across the organization
- Support compliance with operational regulatory requirements and contribute to the development of data management capabilities
- Partner with other leaders to satisfy business priorities and capture use cases and requirements
- Develop internal content to train the business in embracing data governance principles
- Manage staff at various levels and review the current data policy framework
- Support change management projects and provide consultation for data literacy, education, and training activities
- Support the metadata strategy and provide input into Key Performance Indicators (KPIs) and metrics related to data governance capabilities

Key Behaviors and Skills Required:
- Minimum of 7-10 years of experience in a services organization with data or project management experience
- In-depth knowledge of data governance best practices, ISO standards, and relevant governing bodies
- Experience collaborating on data governance initiatives and working with risk management, controls, data architecture, and technology solutions
- Bachelor's degree in a relevant field and relevant certifications

Joining LSEG means being part of a global financial markets infrastructure and data provider committed to driving financial stability and sustainable growth. You will work in a collaborative and creative culture that values individuality and encourages new ideas. LSEG offers tailored benefits and support, including healthcare, retirement planning, and wellbeing initiatives.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a Manager of Data Governance at Curriculum Associates, you will play a crucial role in establishing, implementing, and scaling a data governance framework within our Data & Insights Center of Excellence (CoE) to drive real business impact. Your responsibilities will combine hands-on execution with strategic leadership to ensure that governance policies and processes are integrated into data operations across the company. Your expertise in leading governance programs and influencing stakeholders will be essential in driving adoption and success. Operating in an agile environment, you will collaborate closely with data engineering, analytics, and business teams in the U.S. and India. Your success will be measured by the impact of governance on data usability, trust, and accessibility, ultimately enhancing decision-making and operational efficiency.

Key responsibilities include:
- Collaborating with stakeholders to develop and operationalize data governance policies, standards, processes, and best practices aligned with business goals.
- Implementing data stewardship processes integrated into agile workflows to support metadata management and data quality management.
- Establishing a data stewardship framework to empower teams to take ownership of data quality.
- Engaging in data governance council discussions to ensure alignment across global teams.
- Implementing and maintaining data governance tools such as Collibra, Alation, and Informatica.
- Defining and tracking success metrics that measure governance's impact on end users, including improved data accessibility and decision-making.
- Managing workflows for data issue resolution, change management, and continuous improvement.
- Mentoring data stewards and ensuring consistency in data governance practices across the organization.
- Partnering with data engineering to implement governance policies in data pipelines and platforms.
- Supporting data analytics and BI teams to ensure governed data is trusted, accessible, and actionable.

Required job skills:
- Proven track record of executing and scaling data governance programs in agile environments.
- Strong expertise in data quality, metadata management, MDM, regulatory compliance, and best practices.
- Ability to drive adoption of governance practices, measure impact, and lead change management initiatives.
- Demonstrated experience in documenting business requirements, business glossaries, data cataloging, data mapping, and governance frameworks.
- Ability to identify business process and data management gaps and drive improvements.
- Experience collaborating with data engineering, analytics, infrastructure, and business teams in an agile setup.
- Ability to balance execution and leadership responsibilities while maintaining focus on delivering results.

Minimum qualifications:
- Bachelor's degree in a relevant field such as Business Administration, Computer Science, or Data Analytics.
- 10+ years of experience in data management or enterprise data teams.
- 7+ years of experience with data governance, data management, or enterprise data strategy.
- 5+ years of experience leading cross-functional collaboration and communicating with stakeholders.
- Experience with data governance tools such as Collibra, Alation, Ataccama, and Informatica.
- Industry certifications (e.g., CDMP/DAMA DMBOK) are preferred.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Delhi

On-site

You will be joining Viraaj HR Solutions, a dynamic and innovative HR consultancy committed to connecting talented professionals with reputable organizations. The company's mission is to empower businesses by providing them with the right talent to drive success and foster growth. At Viraaj HR Solutions, values such as integrity, teamwork, and excellence shape a culture that promotes creativity, professional development, and work-life balance.

As a Snowflake Developer, you will design, develop, and implement data solutions using the Snowflake platform. Your responsibilities include optimizing Snowflake databases and schemas for efficient data access, developing data integration workflows using ETL tools, and maintaining data models to support business analytics. You will collaborate with data scientists and analysts to address data-related issues, monitor Snowflake usage, and ensure data quality and integrity across all data stores. Documenting data architecture and design processes, tuning system performance, and troubleshooting the Snowflake environment are also part of your duties. In addition, you will integrate Snowflake with cloud-based services and tools as needed, participate in code reviews, stay up to date on Snowflake features and best practices, and train team members on Snowflake capabilities and tools. Working closely with stakeholders to gather requirements and define project scope is essential for success in this role.

To qualify, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field and have a minimum of 3 years of experience in data engineering or database development. Proficiency with the Snowflake platform, SQL, ETL tools, and data warehousing concepts is required; Snowflake certification is an advantage. You should also possess strong analytical, problem-solving, communication, and team collaboration skills, along with experience in data modeling, metadata management, and troubleshooting data-related issues. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud, and knowledge of data governance practices, are desirable, as are the ability to work independently, manage multiple tasks effectively, and adapt to new technologies.
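As an illustration of the ELT upsert pattern common in Snowflake development, here is a hedged sketch that renders a MERGE statement as text. The table and column names are hypothetical, and generating the SQL as a string keeps the example runnable without a Snowflake connection.

```python
# Illustrative helper that renders a Snowflake-style MERGE statement for an
# ELT upsert from a staging table into a dimension; names are invented.

def build_merge(target, staging, key, columns):
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    insert_cols = ", ".join([key] + columns)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

sql = build_merge("DIM_CUSTOMER", "STG_CUSTOMER", "CUSTOMER_ID",
                  ["NAME", "SEGMENT"])
print(sql)
```

In practice the statement would be executed through a Snowflake session (for example via a connector or an orchestration tool) rather than printed.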

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Quality professional, your role involves defining and measuring data quality metrics to ensure accuracy, completeness, validity, consistency, timeliness, and reliability. You will continuously monitor and remediate data quality issues by conducting audits and root cause analysis and by implementing preventive measures. You will also develop and maintain comprehensive data profiles to understand data characteristics, create validation rules for incoming data, and design data cleansing processes to correct errors and inconsistencies, enhancing overall data quality and reliability.

In terms of data governance, you will establish a framework to enforce data governance practices, ensuring compliance with regulatory requirements and corporate policies. This includes developing and maintaining a metadata repository that documents data definitions, lineage, and usage, as well as collaborating with business users to understand their data needs and requirements. You will continuously seek opportunities for process improvement in data governance and quality management. Understanding user roles and access requirements for source systems is crucial for implementing similar protections in analytical solutions, including row-level security that restricts data access to authorized users.

To drive continuous improvement, you will define naming conventions for business-friendly table and column names, create synonyms to simplify data access, and establish KPIs for tracking data governance and quality efforts. Regular reporting to stakeholders will demonstrate the value of data governance initiatives. Lastly, you will establish a feedback loop through which users can report data quality issues and suggest improvements, fostering a culture of continuous improvement in data quality and governance processes.
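Validation rules for incoming data, as described in this listing, can be sketched as a small rule table in Python. The rules and record fields below are invented for illustration.

```python
# Sketch of incoming-data validation: each rule is a named predicate, and a
# record's violations feed the remediation/feedback loop. Rules are invented.

import re

RULES = {
    "email_format": lambda r: re.fullmatch(
        r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email") or "") is not None,
    "amount_nonnegative": lambda r: isinstance(
        r.get("amount"), (int, float)) and r["amount"] >= 0,
}

def validate(record):
    """Return the names of rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

issues = validate({"email": "not-an-email", "amount": -5})
print(issues)  # both rules fail for this record
```

Violation lists like this can be aggregated into the completeness/validity KPIs and audit reports the role is responsible for.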

Posted 2 weeks ago

Apply

2.0 - 5.0 years

1 - 1 Lacs

Chennai

Work from Office

The Technical Expert, Content Solutions is responsible for ensuring the stability, scalability, and continuous improvement of our enterprise content ecosystem, built on a modern CCMS platform. Candidates with Adobe Experience Manager experience are preferred. The role blends hands-on technical execution with architectural thinking, collaborating across content, IT, and product teams to deliver multichannel, high-quality, and secure content solutions that power KONE's customer and employee experiences.

Key Responsibilities

Platform & Ecosystem Management:
- Continuously verify the health and performance of the content ecosystem.
- Manage and execute platform upgrade activities in collaboration with Adobe Customer Success Engineers (CSE).
- Perform daily troubleshooting related to content errors, renditions, and output generation.
- Plan and execute content migration activities, ensuring data integrity and minimal downtime.

Integration & Development:
- Develop and conceptualize integrations between the CCMS and other KONE enterprise systems.
- Work with CI/CD pipelines for deployment, configuration updates, and automated processes.
- Design, update, and maintain stylesheets using DITA Open Toolkit (DITA-OT), XSLT, and CSS.

Content Architecture & Quality:
- Apply and recommend modular authoring principles to enable structured, reusable, and channel-agnostic content.
- Ensure technical quality assurance of modular content, including validation rules, metadata, and taxonomy consistency.
- Support and optimize multichannel publishing processes.

Security & Compliance:
- Incorporate cybersecurity principles and best practices into content solution designs.
- Ensure integration and support for Single Sign-On (SSO) and secure authentication.

Requirements:
- Strong experience with CCMS platform management, preferably Adobe Experience Manager (AEM).
- In-depth knowledge of modular authoring principles.
- Expertise in DITA and DITA Open Toolkit configuration and publishing workflows.
- Experience with HTML, Java, or related web technologies.
- Software development experience, including CI/CD pipelines (e.g., Jenkins, GitLab CI).
- Proficiency in XSLT and CSS for template and stylesheet development.
- Understanding of multichannel publishing architecture and processes.
- Experience in technical quality assurance of modular content.
- Understanding of cybersecurity principles and experience working with SSO solutions.
- Strong troubleshooting, analytical, and problem-solving skills.
- Excellent collaboration and communication skills for working with cross-functional teams.
- Bachelor's degree in engineering or a relevant field.
- Experience with content migrations at enterprise scale.
- Familiarity with content taxonomy design, metadata management, and tagging strategies.
- Knowledge of automation scripts and tools to optimize content operations.

At KONE, we are focused on creating an innovative and collaborative working culture where we value the contribution of each individual. Employee engagement is a key focus area, and we encourage participation and the sharing of information and ideas. Sustainability is an integral part of our culture and daily practice. We follow ethical business practices and seek to develop a culture of working together, where co-workers trust and respect each other and good performance is recognized. As a great place to work, we are proud to offer a range of experiences and opportunities that will help you achieve your career and personal goals and enable you to live a healthy and balanced life. Read more at www.kone.com/careers
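A technical QA check on modular content, such as verifying that each DITA topic carries an id and required metadata before publishing, might be sketched like this. The sample topic and the required-metadata list are assumptions; a real pipeline would validate against the DITA DTD or schema as well.

```python
# Hedged sketch of a modular-content QA check on a DITA-like topic; the
# sample XML and required metadata names are invented for illustration.

import xml.etree.ElementTree as ET

SAMPLE_TOPIC = """
<topic id="install_kit">
  <title>Installing the kit</title>
  <prolog>
    <metadata>
      <othermeta name="audience" content="technician"/>
    </metadata>
  </prolog>
</topic>
"""

def qa_topic(xml_text, required_meta=("audience",)):
    """Return a list of QA problems found in one topic."""
    root = ET.fromstring(xml_text)
    problems = []
    if not root.get("id"):
        problems.append("missing topic id")
    names = {m.get("name") for m in root.iter("othermeta")}
    for needed in required_meta:
        if needed not in names:
            problems.append(f"missing metadata: {needed}")
    return problems

print(qa_topic(SAMPLE_TOPIC))  # no problems found
```

Checks like this can run in a CI/CD pipeline ahead of DITA-OT publishing so that invalid topics never reach the output channels.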

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Gurugram, Bengaluru

Hybrid

Dear candidate, we are hiring immediately for a Data Governance role at one of the top MNCs.

Role: Data Governance Developer
Primary Skills: Data governance, metadata management, Linux, MySQL, Collibra
Notice Period: Immediate to 15 days
Location: Gurugram, Bangalore
Employment Type: Full time
Work Mode: Hybrid

Roles and Responsibilities:
- 3+ years of IT industry experience working in data governance, data engineering, or data architecture.
- Experience working in production-grade environments.
- Collibra Learning Path certifications are a plus.
- Working experience with data governance, metadata management, and data catalog solutions, specifically Collibra tools.
- Must have: hands-on experience with Linux, as well as experience with relational and non-relational databases/data sources (MySQL, PostgreSQL).
- Experience troubleshooting web-based applications.
- Experience with Java and REST APIs.
- Excellent knowledge of certificates and authentication: SSL, SSO, and LDAP.
- Knowledge of the Collibra operating model, workflow BPMN development, and how to integrate various applications or systems with Collibra.
- Excellent problem-solving, analytical, and written and verbal communication skills.
- Knowledge of SaaS solutions and containerized applications is a plus.
- Must-have skills: good hands-on experience and a core understanding of Collibra products (Data Governance, Data Catalog, Data Intelligence Platform).

Posted 2 weeks ago

Apply

10.0 - 15.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About the Role: We are looking for a Data Warehouse Engineer with strong expertise across the Azure data platform to design, build, and maintain modern data warehouse and analytics solutions. This role requires hands-on experience with Azure Synapse Analytics, Data Factory, Data Lake, Azure Analysis Services, and Power BI. The ideal candidate will ensure seamless data ingestion, storage, transformation, analysis, and visualization, enabling the business to make data-driven decisions.

Key Responsibilities

Data Ingestion & Orchestration:
- 10-15 years of experience in designing and building scalable ingestion pipelines using Azure Data Factory.
- Integrate data from multiple sources (SQL Server, relational databases, Azure SQL DB, Cosmos DB, Table Storage).
- Manage batch and real-time ingestion into Azure Data Lake Storage.

Data Storage & Modelling:
- Develop and optimize data warehouse solutions in Azure Synapse Analytics.
- Implement robust ETL/ELT processes to ensure data quality and consistency.
- Create data models for analytical and reporting needs.

Data Analysis & Security:
- Build semantic data models using Azure Analysis Services for enterprise reporting.
- Collaborate with BI teams to deliver well-structured datasets for reporting in Power BI.
- Implement Azure Active Directory for authentication, access control, and security best practices.

Visualization & Business Support:
- Support business teams in building insightful Power BI dashboards and reports.
- Translate business requirements into scalable and optimized BI solutions.
- Provide data-driven insights in a clear, business-friendly manner.

Optimization & Governance:
- Monitor system performance and optimize pipelines for efficiency and cost control.
- Establish standards for data governance, data quality, and metadata management.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Proven experience as a Data Warehouse Engineer / Data Engineer with strong expertise in Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Storage, Azure Analysis Services, Azure SQL Database / SQL Server, and Power BI (reporting and dashboarding).
- Strong proficiency in SQL and data modelling (star schema, snowflake schema, dimensional modelling).
- Knowledge of Azure Active Directory for authentication and role-based access control.
- Excellent problem-solving skills and the ability to optimize large-scale data solutions.
- Strong communication skills to collaborate effectively with both technical and business stakeholders.
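The star-schema modeling this role calls for can be illustrated with an in-memory SQLite stand-in for the warehouse tables; the fact and dimension names and data are invented for illustration.

```python
# Minimal star-schema sketch: a fact table joined to a dimension table and
# aggregated by a dimension attribute. SQLite stands in for Synapse here.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (customer_key INTEGER, amount REAL);
INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 25.0);
""")

# A typical dimensional query: aggregate the fact table by a dimension attribute.
rows = con.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('APAC', 25.0), ('EMEA', 150.0)]
```

The same shape (narrow fact table keyed to descriptive dimensions) is what a semantic model in Azure Analysis Services or a Power BI dataset would be built on.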

Posted 2 weeks ago

Apply

9.0 - 13.0 years

0 Lacs

Pune, Maharashtra

On-site

For over 30 years, Beghou Consulting has been a trusted adviser to life science firms. We combine our strategic consulting services with proprietary technology to develop custom, data-driven solutions that allow life sciences companies to take their commercial operations to new heights. We are dedicated to client service and offer a full suite of consulting and technology services, all rooted in advanced analytics, to enhance commercial operations and boost sales performance. Your key responsibilities include leading the delivery of data management and analytics solutions for clients in the life sciences industry. You will be involved in managing complex, multi-disciplinary projects, fostering strong client relationships, and aligning business needs with advanced technology capabilities. You will serve as a strategic advisor, collaborating with global teams to design scalable solutions across BI, AI, and MDM. If you are passionate about leveraging your expertise in U.S. pharmaceutical datasets to drive client success, we invite you to apply for this exciting opportunity! Engage in direct client interactions, managing multiple projects simultaneously with a focus on client satisfaction and timely delivery. Develop a deep understanding of clients" business challenges and provide strategic leadership in addressing critical issues. Proactively leverage Beghou's expertise to deliver impactful solutions. Collaborate with onshore and offshore teams to estimate pricing, plan resource capacity, coordinate the delivery of end-to-end technology solutions (BI Reporting, Analytics, AI, Big Data, MDM), report project status, and serve as the primary escalation point for clients. Work closely with global counterparts to plan and execute project activities, identify risks, and implement mitigation strategies. Ensure robust and scalable technical designs that align with Beghou and industry standards, while balancing architectural requirements with client-specific needs. 
Lead multiple projects, providing direction and maintaining a high level of service quality. Introduce innovative ideas related to emerging technology trends and continuous improvement opportunities to enhance client capabilities. Identify and drive opportunities to expand client engagements. Prepare and deliver compelling, strategic presentations to senior leadership. Monitor and analyze the latest technology and industry trends to shape Beghou's point of view and design forward-thinking solutions. Act as a trusted advisor and single point of contact for clients, ensuring alignment between business needs and Beghou's solutions and services. Serve as a bridge between business and technical teams by translating business drivers into actionable technology solutions (e.g., systems architecture, data warehousing, reporting). Coach cross-functional teams on business objectives and strategic priorities to inform key project decisions. Mentor, develop, and motivate team members. Lead by example.

Bachelor's or Master's degree from a Tier-I/Tier-II institution with a strong academic record. 9+ years of relevant consulting experience in U.S. Healthcare & Life Sciences, delivering medium-to-large scale Data Warehouse (DW) and Master Data Management (MDM) solutions. 5+ years of hands-on experience in designing and implementing DW and MDM solutions using tools such as Informatica, Databricks, Reltio, etc. Strong understanding of U.S. pharmaceutical datasets and their applications. Proven success in delivering data-driven commercial strategies for life sciences clients, particularly in analytics, CRM, and data management. Strong understanding of data management principles, including data modeling, data quality, metadata management, and best practices within the pharmaceutical industry. Solid experience with cloud-based data management platforms (e.g., AWS, Azure, Snowflake). Demonstrated ability to design and deliver scalable solutions on cloud platforms.
Proficiency in ETL design and development, along with experience using OLAP tools to support business applications. Strong multitasking abilities with a proven track record of managing multiple concurrent projects effectively. Experience in developing thought leadership or contributing to firm intellectual capital through innovative problem-solving and technology adoption. Extensive knowledge of big data and cloud technologies. Proven ability to work collaboratively in a global, matrixed environment and engage effectively with U.S.-based clients.

We treat our employees with respect and appreciation, not only for what they do but for who they are. We value the many talents and abilities of our employees and promote a supportive, collaborative, and dynamic work environment that encourages both professional and personal growth. You will have the opportunity to work with and learn from all levels in the organization, allowing everyone to work together to develop, achieve, and succeed with every project. We have had steady growth throughout our history because the people we hire are committed not only to delivering quality results for our clients but also to becoming leaders in sales and marketing analytics.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AI Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Role: AI Solution Architect
Experience: 7+ Years
Notice Period: 30-60 Days

Project Overview: We are seeking a seasoned AI Architect with strong experience in Generative AI and Large Language Models (LLMs), including OpenAI, Claude, and Gemini, to lead the design, orchestration, and deployment of intelligent solutions across complex use cases. You will architect conversational systems, feedback loops, and LLM pipelines with robust data governance, leveraging the Databricks platform and Unity Catalog for enterprise-scale scalability, lineage, and compliance.

Role Scope / Deliverables:
- Architect end-to-end LLM solutions for chatbot applications, semantic search, summarization, and domain-specific assistants.
- Design modular, scalable LLM workflows including prompt orchestration, RAG (retrieval-augmented generation), vector store integration, and real-time inference pipelines.
- Leverage Databricks Unity Catalog for:
  - Centralized governance of AI training and inference datasets
  - Managing metadata, lineage, access controls, and audit trails
  - Cataloging feature tables, vector embeddings, and model artifacts
- Collaborate with data engineers and platform teams to ingest, transform, and catalog datasets used for fine-tuning and prompt optimization.
- Integrate feedback loop systems (e.g., user input, signal-driven reinforcement, RLHF) to continuously refine LLM performance.
- Optimize model performance, latency, and cost using a combination of fine-tuning, prompt engineering, model selection, and token usage management.
- Oversee secure deployment of models in production, including access control, auditability, and compliance alignment via Unity Catalog.
- Guide teams on data quality, discoverability, and responsible AI practices in LLM usage.

Key Skills:
- 7+ years in AI/ML solution architecture, with 2+ years focused on LLMs and Generative AI.
- Strong experience working with OpenAI (GPT-4/o), Claude, Gemini, and integrating LLM APIs into enterprise systems.
- Proficiency in Databricks, including Unity Catalog, Delta Lake, MLflow, and cluster orchestration.
- Deep understanding of data governance, metadata management, and data lineage in large-scale environments.
- Hands-on experience with chatbot frameworks, LLM orchestration tools (LangChain, LlamaIndex), and vector databases (e.g., FAISS, Weaviate, Pinecone).
- Strong Python development skills, including notebooks, REST APIs, and LLM orchestration pipelines.
- Ability to map business problems to AI solutions, with strong architectural thinking and stakeholder communication.
- Familiarity with feedback loops and continuous learning patterns (e.g., RLHF, user scoring, prompt iteration).
- Experience deploying models in cloud-native and hybrid environments (AWS, Azure, or GCP).

Preferred Qualifications:
- Experience fine-tuning or optimizing open-source LLMs (e.g., LLaMA, Mistral) with tools like LoRA/QLoRA.
- Knowledge of compliance requirements (HIPAA, GDPR, SOC2) in AI systems.
- Prior work building secure, governed LLM applications in highly regulated industries.
- Background in data cataloging, enterprise metadata management, or ML model registries.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies.
Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
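The RAG workflow this role centers on can be illustrated with a minimal, self-contained sketch. To keep it runnable without external services, it fakes embeddings with bag-of-words term counts and cosine similarity instead of a real embedding model and vector database (FAISS, Weaviate, Pinecone), retrieves the top-k documents, and assembles a grounded prompt; the toy corpus and all names are invented for illustration, not taken from any NTT DATA system.

```python
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding': a term-frequency Counter."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    """Rank corpus documents against the query and keep the top k."""
    q = embed(query)
    scored = sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)
    return scored[:k]

def build_prompt(query, context_docs):
    """Assemble the grounded prompt that would be sent to the LLM."""
    context = "\n".join(f"- {d}" for d in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Unity Catalog centralizes governance of tables and model artifacts.",
    "FAISS stores vector embeddings for similarity search.",
    "MLflow tracks experiments and registers models.",
]
docs = retrieve("Where are vector embeddings stored?", corpus)
prompt = build_prompt("Where are vector embeddings stored?", docs)
```

In a production pipeline the `embed` and `retrieve` steps would be replaced by calls to an embedding model and a vector index, but the retrieve-then-ground shape of the prompt stays the same.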

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

chandigarh

On-site

As an integral part of Oceaneering's operations since 2003, Oceaneering's India Center caters to a wide range of business needs, including oil and gas field infrastructure, subsea robotics, and automated material handling & logistics. Our multidisciplinary team provides diverse solutions in Subsea Engineering, Robotics, Automation, Control Systems, Software Development, Asset Integrity Management, Inspection, ROV operations, Field Network Management, Graphics Design & Animation, and more. Apart from technical functions, Oceaneering India Center oversees crucial business functions like Finance, Supply Chain Management (SCM), Information Technology (IT), Human Resources (HR), and Health, Safety & Environment (HSE). Our state-of-the-art infrastructure in India, comprising modern offices, cutting-edge tools and software, well-equipped labs, and scenic campuses, reflects our commitment to the future of work. Oceaneering fosters a flexible, transparent, and collaborative work culture globally, promoting great team synergy. At Oceaneering India Center, we pride ourselves on Solving the Unsolvable by leveraging the diverse expertise within our team and shaping the future of technology and engineering solutions on a global scale.

Position Summary: You will play a key role in managing and enhancing data governance processes and standards within the organization. Collaborating closely with various departments, your responsibilities will include ensuring accurate, consistent, secure, and responsible use of data.

Duties And Responsibilities:
- Document data governance activities, such as data policies, procedures, and guidelines.
- Implement and enforce data governance policies, procedures, and standards.
- Maintain the data governance framework in alignment with regulatory, legal, and business requirements.
- Create and update data dictionaries, glossaries, and catalogs to ensure data standardization.
- Collaborate with data owners and stewards to define data quality requirements.

Qualifications:
- Bachelor's degree in Data Management, Information Systems, Computer Science, Statistics, or a related field.

Experience:
- 2-4 years of experience in data-related roles, including internships, coursework, or projects in data governance, data management, or data analysis.
- Familiarity with regulatory standards like GDPR and CCPA is advantageous but not mandatory.

Knowledge, Skills, Abilities, And Other Characteristics:
- Familiarity with data governance principles and practices.
- Strong analytical skills with a focus on detail and accuracy.
- Knowledge of data management tools and platforms like Excel, SQL, data catalogs, or data governance tools is beneficial.
- Excellent communication and interpersonal skills for effective collaboration.
- Interest in data privacy, compliance, and risk management.
- Ability to handle confidential information with integrity.

Preferred Qualifications:
- Willingness to learn and adapt to new technologies and frameworks.
- Strong organizational and problem-solving abilities.
- Capability to work independently and as part of a team in a fast-paced environment.

Oceaneering is committed to providing equal employment opportunities to all applicants. Employees seeking job opportunities are encouraged to apply through the PeopleSoft or Oceanet portals and discuss their interest with their current manager/supervisor. Additionally, we prioritize learning and development opportunities for employees to reach their potential and drive their future growth. Our ethos of internal promotion offers long-term career advancement opportunities across countries and continents, supporting employees who demonstrate the ability, drive, and ambition to take charge of their future.
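One duty listed above, maintaining data dictionaries that enforce standardization, can be made concrete with a small sketch. This is an illustrative example only, not an Oceaneering system: the `DATA_DICTIONARY` mapping, field names, and rules are hypothetical stand-ins for real catalog entries.

```python
# Hypothetical data dictionary: field name -> (expected type, required?)
DATA_DICTIONARY = {
    "partner_id": (str, True),
    "country":    (str, True),
    "revenue":    (float, False),
}

def validate_record(record, dictionary=DATA_DICTIONARY):
    """Return a list of dictionary violations for one record."""
    issues = []
    for field, (ftype, required) in dictionary.items():
        if field not in record or record[field] is None:
            if required:
                issues.append(f"{field}: missing required field")
            continue
        if not isinstance(record[field], ftype):
            issues.append(f"{field}: expected {ftype.__name__}")
    for field in record:
        if field not in dictionary:
            issues.append(f"{field}: not in data dictionary")
    return issues

good = {"partner_id": "P-001", "country": "IN", "revenue": 12.5}
bad  = {"partner_id": "P-002", "revenue": "12.5", "legacy_code": "X"}
```

In practice a governance tool would hold the dictionary and run such checks continuously; the point is that a documented dictionary is executable, not just descriptive.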

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As the Director of Enterprise Data Management at TriNet, you will play a crucial role in strengthening the impact of business data across the organization. You will be responsible for identifying, defining, and analyzing data assets that align with the company's strategies and business outcomes. Your expertise will drive the company's data architecture and execution strategy, enabling efforts in analytics, AI, ML, and data science.

Your primary responsibilities will include influencing data strategy by collaborating with business unit leadership, enhancing data applications for efficient access and consumption, and recognizing the value derived from data and analytics. You will lead efforts in data and analytics governance, business data modeling, and improving business performance through enterprise data solutions.

To excel in this role, you should have at least 10 years of experience in Data Management, Data Architecture, Enterprise Architecture, and/or Information Systems. A Bachelor's Degree in Computer Science or a related field is preferred, along with Certification of Competency in Business Analysis (CCBATM). Strong organizational, planning, and communication skills are essential, along with a solid understanding of data administration and modern database technologies.

You will work in a clean, pleasant, and comfortable office setting, as this position is 100% in-office. TriNet encourages applicants who may not meet every single requirement to apply, as the company values diversity and strives to hire the most qualified candidates for each role. If you are passionate about driving innovation and making a significant impact on the large SMB market, this role at TriNet may be the perfect opportunity for you. Join us in powering our clients' business success with extraordinary HR and contribute to our mission of delivering outstanding results for small and medium-sized customers.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As an AI Data Analyst at Carrier Global Corporation, a global leader in intelligent climate and energy solutions, you will be responsible for analyzing and interpreting complex datasets to provide actionable insights that support business objectives in areas such as Supply Chain, Finance, or Operations. With over 5 years of experience in data management principles, data engineering, and modeling, you will play a key role in transforming raw data into actionable insights, ensuring data quality, and supporting data-driven decision-making across the organization. You will collaborate with data engineering teams to design and optimize ETL processes, build and maintain data models using industry-standard data modeling tools, develop and manage data catalogs using data catalog tools, and contribute to the design and implementation of data marketplaces and data products. Working within an Agile framework, you will participate in sprint planning, stand-ups, and retrospectives to deliver iterative data solutions and partner with business stakeholders to understand specific processes and translate requirements into data-driven solutions. To be successful in this role, you should have a Bachelor's degree in Data Science, Computer Science, Statistics, Business Analytics, or a related field, along with 5+ years of experience as a Data Analyst or in a similar role. You should have strong experience in data management principles, proficiency in data engineering concepts, practical experience in developing and maintaining data catalogs, knowledge of data marketplaces and data products, and experience working in an Agile framework. Strong analytical skills, excellent communication skills, and the ability to work with cross-functional teams are essential for this role. Preferred qualifications include experience with cloud platforms, familiarity with BI tools for data visualization, and exposure to advanced analytics techniques. 
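The ETL-with-quality-gates work described above can be sketched in miniature: extract raw rows, apply simple transformation rules that split clean rows from rejects, and load the survivors into a keyed store standing in for a warehouse. This is a generic illustration under invented data, not Carrier's actual pipeline or tooling.

```python
import csv, io

RAW = """order_id,region,amount
1001,APAC,250.0
1002,,75.5
1003,EMEA,not_a_number
"""

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: enforce simple quality rules, split good from rejected."""
    good, rejected = [], []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])
        except ValueError:
            rejected.append((row, "amount is not numeric"))
            continue
        if not row["region"]:
            rejected.append((row, "region is empty"))
            continue
        good.append(row)
    return good, rejected

def load(rows):
    """Load: index cleaned rows by primary key (stand-in for a warehouse)."""
    return {row["order_id"]: row for row in rows}

good, rejected = transform(extract(RAW))
warehouse = load(good)
```

Recording rejects alongside a reason, rather than silently dropping them, is what makes the quality gate auditable later.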
At Carrier Global Corporation, we are committed to providing a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. Join us and make a difference by embracing The Carrier Way.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

As the Subject Matter Expert for Master Data (Business Partner) at Fresenius, you will be responsible for defining, managing, and optimizing the organization's master data frameworks related to business partner data. This includes ensuring the integrity, accuracy, and consistency of master data across systems by driving best practices in data management and governance. Your role will focus on regulatory compliance and operational excellence while collaborating with cross-functional teams to design scalable processes, support seamless data integration, and enable efficient decision-making through high-quality business partner data. Your tasks will include establishing and implementing governance frameworks specific to business partner master data, defining the structure, standards, and taxonomy for capturing, storing, and maintaining business partner data across enterprise applications, and collaborating with various teams to define and maintain data dictionaries, metadata, and data lineage documentation. You will also monitor and manage data quality initiatives, lead periodic data quality reviews, and maintain oversight of sensitive business partner data to ensure compliance with global regulations. Additionally, you will stay current with emerging data governance frameworks, technologies, and methodologies, recommending innovative solutions to enhance data governance effectiveness. Your role will also involve serving as a subject matter expert for business partner data, conducting gap analyses, and implementing scalable solutions for improved data quality and consistency. Qualifications for this role include a degree in Computer Science, Information Management, Life Sciences, or a related field, along with 8+ years of experience in data governance, preferably within the life sciences or pharmaceutical industries. 
You should have a strong familiarity with regulatory requirements such as FDA, EMA, and HIPAA, and knowledge of GxP-compliant systems or similar controlled environments. Strong analytical and problem-solving skills and a deep understanding of data lineage, metadata management, and data stewardship practices are also essential. Excellent communication skills will be crucial for engaging with cross-functional teams and presenting governance strategies and policies to diverse stakeholders. If you are looking to join Fresenius Digital Technology in Bangalore, India, in a full-time, permanent position within the Engineering & Technology IT sector, and meet the qualifications mentioned above, we encourage you to apply for this exciting opportunity. You can send your application to Amit Kumar at Amit.Singh1@fresenius.com. Please note that the application process will be subject to the country-specific labor laws of the legal entity.
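The business partner master data work described in this posting typically involves match-and-merge: group duplicate partner records under a common key and apply survivorship rules to produce a golden record. The sketch below is a deliberately simplistic, deterministic illustration with invented records; real MDM platforms (e.g., Informatica MDM, Reltio) use configurable, often probabilistic, matching and survivorship.

```python
def normalize(name):
    """Crude matching key: lowercase, strip punctuation and legal suffixes."""
    tokens = name.lower().replace(",", " ").replace(".", " ").split()
    return " ".join(t for t in tokens if t not in {"gmbh", "inc", "ltd"})

def merge(records):
    """Survivorship: most recently updated non-empty value wins per field.

    ISO-format date strings sort correctly as plain strings.
    """
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value not in (None, ""):
                golden[field] = value
    return golden

def build_golden_records(records):
    """Group duplicates by matching key, then merge each group."""
    groups = {}
    for rec in records:
        groups.setdefault(normalize(rec["name"]), []).append(rec)
    return {key: merge(group) for key, group in groups.items()}

partners = [
    {"name": "Acme GmbH",  "city": "",       "updated": "2023-01-01"},
    {"name": "ACME GmbH.", "city": "Berlin", "updated": "2024-06-01"},
    {"name": "Globex Inc", "city": "Pune",   "updated": "2024-02-10"},
]
golden = build_golden_records(partners)
```

The governance questions in the posting map directly onto the two functions: who defines the matching standard (`normalize`) and who owns the survivorship policy (`merge`).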

Posted 2 weeks ago

Apply

3.0 - 13.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Management Specialist, based in Hyderabad, you will leverage your 3 to 13 years of experience and work collaboratively with stakeholders to understand and translate data requirements into detailed technical specifications. You will play a crucial role as a data product owner, defining, prioritizing, and releasing data services and assets. Your responsibilities will also include defining data and metadata requirements, conducting data discovery and statistical analysis, and ensuring data accuracy and conformity. You will be responsible for creating, documenting, and maintaining data process maps, integration flows, and data models. In case of data discrepancies, you will conduct root-cause analysis and coordinate remediation efforts. Your role will involve automating data pipelines, feeds, and interfaces, particularly in cloud environments like AWS. Additionally, you will support the end-to-end delivery of data products by collaborating with Data Engineers and analysts. To excel in this role, you should have domain knowledge in Consumer or Business Lending or Wealth Management. Strong communication skills, both verbal and written, are essential. You should possess excellent problem-solving abilities, attention to detail, and the capacity to work collaboratively in a cross-functional environment. Proficiency in advanced SQL across enterprise-level DBMS such as Redshift, Oracle, and SQL Server, along with Python scripting, is required. Familiarity with data governance, metadata management, and automation best practices is a plus. Knowledge of Agile methodologies and data product life cycle practices is mandatory. If you are a proactive Data Management Specialist with the required skills and experience, we encourage you to share your updated resume with mithra.umapathy@cognizant.com and athiaravinthkumar.selvappandi@cognizant.com.
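Root-cause analysis of data discrepancies usually starts with a source-to-target reconciliation: compare the two sides by key and classify rows as missing, extra, or changed. The sketch below fingerprints non-key columns with a hash so changed rows surface without column-by-column diffing; the table shapes and field names are invented for illustration, and in practice each side would come from a SQL query against systems like Redshift or Oracle.

```python
import hashlib

def row_hash(row, key_fields):
    """Stable fingerprint of the non-key columns of a row."""
    payload = "|".join(str(row[f]) for f in sorted(row) if f not in key_fields)
    return hashlib.sha256(payload.encode()).hexdigest()

def reconcile(source, target, key="id"):
    """Compare source vs. target by key: missing, extra, and changed rows."""
    src = {r[key]: row_hash(r, {key}) for r in source}
    tgt = {r[key]: row_hash(r, {key}) for r in target}
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "extra_in_target":   sorted(tgt.keys() - src.keys()),
        "changed":           sorted(k for k in src.keys() & tgt.keys()
                                    if src[k] != tgt[k]),
    }

source = [{"id": 1, "bal": 100}, {"id": 2, "bal": 50}, {"id": 3, "bal": 75}]
target = [{"id": 1, "bal": 100}, {"id": 2, "bal": 55}, {"id": 4, "bal": 10}]
report = reconcile(source, target)
```

Each bucket in the report points at a different root cause: missing keys suggest a dropped feed, extra keys an unexpected load, and changed hashes a transformation defect.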

Posted 2 weeks ago

Apply

10.0 - 16.0 years

35 - 45 Lacs

hyderabad, pune

Hybrid

About Straive: Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions to multiple domains. Data Analytics & AI Solutions, Data AI Powered Operations, and Education & Learning form the core pillars of the company's long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building data analytics and AI enterprise capabilities for them. With a client base spanning 30 countries worldwide, Straive's multi-geographical resource pool is strategically located in eight countries - India, Philippines, USA, Nicaragua, Vietnam, United Kingdom, and the company headquarters in Singapore.

Website: https://www.straive.com/

Job Summary: We are seeking a highly skilled Data Governance Architect with strong expertise in defining and leading enterprise-wide data governance strategies, design, and governance architecture, and with experience implementing tools such as Informatica EDC/AXON, Collibra, Alation, MHUB, and other leading data governance platforms. The ideal candidate will drive data quality, consistency, and accessibility across various enterprise platforms and business units.

Required Qualifications:
- Bachelor's/Master's degree in Information Systems, Computer Science, or a related technical field.
- Strong knowledge of data governance architecture techniques and methodologies; experience in data governance initiatives is a must.
- A minimum of 7 years of experience in data governance architecture and the implementation of data governance across a business enterprise.
- Hands-on experience designing and implementing architectural patterns for data quality, metadata management, data lineage, data security, and master data management.
- Strong hands-on expertise in Collibra (workflows, APIs, metadata integration, policy automation).
- Experience with ETL/ELT pipelines, data lineage capture, and data integration tools.
- Familiarity with data modeling (conceptual, logical, physical).
- Proficiency in SQL and Python/Java for integration and automation.
- Experience with SQL, back-end scripting, and APIs.
- Understanding of data governance principles and compliance practices.
- Proficiency in working with cloud platforms (AWS, Azure, or GCP).
- Knowledge of big data technologies (Hadoop/Spark, etc.) and data visualization and BI tools is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.

Roles & Responsibilities:
- Design, develop, and maintain enterprise-wide data governance architecture frameworks and metadata models.
- Establish data governance strategies, policies, standards, and procedures for compliance processes.
- Conduct maturity assessments and change management efforts.
- Evaluate and recommend data governance frameworks and tools to meet enterprise business needs.
- Design and implement architectural patterns for data catalog, data quality, metadata management, data lineage, data security, and master data management (MDM) across various data platforms (e.g., data lakes, data warehouses, operational databases).
- Create and manage data dictionaries, metadata repositories, and data catalogs.
- Architect technical and business metadata workflows and govern glossary approvals and workflows.
- Validate end-to-end lineage across multiple sources and targets.
- Design and enforce rules for data classification, access, retention, and sharing.
- Analyze and define enterprise business KPIs and validate data governance requirements.
- Collaborate with Data Stewards to define technical specifications for data quality rules, validation checks, and KPI reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Ensure data compliance with relevant regulations such as GDPR, HIPAA, CCPA, and SOX.
- Excellent communication and the ability to mentor and inspire teams.
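The "data quality rules and KPI reporting" responsibility above can be sketched as a tiny rule engine: stewards declare per-field predicates, and the engine reports a pass rate per rule across a dataset. The rule set, regex, and sample records below are illustrative assumptions, not any client's actual rules; in a tool like Collibra these rules would be registered as governed assets rather than inline lambdas.

```python
import re

# Hypothetical rule set a Data Steward might specify: field -> predicate.
RULES = {
    "email":   lambda v: bool(v) and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v),
    "country": lambda v: v in {"IN", "US", "DE"},
    "age":     lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def evaluate(records, rules=RULES):
    """Per-rule pass rate across a dataset, usable as a data-quality KPI."""
    totals = {field: 0 for field in rules}
    for rec in records:
        for field, check in rules.items():
            if check(rec.get(field)):
                totals[field] += 1
    n = len(records)
    return {field: passed / n for field, passed in totals.items()}

records = [
    {"email": "a@x.com", "country": "IN", "age": 30},
    {"email": "bad",     "country": "GB", "age": 30},
    {"email": "c@y.org", "country": "XX", "age": 300},
    {"email": "d@z.io",  "country": "US", "age": 45},
]
kpis = evaluate(records)
```

Comparing these pass rates against steward-agreed thresholds is what turns a rule set into an enforceable quality KPI.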

Posted 2 weeks ago

Apply

10.0 - 15.0 years

25 - 37 Lacs

hyderabad, pune

Hybrid

About Straive: Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions to multiple domains. Data Analytics & AI Solutions, Data AI Powered Operations, and Education & Learning form the core pillars of the company's long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building data analytics and AI enterprise capabilities for them. With a client base spanning 30 countries worldwide, Straive's multi-geographical resource pool is strategically located in eight countries - India, Philippines, USA, Nicaragua, Vietnam, United Kingdom, and the company headquarters in Singapore.

Website: https://www.straive.com/

Overview / Objective: The Data Governance Lead will serve as a hands-on leader to drive the development and implementation of our enterprise-wide data governance framework. This role is critical to solving current data challenges, including inconsistent business rules, unclear data ownership, lack of data quality standards, fragmented business taxonomies, and the absence of a centralized data catalog. The successful candidate will serve as a bridge between business and technical stakeholders to establish scalable governance practices that increase trust in data, reduce inefficiencies, and enable data-driven decision-making across the organization.
Responsibilities:

Governance Framework & Standards
- Design and implement a fit-for-purpose enterprise data governance framework.
- Develop and maintain policies, procedures, and standards for data governance, including data ownership, stewardship, access, and usage.
- Lead the rollout of role definitions for data owners and stewards across the business.

Business Rules & Taxonomy Management
- Partner with business units to identify, standardize, and document core business rules and data definitions.
- Drive alignment and version control of business taxonomies, ensuring they are centrally maintained and regularly updated.
- Establish a governance process for creating, updating, and retiring taxonomy elements and business rules.

Data Quality & Accountability
- Define and implement data quality dimensions (accuracy, completeness, timeliness, consistency, etc.) across key data domains.
- Collaborate with functional leads to establish and maintain quality rules and thresholds.
- Create a feedback loop to monitor data issues and ensure accountability for resolution.

Data Ownership & Stewardship
- Lead the identification and onboarding of data owners and stewards.
- Facilitate training and engagement programs to support the ongoing participation of governance roles.
- Establish a RACI model to ensure clarity of responsibilities and decision-making authority.

Metadata Management & Cataloging
- Lead the selection and implementation of a data catalog and metadata management solution (if not already in place).
- Work with technical teams to populate and maintain metadata in a centralized platform.
- Promote catalog adoption and usage across the enterprise.

Cross-Functional Engagement
- Partner with data platform, engineering, analytics, legal, compliance, and business teams to embed governance in daily operations.
- Serve as a subject matter expert and advisor on data governance for major initiatives and projects.
Qualifications:
- 10+ years of experience in data management, with 7+ years in data governance roles.
- Proven track record designing and operationalizing governance frameworks at scale.
- Strong knowledge of data quality principles, metadata management, business rules management, and metadata standards.
- Hands-on knowledge of data governance platforms (e.g., Collibra, Alation, Informatica) is a plus.
- Familiarity with Google Cloud Platform (GCP).
- Exceptional communication and stakeholder management skills.
- Ability to operate both strategically and tactically in a cross-functional environment.
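The taxonomy and business-rules governance described in this posting (central maintenance, version control, approval, retirement) can be sketched as a glossary-term lifecycle. This is a generic illustration, not Straive's or any client's tooling; the term, statuses, and steward names are invented, and a platform like Collibra or Alation would model this with its own workflow engine.

```python
from dataclasses import dataclass, field

@dataclass
class GlossaryTerm:
    """One business-glossary entry with versioned, steward-approved edits."""
    name: str
    definition: str
    owner: str
    status: str = "draft"            # lifecycle: draft -> approved -> retired
    version: int = 1
    history: list = field(default_factory=list)

    def propose_update(self, definition, steward):
        """Record the superseded definition, then stage the new one as draft."""
        self.history.append((self.version, self.definition, steward))
        self.definition = definition
        self.version += 1
        self.status = "draft"

    def approve(self):
        self.status = "approved"

    def retire(self):
        self.status = "retired"

term = GlossaryTerm("Active Customer",
                    "Customer with a purchase in the last 12 months",
                    owner="Sales Ops")
term.approve()
term.propose_update("Customer with a purchase in the last 24 months",
                    steward="Data Steward - Sales")
term.approve()
```

Keeping every superseded definition in `history` is what lets reports produced under an older definition remain explainable after the taxonomy moves on.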

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

mumbai

Work from Office

Position Purpose: The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned, and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy and undertaking the design of data models and supporting the management of metadata. The Data Architect's mission integrates a focus on GDPR, contributing to privacy impact assessments and the Record of Processing Activities relating to personal data. The scope is CIB EMEA and CIB ASIA.

Responsibilities

Direct Responsibilities:
- Engage with key business stakeholders to assist with establishing fundamental data governance processes.
- Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards.
- Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation.
- Structure the information in the Information System (in a data modelling tool such as Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects.
- Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models, and others, in compliance with the data framework policy.
- Allow people to step logically through the Information System (be able to train them to use tools like Abacus).
- Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects, and IT validations.
- Update all records in Abacus collected from stakeholder interviews/meetings.
Skill Area | Expected

Communicating between the technical and the non-technical
- Is able to communicate effectively across organisational, technical, and political boundaries, understanding the context.
- Makes complex and technical information and language simple and accessible for non-technical audiences.
- Is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge.
- Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team, with potentially difficult dynamics.

Data Modelling (business flows of data in Abacus)
- Produces data models and understands where to use different types of data models.
- Understands different tools and is able to compare different data models.
- Able to reverse engineer a data model from a live system.
- Understands industry-recognized data modelling patterns and standards.
- Understands the concepts and principles of data modelling and is able to produce, maintain, and update relevant data models for specific business needs.

Data Standards (rules defined to manage/maintain data)
- Develops and sets data standards for an organisation.
- Communicates the business benefit of data standards, championing and governing those standards across the organisation.
- Develops data standards for a specific component.
- Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach.

Metadata Management
- Understands a variety of metadata management tools.
- Designs and maintains the appropriate metadata repositories to enable the organisation to understand their data assets.
- Works with metadata repositories to complete and maintain them, ensuring information remains accurate and up to date.
The objective is to manage one's own learning and contribute to domain knowledge building.
Turning business problems into data design: Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture by dealing with specific business problems and aligning it to enterprise-wide standards and principles. Works within the context of a well-understood architecture and identifies appropriate patterns.
Contributing Responsibilities
The data architect is expected to apply knowledge and experience of the capability, including tools and techniques, and to adopt those most appropriate for the environment. The Data Architect needs to have knowledge of: the Functional & Application Architecture, Enterprise Architecture and architecture rules and principles; the Global Markets and/or Global Banking activities; market meta-models, taxonomies and ontologies (such as FpML, CDM, ISO 20022).
Skill Area / Expected
Data Communication: Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant to business goals. Presents, communicates and disseminates data appropriately and with high impact. Able to create basic visuals and presentations.
Data Governance: Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contributes to it.
Data Innovation: Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes. Aware of opportunities for innovation with new tools and uses of data.
Technical & Behavioral Competencies
1. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
2. Able to create basic visuals and presentations.
3. Experience working with enterprise tools (such as Abacus, Informatica, big data platforms, Collibra, etc.)
4. Experience working with BI tools (such as Power BI)
5. Good understanding of Excel (formulas and functions)
Specific Qualifications (if required)
Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA
Skills Referential
Behavioural Skills: Communication skills - oral & written; Ability to collaborate / Teamwork; Ability to deliver / Results driven; Creativity & Innovation / Problem solving
Transversal Skills: Analytical ability; Ability to understand, explain and support change; Ability to develop and adapt a process; Ability to anticipate business / strategic evolution
Education Level: Bachelor's degree or equivalent
Experience Level: At least 7 years
Other/Specific Qualifications (if required)
1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred
2. DAMA certification
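The data models this role produces include business flows of personal data for the GDPR Record of Processing Activities. As a minimal sketch of what such a flow record might look like before it is captured in a repository tool such as Abacus, here is a hypothetical Python structure; the class and field names are illustrative assumptions, not the tool's actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical model of one personal-data flow, of the kind recorded in a
# RoPA; field names are illustrative, not an official Abacus schema.
@dataclass
class PersonalDataFlow:
    process: str                  # business process handling the data
    source_system: str            # where the data originates
    target_system: str            # where the data is sent
    data_categories: list = field(default_factory=list)  # personal data involved
    lawful_basis: str = "contract"  # GDPR Art. 6 basis recorded for the RoPA

flow = PersonalDataFlow(
    process="Client Onboarding",
    source_system="CRM",
    target_system="KYC Platform",
    data_categories=["name", "national_id"],
)
print(flow.process, "->", flow.target_system, flow.data_categories)
```

A repository of such records is what lets the architect trace where personal data travels and which processes touch it.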

Posted 2 weeks ago



9.0 - 12.0 years

27 - 32 Lacs

bengaluru

Work from Office

Job Purpose
Bajaj Finserv Web is a critical component of the company's omnipresence strategy. You will be working with India's largest NBFC's web technology stack, encompassing over 40 business lines and 230+ features, with nearly 500 million in traffic and over 30,000 webpages. It is an integrated platform offering a portfolio of products covering payments, cards, wallets, loans, deposits, mutual funds, and loans on lifestyle products, ranging from consumer durables to home furnishings. The Technical Architect will lead a major implementation project, collaborating with various POD teams to ensure timely delivery and utilizing technologies like AEM, frontend frameworks, AWS/Azure, and DevOps, while focusing on customer segmentation and personalization.
Duties and Responsibilities
1. Technology Architecture and Roadmap
Create a robust architecture for the new web platform covering non-functional aspects including security, performance, scalability and availability. Lead, define, maintain, and own platform and solution architecture for the customer-facing asset within wider IT compliance. Ensure that the roadmap contains the new and yet-to-release features of the core base products such as Adobe Experience Manager, Node.js, React JS, Solid JS, AWS, the DevOps pipeline, Adobe Target, Adobe and Google Analytics, New Relic, Akamai and various other frameworks. Create a validation framework to measure and report the effectiveness of the architecture. Create a culture of industry benchmarking before releasing or adopting any new product/framework, and define a robust roadmap and evolution of it with respect to the current and future needs of the One Web platform. Collaborate with IT teams, marketing teams, data teams and partners across the organization to create a sustainable and achievable framework for the platform. Build a strong understanding of the backend infrastructure and systems while delivering a dynamic, personalized and customer-first integrated asset. Work collaboratively with various partners to define the security architecture of the platform, including video hosting, caching, and security features such as DoS protection. Execute POCs to validate technology roadmaps, feasibility and possibilities, with scalable solutions that are also versatile, interoperable, able to co-exist in the overall ecosystem, and cost-effective. Create a holistic, auto-scalable and highly available environment across all key components, including Node servers, AEM servers, the DAM and other critical components of the One Web asset. Leverage and sponsor innovation work, both through internal incubators and the company's external start-up network, to create, evaluate, and introduce novel technical capabilities into the platform. Foster a culture of innovation and engineering excellence across the enterprise: modern engineering practices, adoption of open source and open standards, and a culture of collaboration and efficiency. Ensure that throughout the year, including the peak sales season, digital assets continue to perform at their best by suggesting robust technology frameworks, the right infrastructure, and correct data flow processes. Analyze data such as drop-offs and bounce rate to constantly evaluate and improve process flows, and identify tooling ideas for process improvements that can be built to attract the online customer. Partner with engineering teams across BFL to create an environment that provides an optimal infrastructure developer experience, from IDE and CI/CD through to IaaS provisioning and cloud-native service on-boarding frameworks.
2. Leadership and Team Development
Add strategic value to processes through competition mapping and best-practice adoption. Scout the technology landscape to ensure adoption of emerging solutions and maintain an innovative edge. Participate in project presentations with project priorities, timelines, quarterly plans, etc. to the Vertical Head for sign-off. Inspire and influence others to think differently, solve problems, and seize opportunities. Work with cross-functional teams to set and achieve targets for cross-selling. Determine individual training needs and development plans to build expertise and enhance skills. Set objectives, conduct reviews, and close appraisal processes for the team as per timelines. Ensure high employee engagement and morale through the right management interventions, with deep emotional intelligence in approach. Establish performance expectations and regularly review individual performance of the team. Identify and create development opportunities for team members to enhance technical knowledge. Work towards customer business outcomes, ensuring a strong connection between delivery activities and business objectives.
Key Decisions / Dimensions
Recommendations on the existing AEM architecture to integrate Node.js and React JS as major architecture components, building an optimal solution to handle very high traffic with minimal infrastructure. Development workflow definition to reduce major gaps and bandwidth challenges. Onboarding and offloading of partner and internal resources on the basis of POD requirements for deliverables. Internal and external training programs for freshers and byte employees to build their careers as per interest. A development build checklist for every deployment to maintain hygiene on PROD servers. API structure and integration approaches to build mobile and web apps. Common content across both app and web platforms to reduce repetitive tasks and steps. Product and technology evaluation to meet the business use cases/requirements. Finance evaluation for the technology unit within the Marketing department. All decisions towards quality delivery to release quality products.
Major Challenges
Innovative architecture definition which integrates seamlessly with marketing product suites and tools. Data-driven architecture to utilize user behavioral and transactional data to provide a preferred user experience for acquisition of new users. Understanding of new finance products and capabilities to build business-driven solutions in collaboration with data and marketing products. Systems and technologies need to be continuously evolved/changed within minimum time to manage growing business volumes. Constant training for byte hires and new joiners for optimum results.
Required Qualifications and Experience
a) Qualifications: B.Tech in Computer Science and Engineering
b) Work Experience: Minimum 9-12 years of experience in software development with a strong focus on web content management systems, particularly AEM, React JS, Solid JS and Node.js, along with DevOps practices.
Industry Knowledge: Knowledge of the finance industry and experience in leading technical deliveries.
Technical Expertise: Proficiency in Java/JEE, AEM, and associated technologies like OSGi, Sling, JCR, Apache, React JS, Solid JS, Node.js, Akamai.
Frontend Skills: Solid knowledge of HTML5, CSS3, JavaScript and related frameworks (React JS, Solid JS). Experience with frontend technologies like Bootstrap, Backbone.js, ReactJS, Handlebars, Grunt, Angular, CSS3, HTML5, and jQuery.
Cloud and DevOps: Experience with cloud platforms (AWS, Azure) and DevOps tools (Jenkins, Maven). Strong knowledge of cloud-native approaches and platforms including AWS, Azure, or GCP. Experience with SaaS-based implementation of AEM as a Cloud Service and the AEM SDK (preferred).
Leadership: Strong leadership skills with the ability to manage and mentor development teams.
Project Management: Lead and be involved in planning and estimation of Adobe projects. Lead all tracks of the project: frontend, backend, QA, and project management.
AEM Expertise: Strong hands-on experience in components, templates, taxonomy, metadata management, forward and reverse replication, workflows, content publishing and unpublishing, tagging, deployment (Maven), and content migration/planning.
Infrastructure: Strong physical architecture concepts (infrastructure) including load balancers (ELB), Apache setup, CDN, disaster recovery, and recommending capacity of AEM publish and author instances.
Quality Assurance: Implemented quality processes for projects like continuous integration (Bamboo/Jenkins/Git/BitBucket/Cloud Manager), SonarQube, code reviews (manual and automated), code formatters, automation testing, etc.
Mobile and DAM

Posted 2 weeks ago


4.0 - 8.0 years

16 - 20 Lacs

bengaluru

Work from Office

In this Process Mining role, you will design the data and information architecture for process mining implementations using tools like Celonis, IBM Process Mining, ABBYY Timeline, Signavio, etc. You need knowledge of ETL (extract, transform and load), SQL, data architecture, data integration, data mining, data modelling, metadata management, data staging techniques and BI tools (such as MS Power BI, SAP BI/BW, Oracle BI, etc.), and must be able to integrate multiple end/source systems for process mining.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Hands-on working experience in ETL, SQL, data modelling and data design/architecture. Celonis integration (live/offline) and implementation experience with clients. BI tools (such as MS Power BI, SAP BI/BW, Oracle BI, etc.). Integration of multiple end/source systems for process mining. Certification in one or more Celonis post-sales/delivery tracks.
Preferred technical and professional experience: Good to have the Celonis Implementation Professional certification with working experience around it. Responsible for learning and applying appropriate ETL and BI tools, if required. Responsible for driving and working on process mining (Celonis) implementations end to end. If you thrive in a dynamic, collaborative workplace.
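The ETL work this role describes typically means reshaping wide per-case records from source systems into the (case id, activity, timestamp) event-log format that tools like Celonis or Signavio ingest. Below is a minimal sketch of that transformation in plain Python; the table columns and activity names are hypothetical, standing in for data a real implementation would pull via SQL from an ERP system.

```python
from datetime import datetime

# Hypothetical source records, standing in for rows extracted from an ERP
# order table; a real ETL job would fetch these with SQL.
orders = [
    {"order_id": "O1", "created": "2024-01-02 09:00",
     "approved": "2024-01-02 11:30", "shipped": "2024-01-04 16:00"},
    {"order_id": "O2", "created": "2024-01-03 10:15",
     "approved": "2024-01-05 08:45", "shipped": None},  # not yet shipped
]

def to_event_log(rows):
    """Flatten wide per-case rows into the (case_id, activity, timestamp)
    event-log shape that process mining tools expect."""
    activity_columns = [("created", "Create Order"),
                        ("approved", "Approve Order"),
                        ("shipped", "Ship Goods")]
    events = []
    for row in rows:
        for column, activity in activity_columns:
            if row[column] is not None:  # skip activities that have not happened yet
                ts = datetime.strptime(row[column], "%Y-%m-%d %H:%M")
                events.append({"case_id": row["order_id"],
                               "activity": activity,
                               "timestamp": ts})
    # process mining tools require events ordered within each case
    events.sort(key=lambda e: (e["case_id"], e["timestamp"]))
    return events

log = to_event_log(orders)
for e in log:
    print(e["case_id"], e["activity"], e["timestamp"].isoformat())
```

Once in this shape, the log can be exported to CSV or loaded through a tool's data connector for process discovery.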

Posted 3 weeks ago


5.0 - 6.0 years

8 - 10 Lacs

mumbai

Work from Office

We are seeking a skilled SAS Administrator with at least 5 years of experience in managing SAS platforms, including installation, configuration, and administration. The ideal candidate should have hands-on expertise in SAS Viya 3.4, SAS Viya 3.5, SAS Viya 4, SAS Management Console (SMC), and server-level configurations. Experience working in government or large enterprise environments is preferred.
Key Responsibilities: Perform installation, configuration, and maintenance of SAS 9.x and SAS Viya 4 on Linux/Unix server environments. Manage SAS Environment Manager, SAS Management Console (SMC), metadata administration, and user/group/role permissions. Monitor and tune system performance, ensuring platform availability and integrity. Administer SAS server security, backups, and patching. Collaborate with IT teams to troubleshoot server-level or configuration issues. Perform regular upgrades, migrations, and license updates. Coordinate with SAS Tech Support for critical issues or escalations. Prepare and maintain technical documentation and SOPs.
Required Skills: Minimum 5 years of hands-on experience in SAS platform administration. Strong experience in SAS Viya 4 administration and traditional SAS (9.x). Good knowledge of SAS SMC, metadata management, and server architecture. Experience in installation/configuration on Linux/Unix environments. Familiarity with security setup, resource management, and system health monitoring. Knowledge of shell scripting is a plus.
Preferred Qualifications: BE/BTech/MTech/MCA/MSc (Statistics). Prior experience working with government or public sector clients is a plus. SAS certifications (e.g., SAS Certified Platform Administrator) are a bonus.
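The system health monitoring mentioned above often starts with scanning server logs for error and warning lines. As a minimal sketch (an assumption, not an official SAS utility), here is a Python log summarizer a cron or shell wrapper could call to alert the administrator; the sample lines are illustrative, and the pattern assumes the typical "LEVEL: message" convention seen in SAS logs, which you would adjust to your site's logging configuration.

```python
import re

# Illustrative sample of a SAS-style server log; not from a real deployment.
SAMPLE_LOG = """\
NOTE: SAS initialization used: real time 1.02 seconds
WARNING: Apparent symbolic reference ENV not resolved.
NOTE: Libref WORK successfully assigned.
ERROR: Insufficient authorization to access /opt/sas/config.
"""

def summarize_log(text):
    """Return counts of WARNING and ERROR lines in a log."""
    levels = {"WARNING": 0, "ERROR": 0}
    for line in text.splitlines():
        match = re.match(r"(WARNING|ERROR):", line)
        if match:
            levels[match.group(1)] += 1
    return levels

summary = summarize_log(SAMPLE_LOG)
print(summary)
```

In practice this would read the live log path instead of an inline string, and the counts would feed an alert threshold.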

Posted 3 weeks ago
