7.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As a Business Analyst with 7-14 years of experience, you will be responsible for creating Business Requirement Documents (BRDs) and Functional Requirement Documents (FRDs), stakeholder management, User Acceptance Testing (UAT), applying data warehouse concepts, writing SQL queries and subqueries, and using data visualization tools such as Power BI or MicroStrategy. A deep understanding of the investment domain is essential, specifically in areas like capital markets, asset management, and wealth management.

Your primary responsibilities will involve working closely with stakeholders to gather requirements, analyzing data, and testing systems to ensure they meet business needs. You should have a strong background in investment management or financial services, with experience in areas like asset management, investment operations, and insurance. Familiarity with concepts like Critical Data Elements (CDEs), data traps, and reconciliation workflows will be beneficial in this role.

Technical expertise in BI and analytics tools like Power BI, Tableau, and MicroStrategy is required, along with proficiency in SQL. You should also possess excellent communication skills, analytical thinking, and the ability to engage effectively with stakeholders. Experience working in Agile/Scrum environments with cross-functional teams is highly valued.

You should demonstrate proven analytical problem-solving abilities, with deep knowledge of investment data platforms such as GoldenSource, NeoXam, RIMES, and JPM Fusion. Expertise in cloud data technologies like Snowflake, Databricks, and AWS/GCP/Azure data services is essential, as is an understanding of data governance frameworks, metadata management, data lineage, and compliance standards in the investment management industry. Hands-on experience with Investment Books of Record (IBORs) such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle PACE, and Eagle DataMart is preferred. Familiarity with investment data platforms including GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion, as well as cloud data platforms like Snowflake and Databricks, will be advantageous. Your background in data governance, metadata management, and data lineage frameworks will be essential in ensuring data accuracy and compliance within the organization.
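By way of illustration only, here is a minimal sketch of the kind of SQL subquery a position-reconciliation check of the sort this posting mentions might use. All table names, columns, and values are invented, and sqlite3 is used purely so the example runs standalone:

```python
import sqlite3

# Hypothetical tables: positions reported by a custodian vs. positions in the
# internal Investment Book of Record (IBOR).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE custodian_positions (account_id TEXT, security_id TEXT, quantity REAL);
    CREATE TABLE ibor_positions      (account_id TEXT, security_id TEXT, quantity REAL);
    INSERT INTO custodian_positions VALUES ('A1', 'SEC1', 100), ('A1', 'SEC2', 50);
    INSERT INTO ibor_positions      VALUES ('A1', 'SEC1', 100), ('A1', 'SEC2', 45);
""")

# Correlated-subquery reconciliation: rows where the custodian quantity
# differs from the IBOR quantity for the same account and security.
breaks = conn.execute("""
    SELECT c.account_id, c.security_id, c.quantity AS custodian_qty,
           (SELECT i.quantity FROM ibor_positions i
             WHERE i.account_id = c.account_id
               AND i.security_id = c.security_id) AS ibor_qty
    FROM custodian_positions c
    WHERE c.quantity <> (SELECT i.quantity FROM ibor_positions i
                          WHERE i.account_id = c.account_id
                            AND i.security_id = c.security_id)
""").fetchall()

for row in breaks:
    print(row)  # ('A1', 'SEC2', 50.0, 45.0)
```

In practice such checks run against the warehouse itself and would also flag positions missing entirely from either side; this sketch only catches quantity mismatches.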
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior Data Engineer at Veersa, you will apply your deep expertise in ETL/ELT processes, data warehousing principles, and both real-time and batch data integrations. In this role, you will have the opportunity to mentor junior engineers, establish best practices, and contribute to the overarching data strategy of the company. Your proficiency in SQL, Python, and ideally Airflow and Bash scripting will be instrumental in designing and implementing scalable data integration and pipeline solutions using Azure cloud services.

Your key responsibilities will include architecting and implementing data solutions, developing ETL/ELT processes, building and automating data workflows, orchestrating pipelines, and writing Bash scripts for system automation. Collaborating with business and technical stakeholders to understand data requirements and translating them into technical solutions will be a key aspect of your role. You will also develop data flows, mappings, quality standards, and validation rules across various systems, ensuring adherence to best practices in data modeling, metadata management, and data governance.

To qualify for this role, you must hold a B.Tech or B.E degree in Computer Science, Information Systems, or a related field, along with a minimum of 3 years of experience in data engineering focused on Azure-based solutions. Proficiency in SQL and Python, experience with Airflow and Bash scripting, and a proven track record in real-time and batch data integrations are essential. Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks is highly desirable, as is a strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture. Familiarity with data quality, metadata management, and data validation frameworks, coupled with strong problem-solving skills and clear communication abilities, will set you up for success in this role.

Preferred qualifications include experience with multi-tenant SaaS data solutions; knowledge of DevOps practices, CI/CD pipelines, and version control systems like Git; and a proven ability to mentor and coach other engineers in technical decision-making. By joining Veersa as a Senior Data Engineer, you will play a crucial role in driving innovation and delivering cutting-edge technical solutions to clients in the US healthcare industry.
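To give a flavor of the orchestration work described above, a minimal Airflow DAG sketch follows. The DAG id, schedule, and task bodies are invented placeholders rather than Veersa artifacts, and the snippet assumes Airflow 2.x:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for pulling a batch from a source system.
    print("extracting batch...")


def load():
    # Placeholder for writing to the warehouse (e.g., Synapse or Databricks).
    print("loading batch...")


with DAG(
    dag_id="nightly_batch_integration",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",        # nightly at 02:00
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate = BashOperator(task_id="validate", bash_command="echo 'row-count check'")
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract, then validate, then load.
    extract_task >> validate >> load_task
```

A real pipeline would swap the placeholders for Azure Data Factory triggers or Databricks jobs and add retries, alerting, and data-quality gates.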
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
We are seeking an Analytics Developer with expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. Your focus will be on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports to drive strategic decision-making. This role involves close collaboration with technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives. With 8+ years of experience in analytics, data integration, and reporting, you should possess 4+ years of hands-on experience with Databricks, including proficiency in Databricks Notebooks for development and testing.

Your key responsibilities will include:
- Leveraging Databricks to develop and optimize scalable data pipelines for real-time and batch data processing
- Designing and implementing Databricks Notebooks for exploratory data analysis, ETL workflows, and machine learning models
- Managing and optimizing Databricks clusters for performance, cost efficiency, and scalability
- Using Databricks SQL for advanced query development, data aggregation, and transformation
- Incorporating Python and/or Scala within Databricks workflows to automate and enhance data engineering processes
- Developing solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration
- Creating interactive and visually compelling Power BI dashboards and reports to enable self-service analytics
- Leveraging DAX to build calculated columns, measures, and complex aggregations
- Designing effective data models in Power BI using star schema and snowflake schema principles for optimal performance
- Configuring and managing Power BI workspaces, gateways, and permissions for secure data access
- Implementing row-level security and data masking strategies in Power BI to ensure compliance with governance policies
- Building real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources
- Providing end-user training and support for Power BI adoption across the organization
- Developing and maintaining ETL/ELT workflows that ensure high data quality and reliability
- Implementing data governance frameworks to maintain data lineage, security, and compliance with organizational policies
- Optimizing data flow across multiple environments, including data lakes, warehouses, and real-time processing systems
- Collaborating with data governance teams to enforce standards for metadata management and audit trails
- Working closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems
- Troubleshooting and resolving technical challenges related to data integration, analytics performance, and reporting accuracy
- Staying current on the latest advancements in Databricks, Power BI, and data analytics technologies
- Driving innovation by integrating AI/ML capabilities into analytics solutions using Databricks
- Contributing to the enhancement of organizational analytics maturity through scalable and reusable approaches

You should also possess self-management skills, out-of-the-box thinking, willingness to learn new technologies, logical thinking, fluency in English, strong communication skills, a Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred), relevant certifications, and the ability to manage multiple priorities in a fast-paced environment with high customer expectations.
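As an illustrative sketch of the Databricks pipeline work listed above (not an actual project artifact), the following PySpark job aggregates a raw feed into a curated table that a Power BI dataset could query. The paths, column names, and table names are invented, and the Delta format assumes a Databricks runtime:

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks the session is already provided as `spark`.
spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

# Hypothetical raw Delta source with columns order_id, order_ts, region, amount.
raw = spark.read.format("delta").load("/mnt/raw/sales")

# Batch rollup: one row per date and region.
daily = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.countDistinct("order_id").alias("orders"),
       )
)

# Write a curated table for downstream BI consumption.
daily.write.format("delta").mode("overwrite").saveAsTable("curated.daily_sales")
```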
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At PwC, our team in managed services focuses on providing a variety of outsourced solutions and supporting clients across multiple functions. We help organizations streamline their operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization, ensuring the delivery of high-quality services to our clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services; managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services.

As a member of our team, you will build meaningful client relationships and learn how to manage and inspire others. You will navigate complex situations, develop your personal brand, deepen your technical expertise, and leverage your strengths. Anticipating the needs of your teams and clients, you will deliver quality results. Embracing ambiguity, you will be comfortable when the path forward is unclear, asking questions and using such moments as opportunities for growth.

Required skills, knowledge, and experiences for this role include but are not limited to:
- Responding effectively to diverse perspectives, needs, and feelings of others
- Using a broad range of tools, methodologies, and techniques to generate new ideas and solve problems
- Applying critical thinking to break down complex concepts
- Understanding the broader objectives of your project or role and how your work aligns with the overall strategy
- Developing a deeper understanding of the business context and its changing dynamics
- Using reflection to enhance self-awareness, strengths, and development areas
- Interpreting data to derive insights and recommendations
- Upholding and reinforcing professional and technical standards, along with the Firm's code of conduct and independence requirements

As a Senior Associate, you will work collaboratively with a team of problem solvers, addressing complex business issues from strategy to execution through data, analytics, and insights skills.

Your responsibilities at this level include:
- Using feedback and reflection to enhance self-awareness, personal strengths, and address development areas
- Demonstrating critical thinking and the ability to structure unstructured problems
- Reviewing deliverables for quality, accuracy, and relevance
- Adhering to SLAs, incident management, change management, and problem management
- Leveraging tools effectively in different situations and explaining the rationale behind the choices
- Seeking opportunities for exposure to diverse situations, environments, and perspectives
- Communicating straightforwardly and structurally to influence and connect with others
- Demonstrating leadership by engaging directly with clients and leading engagements
- Collaborating in a team environment with client interactions, workstream management, and cross-team cooperation
- Contributing to cross-competency work and Center of Excellence activities
- Managing escalations and risks effectively

Position Requirements:
- Primary skills: Tableau, visualization, Excel
- Secondary skills: Power BI, Cognos, Qlik, SQL, Python, advanced Excel, Excel macros

BI Engineer Role:
- Minimum 5 years of hands-on experience in building advanced data analytics
- Minimum 5 years of hands-on experience in delivering managed data and analytics programs
- Extensive experience in developing scalable, repeatable, and secure data structures and pipelines
- Proficiency in industry tools like Python, SQL, and Spark for data analytics
- Experience in building data governance solutions using leading tools
- Knowledge of data consumption patterns and BI tools like Tableau, Qlik Sense, and Power BI
- Strong communication, problem-solving, quantitative, and analytical abilities

Certifications in Tableau and other BI tools are advantageous, along with certifications in any cloud platform.

In our Managed Services - Data, Analytics & Insights team at PwC, we focus on collaborating with clients to leverage technology and human expertise, delivering simple yet powerful solutions. Our goal is to enable clients to focus on their core business while trusting us as their IT partner. We are driven by the passion to enhance our clients' capabilities every day. Within our Managed Services platform, we offer integrated services grounded in industry experience and powered by top talent. Our team of global professionals, combined with cutting-edge technology, ensures effective outcomes that add value to our clients' enterprises. Through a consultative approach, we enable transformational journeys that drive sustained client outcomes, allowing clients to focus on accelerating their priorities and optimizing their operations.

As a member of our Data, Analytics & Insights Managed Service team, you will contribute to critical Application Evolution Service offerings, help desk support, enhancement and optimization projects, and strategic roadmap development. Your role will involve technical expertise and relationship management to support customer engagements effectively.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You will have a pivotal role in implementing and embracing the data governance framework at Amgen, which aims to revolutionize the company's data ecosystem and establish Amgen as a pioneer in biopharma innovation. The position makes use of cutting-edge technologies such as Generative AI, Machine Learning, and integrated data. Your domain expertise, technical knowledge, and understanding of business processes will be crucial in providing exceptional support for Amgen's data governance framework. Collaboration with business stakeholders and data analysts will be essential to ensure successful implementation and adoption of the framework, and you will work closely with the Product Owner and other Business Analysts to guarantee operational support and excellence from the team.

You will be responsible for implementing the data governance and data management framework within a specific domain of expertise, such as Research, Development, or Supply Chain. This includes operationalizing the enterprise data governance framework and aligning a broader stakeholder community with their data governance needs: data quality, data access controls, compliance with privacy and security regulations, master data management, data sharing, communication, and change management. You will collaborate with Enterprise MDM and Reference Data teams to enforce standards and data reusability, drive cross-functional alignment in your area of expertise to ensure adherence to data governance principles, and maintain privacy policies and procedures to safeguard sensitive data. You will conduct regular privacy risk assessments and audits to identify and mitigate potential risks as required.

Furthermore, you will maintain documentation on data definitions, data standards, data flows, legacy data structures, common data models, and data harmonization for the assigned domains, and ensure compliance with data privacy, security, and regulatory policies for those domains, including GDPR, CCPA, and other relevant legislation. Together with technology teams, business functions, and enterprise teams, you will define the specifications that shape the development and implementation of data foundations, and you will build strong relationships with key business leads and partners to ensure their needs are met.

Must-have functional skills include technical knowledge of pharma processes with specialization in a domain; an in-depth understanding of data management, data quality, master data management, data stewardship, and data protection; familiarity with data protection laws and regulations; experience in the development life cycle of data products; and proficiency in tools like Collibra and Alation. Strong problem-solving skills, excellent communication, and experience working with data governance frameworks are essential. Experience with data governance councils, Agile software development methodologies, proficiency in data analysis and quality tools, and 3-5 years of experience in data privacy or compliance are good-to-have functional skills.

Soft skills required for this role include integrity, adaptability, proactivity, leadership, organization, analytical skills, the ability to work effectively with teams and manage multiple priorities, ambition to develop skills and career, the ability to build business relationships and understand end-to-end data use and needs, interpersonal skills, initiative, self-motivation, presentation skills, attention to detail, time management, and customer focus. Basic qualifications for this position include any degree and 9-13 years of experience.
Posted 2 months ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate should possess a solid background in data architecture, cloud data platforms, and Snowflake implementation, along with practical experience in end-to-end data pipeline and data warehouse design.

In this role, you will lead the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will also define data modeling standards, best practices, and governance frameworks. Collaborating with stakeholders to understand data requirements and translating them into robust architectural solutions will be a key part of your responsibilities. Furthermore, you will design and optimize ETL/ELT pipelines using tools like Snowpipe, Azure Data Factory, Informatica, or DBT, and implement data security, privacy, and role-based access controls within Snowflake. Providing guidance to development teams on performance tuning, query optimization, and cost management within Snowflake will be part of your duties, as will ensuring high availability, fault tolerance, and compliance across data platforms. Mentoring developers and junior architects on Snowflake capabilities is an important aspect of this role.

Qualifications and Experience:
- 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3 years of hands-on Snowflake experience
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization
- Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP)
- Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion
- Good understanding of data lakes, data mesh, and modern data stack principles
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks
- Solid knowledge of data governance, metadata management, and cataloging

Desired Skills:
- Snowflake certification (e.g., SnowPro Core/Advanced Architect)
- Familiarity with Apache Airflow, Kafka, or event-driven data ingestion
- Knowledge of data visualization tools such as Power BI, Tableau, or Looker
- Experience in healthcare, BFSI, or retail domain projects

Please note that this job description is sourced from hirist.tech.
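For illustration, a minimal sketch of the role-based access control work this posting mentions, expressed through the snowflake-connector-python client. The account, database, schema, role, and user names are all invented; a real design would layer functional and access roles per Snowflake's RBAC guidance:

```python
import snowflake.connector

# Connection parameters are placeholders for illustration only.
conn = snowflake.connector.connect(
    account="my_account",
    user="governance_admin",
    password="...",
    role="SECURITYADMIN",
    warehouse="ADMIN_WH",
)
cur = conn.cursor()

# Role-based access: a read-only analyst role scoped to one curated schema.
for stmt in [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER report_user",
]:
    cur.execute(stmt)

cur.close()
conn.close()
```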
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You are a highly motivated and detail-oriented Data Catalog Analyst with expertise in the erwin Data Intelligence Suite (DIS), particularly the erwin Data Catalog module. Your main responsibility will be to build and maintain a centralized metadata repository that enables data discovery, lineage, and governance across the enterprise.

You will configure, implement, and maintain the erwin Data Catalog to support enterprise metadata management. It will be your duty to harvest metadata from various data sources (databases, ETL tools, BI platforms, etc.), ensuring accuracy and completeness. Your role will involve developing and maintaining data lineage, impact analysis, and data flow documentation. Collaboration with data stewards, business analysts, and IT teams is essential for defining and enforcing metadata standards and governance policies. You will also support the creation and maintenance of business glossaries, technical metadata, and data dictionaries. Ensuring metadata is accessible and well organized to enable data discovery and self-service analytics will be one of your priorities. You are expected to provide training and support to business and technical users on effectively using the erwin platform, and to monitor system performance and troubleshoot issues related to metadata ingestion and cataloging.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, and have at least 3 years of experience in data governance, metadata management, or enterprise data architecture. Hands-on experience with erwin DIS, especially erwin Data Catalog and erwin Data Modeler, is required; knowledge of other tools such as Collibra or Alation can also be considered. A strong understanding of metadata management, data lineage, and data governance frameworks (e.g., DAMA-DMBOK) is necessary. You should be familiar with relational databases, data warehouses, and cloud data platforms (e.g., AWS, Azure, GCP), along with proficiency in SQL and data profiling tools.

Preferred skills for this role include experience with other data governance tools (e.g., Collibra, Informatica, Alation), knowledge of regulatory compliance standards (e.g., GDPR, HIPAA, CCPA), strong communication and stakeholder engagement skills, experience creating documentation, experience with Agile methodology, and the ability to work independently and manage multiple priorities in a fast-paced environment.
Posted 2 months ago
4.0 - 8.0 years
7 - 11 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Job Title: Erwin Data Modeler, Insurance domain
Location: Any
Job Type: Full-Time | 2-11pm Shift

Job Summary
We are seeking a skilled and experienced Data Modeler with hands-on expertise in Erwin Data Modeling to join our team. The ideal candidate will have a strong background in data architecture and modeling, with a minimum of 4 years of relevant experience. Knowledge of the insurance domain is a significant plus.

Key Responsibilities
- Design, develop, and maintain conceptual, logical, and physical data models using Erwin Data Modeler.
- Collaborate with business analysts, data architects, and developers to understand data requirements and translate them into data models.
- Ensure data models align with enterprise standards and best practices.
- Perform data analysis and profiling to support modeling efforts.
- Maintain metadata and documentation for data models.
- Support data governance and data quality initiatives.
- Participate in reviews and provide feedback on data models and database designs.

Required Skills & Qualifications
- Strong understanding of data modeling concepts, including normalization, denormalization, and dimensional modeling.
- Knowledge of any relational database is an advantage.
- Familiarity with data warehousing and ETL processes.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
Posted 2 months ago
5.0 - 10.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Overview
Enterprise Data Operations Analyst

Job Overview: As an Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics.

The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- 5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience in building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
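As a small illustration of the source-to-target mapping deliverable named above, here is a plain-Python sketch of what one row of such a mapping might capture. The source systems, fields, and transformation rules are invented examples, not PepsiCo specifications:

```python
from dataclasses import dataclass


@dataclass
class FieldMapping:
    """One row of a source-to-target mapping handed to ETL/BI developers."""
    source_system: str
    source_field: str
    target_table: str
    target_column: str
    transformation: str  # business rule, in plain language or pseudo-SQL


# Invented example rows for a hypothetical revenue-management feed.
stm = [
    FieldMapping("SAP", "VBAK.NETWR", "fact_sales", "net_amount",
                 "CAST to DECIMAL(18,2); convert to USD at daily rate"),
    FieldMapping("SAP", "VBAK.ERDAT", "fact_sales", "order_date_key",
                 "format as YYYYMMDD; join to dim_date"),
]

for m in stm:
    print(f"{m.source_system}.{m.source_field} -> "
          f"{m.target_table}.{m.target_column}: {m.transformation}")
```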
Posted 2 months ago
8.0 - 13.0 years
18 - 22 Lacs
Hyderabad
Work from Office
Overview
Enterprise Data Operations Sr Analyst L08

Job Overview: As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics.

The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience in building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).

Does the person hired for this job need to be based in a PepsiCo office, or can they be remote? Employee must be based in a PepsiCo office.
Primary Work Location: Hyderabad HUB-IND
Posted 2 months ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Title: OCI Data Catalog Specialist
Experience Level: 3+ Years
Job Level: Consultant / Senior Consultant (based on total experience)
Work Location: Hyderabad, Bangalore, Chennai, Mumbai, Pune, Kolkata, Gurgaon
Work Mode: Hybrid (Minimum 2 days/week from office)
Work Timings: 10:00 AM - 7:00 PM IST
Engagement: Contract to Hire

Role Overview:
We are looking for a skilled and proactive OCI Data Catalog PoC Specialist to design, implement, and demonstrate a Proof of Concept (PoC) for Oracle Cloud Infrastructure (OCI) Data Catalog as part of a strategic data governance initiative. The role involves showcasing OCI Data Catalog features, evaluating its fit for business needs, and guiding future adoption in production.

Key Responsibilities:
- Lead the end-to-end delivery of the OCI Data Catalog PoC.
- Collaborate with client stakeholders to understand data governance goals and cataloging needs.
- Configure and integrate OCI Data Catalog with data sources including Oracle Autonomous Database, OCI Object Storage, and on-premises databases.
- Develop and execute test scenarios for metadata harvesting, data lineage, data classification, and stewardship and search.
- Integrate OCI Data Catalog metadata output with Marketplace applications for automated sharing.
- Troubleshoot PoC issues and coordinate with Oracle support.
- Document PoC results, provide lessons learned, and recommend steps for production rollout.
- Provide training and knowledge transfer sessions to client teams.

Required Skills and Experience:
- 3+ years of experience in data management, governance, or cloud-based data solutions.
- Hands-on experience with OCI Data Catalog is mandatory.
- Strong understanding of metadata management, data lineage, and classification and cataloging principles.
- Proven experience integrating data catalogs with multiple source types (cloud + on-prem).
- Familiarity with Oracle Cloud Infrastructure (OCI) and associated data services.
- Strong analytical, communication, and documentation skills.
- Ability to work with cross-functional teams and present findings to both technical and business stakeholders.

Good to Have:
- Experience with other data governance tools or frameworks.
- Exposure to Oracle Marketplace and its integration with metadata workflows.
- Prior experience in client-facing PoC or advisory roles in cloud data environments.
Posted 2 months ago
7.0 - 10.0 years
0 - 0 Lacs
noida
On-site
Lead Taxonomist AEM Operations_Full-time_Noida [Remote/Hybrid]
Job Title: Lead Taxonomist AEM Operations
Experience Required: 7+ Years
Location: Noida [Remote/Hybrid]
Employment Type: Full-time

About the Role
*We are seeking a highly skilled and strategic Lead Taxonomist to join our AEM Operations team.
*In this role, you will be responsible for designing, implementing, and maintaining the visa.com taxonomy structure to ensure intuitive content categorization and seamless navigation across digital platforms.
*You will partner with UX/UI designers, content strategists, SEO specialists, and engineering teams to deliver a robust taxonomy framework that enhances discoverability, improves content performance, and supports a consistent user experience.
*This role is ideal for someone who thrives in a data-driven, collaborative environment and is passionate about organizing information in meaningful and scalable ways.

Key Responsibilities
*Design and maintain the visa.com taxonomy to support intuitive navigation, consistent content structure, and scalable content strategy.
*Develop tagging strategies and metadata frameworks to ensure accurate content labeling, improved search capabilities, and optimal content discoverability.
*Collaborate cross-functionally with UX/UI, content, SEO, and engineering teams to implement, validate, and enhance taxonomy solutions across platforms powered by Adobe Experience Manager (AEM).
*Conduct regular audits of taxonomy and metadata structures; analyze user behavior and site analytics to continuously optimize taxonomy for usability and performance.
*Champion best practices in information architecture, ensuring taxonomy evolves with the digital ecosystem and business objectives.
*Translate business and user requirements into actionable taxonomy updates and metadata enhancements.

Required Qualifications & Skills
*Bachelor's or Master's degree in Information Science, Data Science, Library Science, Human-Computer Interaction (HCI), or a related field.
*5+ years of experience in taxonomy development, metadata management, or information architecture within digital product, CMS, or website environments.
*Strong understanding of content management systems (especially AEM), tagging frameworks, metadata schemas, and SEO best practices.
*Proficiency in auditing digital content using analytics tools and translating insights into actionable taxonomy improvements.
*Exceptional attention to detail, with strong analytical and problem-solving skills.
*Excellent communication skills and demonstrated ability to work collaboratively across cross-functional teams, including product, design, engineering, and marketing.
*Comfortable working in an agile, fast-paced digital environment.

Preferred Qualifications (Nice to Have)
*Experience working with Adobe Experience Manager (AEM) or similar enterprise-level CMS platforms.
*Familiarity with tools like Adobe Analytics, Google Analytics, or ContentSquare.
*Experience with accessibility and internationalization considerations in taxonomy.

What We Offer
*An opportunity to shape the information structure of a global digital platform.
*A dynamic, collaborative work environment with leading industry professionals.
*Support for ongoing professional development in taxonomy and content strategy.
*Flexible working hours and remote opportunities.
--------------- If you are interested, please share your updated resume along with the following details for the next steps: # Your full name ( First : Middle : Last ) ( All expanded ): # Present Employer Name & Work Location: # Permanent / Contract Employee: # Current Location: # Preferred Location (Noida): # Highest Qualification (University Name and Passing year): # Total experience: # Current CTC and take home: # Expected CTC and take home: # Official Notice Period: # Are you serving notice period if yes then mention LWD (Last Working Day): # Any offer you are holding (if yes please share the offer amount):
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
vadodara, gujarat
On-site
The purpose of your role is to define and develop Enterprise Data Structure, Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modeling standards and business information.

You will define and develop data architecture that supports the organization and clients in new and existing deals. This includes partnering with business leadership to provide strategic recommendations, assessing data benefits and risks, creating data strategy and roadmaps, engaging stakeholders for data governance, ensuring data storage and database technologies are supported, monitoring compliance with data modeling standards, overseeing frameworks for data management, and collaborating with vendors and clients to maximize the value of information.

Additionally, you will be responsible for building enterprise technology environments for data architecture management. This involves developing, maintaining, and implementing standard patterns for data layers, data stores, and data hubs and lakes; evaluating implemented systems; collecting and integrating data; creating data models; implementing best security practices; and demonstrating strong experience in database architectures and design patterns.

You will also enable delivery teams by providing optimal delivery solutions and frameworks. This includes building relationships with delivery and practice leadership teams, defining database structures and specifications, establishing relevant metrics, monitoring system capabilities and performance, integrating new solutions, managing projects, identifying risks, ensuring quality assurance, recommending tools for reuse and automation, and supporting the integration team for better efficiency.

In addition, you will ensure optimal client engagement by supporting pre-sales teams, negotiating and coordinating with client teams, demonstrating thought leadership, and acting as a trusted advisor.

Join Wipro to reinvent your world and be part of an end-to-end digital transformation partner with bold ambitions. Realize your ambitions in a business powered by purpose and empowered to design your own reinvention. Applications from people with disabilities are explicitly welcome.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At PwC, our team in infrastructure is dedicated to designing and implementing secure and robust IT systems that facilitate business operations. We focus on ensuring the smooth functioning of networks, servers, and data centers to enhance performance and reduce downtime. As part of the infrastructure engineering team at PwC, your role will involve creating and implementing scalable technology infrastructure solutions for our clients, encompassing tasks such as network architecture, server management, and cloud computing.

We are currently seeking a Data Modeler with a solid background in data modeling, metadata management, and optimizing data systems. In this role, you will be responsible for analyzing business requirements, developing long-term data models, and maintaining the efficiency and consistency of our data systems.

Key Responsibilities:
- Analyze business needs and translate them into long-term data model solutions.
- Evaluate existing data systems and suggest enhancements.
- Define rules for data translation and transformation across different models.
- Collaborate with the development team to design conceptual data models and data flows.
- Establish best practices for data coding to ensure system consistency.
- Review modifications to existing systems to ensure cross-compatibility.
- Implement data strategies and create physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to improve system efficiency.
- Evaluate implemented data systems for discrepancies, variances, and efficiency.
- Troubleshoot and optimize data systems to achieve optimal performance.

Key Requirements:
- Proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Strong skills in SQL and database management systems like Oracle, SQL Server, MySQL, and PostgreSQL.
- Familiarity with NoSQL databases such as MongoDB and Cassandra, including their data structures.
- Hands-on experience with data warehouses and BI tools like Snowflake, Redshift, BigQuery, Tableau, and Power BI.
- Knowledge of ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or related areas.
- 4+ years of practical experience in dimensional and relational data modeling.
- Expertise in metadata management and relevant tools.
- Proficiency in data modeling tools like Erwin, PowerDesigner, or Lucid.
- Understanding of transactional databases and data warehouses.

Preferred Skills:
- Experience with cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies like Hadoop, Spark, and Kafka.
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Strong communication and presentation skills.
- Excellent interpersonal skills to collaborate effectively with diverse teams.
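To make the dimensional-modeling requirement concrete, a minimal star-schema sketch in SQLAlchemy follows. The table and column names are invented; a real model would add surrogate-key policies, slowly changing dimension handling, and many more attributes:

```python
from sqlalchemy import (
    Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table,
)

metadata = MetaData()

# Dimension tables (invented names, for illustration only).
dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),
    Column("calendar_date", Date, nullable=False),
)
dim_product = Table(
    "dim_product", metadata,
    Column("product_key", Integer, primary_key=True),
    Column("product_name", String(100)),
)

# Fact table referencing the dimensions: the classic star layout.
fact_sales = Table(
    "fact_sales", metadata,
    Column("date_key", Integer, ForeignKey("dim_date.date_key"), nullable=False),
    Column("product_key", Integer, ForeignKey("dim_product.product_key"), nullable=False),
    Column("quantity", Integer),
    Column("amount", Numeric(18, 2)),
)

# metadata.create_all(engine) would emit this DDL against any supported RDBMS.
```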
Posted 2 months ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate should possess a robust background in data architecture, cloud data platforms, and Snowflake implementation; hands-on experience in end-to-end data pipeline and data warehouse design is essential for this role.

Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will be tasked with defining data modeling standards, best practices, and governance frameworks. Designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or DBT will be a key aspect of your role, as will collaborating with stakeholders to understand data requirements and translate them into robust architectural solutions. Additionally, you will be responsible for implementing data security, privacy, and role-based access controls within Snowflake. Guiding development teams on performance tuning, query optimization, and cost management in Snowflake is crucial, and ensuring high availability, fault tolerance, and compliance across data platforms will also fall under your purview. Mentoring developers and junior architects on Snowflake capabilities is another important aspect of this role.

In terms of skills and experience, we are looking for candidates with 8+ years of overall experience in data engineering, BI, or data architecture, including a minimum of 3 years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization is highly desirable. Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP) is required, along with hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion. A good understanding of data lakes, data mesh, and modern data stack principles is preferred. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus, and solid knowledge of data governance, metadata management, and cataloging is beneficial.

Preferred qualifications include a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects.

If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The role of a Data Catalog Analyst is crucial in establishing and maintaining a centralized metadata repository that facilitates data discovery, lineage, and governance across an organization. As a Data Catalog Analyst, your primary responsibility will be configuring, implementing, and managing the erwin Data Catalog to ensure effective metadata management. You will be required to extract metadata from diverse data sources and guarantee its accuracy and completeness. Collaboration with data stewards, business analysts, and IT teams will be essential to define and enforce metadata standards and governance policies. Additionally, you will play a key role in developing and sustaining data lineage, impact analysis, and data flow documentation. Your expertise will be instrumental in supporting the creation and upkeep of business glossaries, technical metadata, and data dictionaries.

Facilitating data discovery and self-service analytics will be a core aspect of your role. This will entail ensuring that metadata is easily accessible and well organized so that business and technical users can use it efficiently. Providing training and assistance to users on leveraging the erwin platform effectively will also be part of your responsibilities, as will monitoring system performance, troubleshooting metadata ingestion and cataloging issues, and ensuring seamless operational functionality.

As a qualified candidate, you should possess a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 3 years of experience in data governance, metadata management, or enterprise data architecture. Proficiency in erwin DIS, particularly erwin Data Catalog and erwin Data Modeler, is required. A strong understanding of metadata management, data lineage, and data governance frameworks such as DAMA-DMBOK is essential. Familiarity with relational databases, data warehouses, and cloud data platforms like AWS, Azure, and GCP is preferred, and proficiency in SQL, data profiling tools, and experience with other data governance tools are advantageous.

Preferred skills include knowledge of regulatory compliance standards, strong communication and stakeholder engagement skills, experience creating documentation, familiarity with Agile methodology, and the capability to work independently while managing multiple priorities in a fast-paced environment.
Posted 2 months ago
8.0 - 10.0 years
10 - 20 Lacs
Pune
Remote
Job Summary:
We are seeking an experienced Azure Data Governance Specialist to design, implement, and manage data governance frameworks and infrastructure across Azure-based platforms. The ideal candidate will ensure enterprise data is high-quality, secure, compliant, and aligned with business and regulatory requirements. This role combines deep technical expertise in Azure with a strong understanding of data governance principles, MDM, and data quality management.

Key Responsibilities:
- Data Governance & Compliance: Design and enforce data governance policies, standards, and frameworks aligned with enterprise objectives and compliance requirements (e.g., GDPR, HIPAA).
- Master Data Management (MDM): Implement and manage MDM strategies and solutions within the Azure ecosystem to ensure consistency, accuracy, and accountability of key business data.
- Azure Data Architecture: Develop and maintain scalable data architecture on Azure (e.g., Azure Data Lake, Synapse, Purview, Alation, Anomalo) to support governance needs.
- Tooling & Automation: Deploy and manage Azure-native data governance tools such as Azure Purview, Microsoft Fabric, and Data Factory to classify, catalog, and monitor data assets, including third-party tools like Alation.
- Data Quality (DQ): Lead and contribute to data quality forums, establish DQ metrics, and integrate DQ checks and dashboards within Azure platforms.
- Security & Access Management: Collaborate with security teams to implement data security measures, role-based access controls, and data encryption in accordance with Azure best practices.
- Technical Leadership: Guide teams in best practices for designing data pipelines, metadata management, and lineage tracking with Azure tooling.
- Continuous Improvement: Drive improvements in data management processes and tooling to enhance governance efficiency and compliance posture.
- Mentorship & Collaboration: Provide technical mentorship to data engineers and analysts, promoting data stewardship and governance awareness across the organization.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience: 8+ years of experience in data infrastructure and governance, with 3+ years focused on Azure data services and tools.
- Technical Skills: Proficiency with data governance tools (Alation, Purview, Synapse, Data Factory, Azure SQL, etc.); strong understanding of data modeling (conceptual, logical, and physical models); experience with programming languages such as Python, C#, or Java; in-depth knowledge of SQL and metadata management.
- Leadership: Proven experience leading or influencing cross-functional teams in data governance and architecture initiatives.
- Certifications (preferred): Azure Data Engineer Associate, Azure Solutions Architect Expert, or Azure Purview-related certifications.
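As a toy illustration of the sensitive-data classification responsibility above, here is a self-contained Python sketch that tags columns whose sampled values look like PII. The patterns and column names are invented and deliberately simplistic; tools like Purview or Alation use far richer classifiers:

```python
import re

# Toy classifier: tag columns whose sample values match simple PII patterns.
PII_PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "phone": re.compile(r"\+?\d[\d \-]{8,}\d"),
}


def classify(column_name: str, sample_values: list) -> set:
    """Return the set of PII tags whose pattern matches any sampled value."""
    tags = set()
    for value in sample_values:
        for tag, pattern in PII_PATTERNS.items():
            if pattern.search(str(value)):
                tags.add(tag)
    return tags


# Invented sample: a column mixing email addresses and phone numbers.
print(classify("contact", ["a@b.com", "+91 98765 43210"]))
# -> {'email', 'phone'} (set ordering may vary)
```

A production workflow would write these tags back to the catalog and let masking or access rules key off them.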
Posted 2 months ago
8.0 - 12.0 years
0 - 3 Lacs
Pune
Work from Office
Greetings for the Day!

- At least 8 years of experience in a similar role in data management/analytics/architecture or engineering
- Experience with solution design and data modelling
- Experience working with metadata and an appreciation for metadata frameworks and ontologies
- A technical understanding of data transport mechanisms
- An understanding of data mesh and data product concepts
- Technical or physical data lineage experience is preferable
- Evidenced experience in documenting requirements and designing solutions to meet objectives in an efficient and robust way
- Experience within a project or risk management change environment
- Recognised as a strong communicator, with excellent written and verbal ability
- A track record as an Agile and change management practitioner
- The enthusiasm to identify, learn, and coach others in new data, modelling, and risk processes
Posted 2 months ago
7.0 - 12.0 years
30 - 45 Lacs
Noida, Hyderabad
Hybrid
Data Governance Lead

About the Role:
We are seeking an experienced Data Governance Lead to join our team. In this role, you will be responsible for defining and implementing the data governance strategy. The role involves establishing metadata standards, defining attribute ownership models, ensuring regulatory compliance, and improving data quality and trust across the enterprise.

Responsibilities:
- Define and implement an enterprise-wide data governance framework
- Own the metadata catalog and ensure consistency across business and technical assets
- Develop and manage KPI registries, data dictionaries, and lineage documentation
- Collaborate with data stewards and domain owners to establish attribute ownership
- Lead efforts around data standardization, quality rules, and classification of sensitive data
- Ensure privacy and compliance (e.g., GDPR, PII, PHI) by enforcing tagging, masking, and access rules
- Define access control rules (purpose-based views, user roles, sensitivity levels)
- Oversee governance for data products and federated data domains
- Support internal audits and external regulatory reviews
- Coordinate with platform, analytics, security, and compliance teams

Qualifications:
- Bachelor's or Master's degree in Computer Science or a related technical field
- 10+ years of experience in data governance roles
- Experience integrating governance with modern data stacks (e.g., Databricks, Snowflake)
- Strong experience with data governance tools (e.g., Alation, Unity Catalog, Azure Purview)
- Proven understanding of metadata management, data lineage, and data quality frameworks
- Experience implementing federated governance models and data stewardship programs
- Knowledge of compliance requirements (GDPR, HIPAA, PII, etc.)
- Familiarity with data mesh principles and data contract approaches
- Excellent communication and stakeholder management skills
- Certification in data governance or management frameworks
Posted 2 months ago
8.0 - 10.0 years
17 - 22 Lacs
Bengaluru
Work from Office
The purpose of the role is to define and develop the Enterprise Data Structure, along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.
Do:
1. Define and develop a Data Architecture that aids the organization and clients in new and existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure that implementation follows every change request
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
f. Develop, communicate, support, and monitor compliance with data modelling standards
g. Oversee and monitor all frameworks for managing data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards, references, and papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from technology, cost structure, and customer differentiation points of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all implemented systems to determine their cost-effectiveness
c. Collect structured and unstructured data from different places and integrate it into one database form
d. Work through every stage of data processing: analysis, creation, physical data model design, solutions, and reports
e. Build enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement best security practices across all databases based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration (a toy match-and-merge illustration follows this listing)
3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, backup, and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams improve efficiency and client experience, including ease of use through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
i. Support the pre-sales team in presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor
Mandatory Skills: Dataiku. Experience: 8-10 Years.
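To illustrate the MDM "collect, match, consolidate" duty in item 1p: a toy match-and-merge that groups duplicate records on a normalized key and builds one golden record per group. The match rule (normalized email) and survivorship policy (newest non-null value wins) are illustrative assumptions.

```python
# Toy master-data consolidation: group duplicate customer records by a
# normalized match key and build a "golden record" per group.
from collections import defaultdict

records = [
    {"id": 1, "email": "Jane.Doe@X.com ", "phone": None, "updated": "2024-01-10"},
    {"id": 2, "email": "jane.doe@x.com", "phone": "555-0101", "updated": "2024-03-02"},
    {"id": 3, "email": "sam@y.com", "phone": "555-0199", "updated": "2024-02-14"},
]

def match_key(rec: dict) -> str:
    """Deterministic match rule: trimmed, lower-cased email."""
    return rec["email"].strip().lower()

groups = defaultdict(list)
for rec in records:
    groups[match_key(rec)].append(rec)

golden = []
for key, dupes in groups.items():
    dupes.sort(key=lambda r: r["updated"], reverse=True)  # newest first
    merged = {}
    for rec in dupes:  # survivorship: keep the newest non-null value per field
        for field, val in rec.items():
            if val is not None and field not in merged:
                merged[field] = val
    golden.append(merged)

print(golden)  # one consolidated record per real-world customer
```

Real MDM platforms add fuzzy matching, stewardship review queues, and lineage back to the contributing sources, but the group-then-survive shape above is the core of every consolidation pipeline.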
Posted 2 months ago
8.0 - 10.0 years
17 - 22 Lacs
Bengaluru
Work from Office
The purpose of the role is to define and develop the Enterprise Data Structure, along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information; the full responsibilities match the preceding Enterprise Data Architecture listing. Mandatory Skills: Data Governance. Experience: 8-10 Years.
Posted 2 months ago
5.0 - 22.0 years
0 Lacs
karnataka
On-site
As an Enterprise Architect - Director at Ivy Mobility, you will be a key member of our core product development team, reporting directly to the VP of Engineering. Your primary responsibilities will involve designing processes, enforcing relevant architectural patterns, and optimizing the IT infrastructure to support business operations efficiently. Your role will also include driving technology initiatives, identifying areas for technical improvement, and contributing to the continuous enhancement of our product stack. You will drive innovations and product improvements by analyzing opportunities for technical enhancements and actively participating in backlog grooming and implementation to strengthen the product's robustness and future-readiness. You will provide guidance on business architecture and systems processing, proactively engage with cross-functional teams to address challenges faced by internal teams or customers, and analyze product usage data to optimize operational costs and enhance application maintainability. Furthermore, you will identify bottlenecks in applications and develop solutions to ensure optimal performance, quality, scalability, security, and responsiveness. You will also suggest cost-saving ideas to IT and business leadership, create business architecture models aligned with product strategy, and foster employee knowledge and skills development for future growth. To be successful in this role, you should hold a Bachelor's degree in Information Technology, along with 18-22 years of work experience, including a minimum of 5 years as an Enterprise Architect or in a similar role within a SaaS product development organization. Proficiency in Microsoft .NET Core/Framework, experience in Python, extensive knowledge of SQL Server or other RDBMS, hands-on experience with AWS or Azure cloud platforms, and expertise in complex architectures and microservices frameworks are essential skill sets for this position. Additionally, a background in Metadata Management, experience with Elasticsearch, proficiency in Docker, Kubernetes, and API Gateway, familiarity with data modeling and graphical representations, and full-stack development experience are highly preferred. Your ability to drive technology roadmaps, conduct research on new tools and technologies, and demonstrate effective leadership, motivational skills, strong communication, and interpersonal abilities will be crucial for excelling in this role.
Posted 2 months ago
6.0 - 10.0 years
10 - 13 Lacs
Chennai, Bengaluru
Hybrid
Job Title: Senior Java Lead - Compiler & Native Code Integration
Location: Chennai
Work Mode: Hybrid (WFO/WFH optional)
Job Summary
We are seeking an exceptional and deeply technical Senior Java Lead to pioneer the future of Java performance and extensibility. You will spearhead the architecture and development of our ahead-of-time (AOT) compilation pipeline, translating Java source code and its extensions into high-performance native executables. This role centers on leveraging the JLang project (an LLVM backend for the Polyglot extensible compiler framework) to build a robust, next-generation toolchain. The ideal candidate is a systems-level programmer with a profound understanding of Java language internals, compiler design, and the LLVM ecosystem. You will not only lead a team of talented engineers but will be the principal architect for our compiler, its runtime system, and its language extension capabilities. This is a unique opportunity to solve fundamental challenges in programming language implementation and shape a technology that bridges the managed world of Java with the raw power of native code.
Key Responsibilities
1. Architectural Ownership & Technical Vision
• Design the Core Compiler Pipeline: Architect the end-to-end translation process from Java source -> Polyglot AST -> LLVM IR -> native executable. Make critical decisions on intermediate representations, code generation strategies, and toolchain integration.
• Lead Language Extensibility Strategy: Define the framework for creating and integrating custom language features using Polyglot. Determine when features should be "desugared" to standard Java versus requiring direct, custom LLVM IR generation for maximum performance and control.
• Set the Bar for Systems Programming: Champion rigorous coding standards, design patterns, and best practices for compiler development. Mentor and upskill developers in the complexities of LLVM, runtime systems, and language semantics.
2. Compiler and LLVM Toolchain Development
• Master Java-to-LLVM Translation: Oversee the implementation of mappers from core Java semantics to LLVM IR, including the object model, virtual method dispatch (e.g., vtables), and exception handling (e.g., invoke/landingpad instructions).
• Drive Code Optimization: Collaborate with LLVM specialists to leverage advanced optimization passes (e.g., LTO, PGO). Profile and analyze the generated LLVM IR to minimize binary size and maximize execution speed.
• Ensure OpenJDK Compatibility: Guarantee that AOT-compiled code can correctly link against and interoperate with the native components and class libraries of a standard OpenJDK distribution (e.g., for file I/O and networking).
3. Runtime System Design and Implementation
• Engineer a Lightweight Java Runtime: Lead the development of a minimal, efficient runtime system to support features that the JVM traditionally provides.
• Solve the Hard Problems of AOT: Design and implement robust solutions for core JVM features in a native context:
o Garbage Collection: Integrate and configure a GC (e.g., Boehm-Demers-Weiser) or lead the design of a custom memory manager suitable for AOT-compiled code.
o Reflection: Develop a strategy to support java.lang.reflect by generating the necessary metadata at compile time to be consumed by the runtime.
o Concurrency: Implement support for Java's memory model and the synchronized keyword using native OS-level primitives (e.g., mutexes, atomics).
4. Build, Deployment, and Developer Enablement
• Automate the Compiler Toolchain: Design and maintain a sophisticated CI/CD pipeline that automates the multi-stage build, cross-compilation, testing, and packaging of the entire system.
• Create World-Class Documentation: Author and maintain in-depth technical documentation, including architectural diagrams, Getting Started guides, and tutorials for developing new language extensions.
• Foster a Collaborative Environment: Act as the primary technical liaison between the compiler, runtime, and application teams, ensuring alignment and resolving deep technical challenges.
Qualifications & Experience
• Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
• 10+ years of professional software engineering experience, with a strong emphasis on Java.
• Minimum of 3+ years in systems-level programming, such as compiler development, runtime systems, operating systems, or embedded systems.
• Deep understanding of Java internals: fluency in the Java Language Specification (JLS), Java bytecode, the JVM's internal architecture, class loading, and JIT compilation principles.
• Proven experience with LLVM: demonstrable ability to generate, optimize, and debug LLVM Intermediate Representation (IR). Familiarity with the LLVM C++ API is a strong plus.
• Excellent leadership qualities, with experience mentoring junior engineers and leading complex technical projects.
• Exceptional problem-solving skills and the ability to navigate ambiguous, highly complex technical domains.
Preferred Skills (Desirable)
• Direct experience with the Polyglot compiler framework or a similar extensible compiler project (e.g., Roslyn, Clang plugins).
• Familiarity with the challenges of ahead-of-time (AOT) compilation for Java, including knowledge of projects like GraalVM Native Image or OpenJDK's Project Leyden.
• Hands-on experience with runtime system components, particularly garbage collection and reflection mechanisms.
• Active contributor to open-source compiler, toolchain, or language projects (please include links to your GitHub or relevant work).
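For context on the virtual method dispatch work named under Java-to-LLVM translation: an AOT compiler lowers a virtual call to a vtable load plus an indexed slot lookup. The Python sketch below models that mechanism conceptually; it is not JLang or LLVM output, and the class and slot names are hypothetical.

```python
# Conceptual model of virtual method dispatch as an AOT compiler lowers it:
# each class gets a vtable (an array of method slots), each object carries a
# reference to its class's vtable, and a virtual call is an indexed lookup.
SLOT_TO_STRING = 0  # slot indices are fixed at compile time

VTABLES = {
    "Object": ["Object.toString"],
    "Point":  ["Point.toString"],   # override occupies the same slot 0
}

IMPLS = {
    "Object.toString": lambda self: "<object>",
    "Point.toString":  lambda self: f"({self['x']}, {self['y']})",
}

def new(cls: str, **fields) -> dict:
    """Allocate an 'object': a vtable reference (header word) plus fields."""
    return {"__vtable__": VTABLES[cls], **fields}

def vcall(obj: dict, slot: int):
    """Lowered form of obj.toString(): load vtable, index slot, call target."""
    return IMPLS[obj["__vtable__"][slot]](obj)

p = new("Point", x=1, y=2)
print(vcall(p, SLOT_TO_STRING))  # -> (1, 2), resolved through Point's override
```

Because every override occupies the same slot index as the method it overrides, the compiled call site never needs to know the dynamic type; that invariant is exactly what the compiler's vtable layout pass must guarantee.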
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As an AEM Developer with over 6 years of experience, you will work with AEM Assets, the digital asset management system integrated with Adobe Experience Manager (AEM). AEM Assets enables organizations to efficiently store, organize, and manage digital assets such as images, videos, documents, and multimedia files in a centralized repository. By leveraging the features of AEM Assets, you will streamline content creation and delivery processes, enhance collaboration between teams, and ensure effective management of digital assets.
Key features of AEM Assets include:
- Asset Management: Easily store, organize, and retrieve digital assets in a centralized repository, supporting various file formats and integrating with Adobe Creative Cloud tools.
- Metadata Management: Attach metadata to assets for improved searchability and categorization, including descriptions, tags, and custom fields.
- Version Control: Track changes made to assets over time with versioning capabilities, allowing for easy restoration of previous asset versions.
- Dynamic Media: Create and deliver dynamic media in different formats and resolutions, optimized for various devices without manual intervention.
- Smart Tagging: Utilize Adobe Sensei to automatically generate tags for assets based on content, facilitating faster and more accurate asset discovery.
- Workflow Automation: Automate approval workflows, asset updates, and content delivery processes to enhance efficiency and reduce manual tasks.
- Integration with Adobe Creative Cloud: Seamlessly integrate with tools like Photoshop, Illustrator, and InDesign for efficient asset management.
- Role-based Access Control: Control access to assets with granular permission settings, ensuring secure access management.
- Content Delivery Network (CDN) Integration: Integrate with a CDN for fast and reliable global delivery of digital assets.
The benefits of using AEM Assets include improved collaboration, enhanced brand consistency, faster time to market, scalability for handling large volumes of assets, a better user experience through metadata and smart tagging, and centralized asset management. AEM Assets caters to marketing teams, content creators, and developers seeking an efficient way to manage digital assets across marketing and content delivery channels.
In this full-time on-site role at Programming.com in Bengaluru, you will work as an Adobe Experience Manager developer, collaborating with cross-functional teams to manage and optimize digital content, implement web solutions, and deliver exceptional user experiences. Qualifications include analytical skills, project management abilities, communication skills, customer service expertise, experience with content management systems like AEM, a strong understanding of web technologies, and a bachelor's degree in Computer Science, Information Technology, or a related field.
Join us at Programming.com, a leading software solution and digital transformation partner with a global presence, a dedicated team of over 2000 employees, and a commitment to core values, offering Agile development services to diverse clients across industries.
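To ground the metadata management feature: tag- and field-based asset search reduces to filtering records on their metadata. The sketch below is a generic Python illustration, not the AEM API; the paths, tags, and fields are hypothetical.

```python
# Generic illustration of metadata-driven asset search (not the AEM API):
# assets carry tags and custom fields, and a query filters on both.
assets = [
    {"path": "/assets/hero.jpg",  "tags": {"banner", "summer"}, "format": "jpg"},
    {"path": "/assets/promo.mp4", "tags": {"video", "summer"},  "format": "mp4"},
    {"path": "/assets/logo.svg",  "tags": {"brand"},            "format": "svg"},
]

def find_assets(items, required_tags=frozenset(), **fields):
    """Return assets carrying all required tags and matching every field filter."""
    return [a for a in items
            if required_tags <= a["tags"]
            and all(a.get(k) == v for k, v in fields.items())]

print(find_assets(assets, required_tags={"summer"}, format="jpg"))
# -> the hero.jpg record only
```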
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be the visionary Group Data Product Manager (GPM) for AI/ML & Metadata Management, responsible for leading the development of advanced AI/ML-powered metadata solutions. Your primary focus will be on establishing a cohesive, intuitive Data Platform that serves a variety of user roles, including data engineers, producers, and consumers. The role involves integrating various tools into a unified platform that significantly improves data discoverability, governance, and operational efficiency at scale.
Posted 2 months ago