
1380 Data Governance Jobs - Page 9

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

6.0 - 8.0 years

12 - 16 Lacs

Pune

Work from Office


We are hiring a Data Architect who can join as soon as possible. The ideal candidate should have 6 to 8 years of experience; we are flexible for a strong profile. The profile should have a strong foundation in data concepts, design, and strategy, with the ability to work across diverse technologies in a technology-agnostic manner.

Key Responsibilities

Transactional Database Architecture:
- Design and implement high-performance, reliable, and scalable transactional database architectures.
- Collaborate with cross-functional teams to understand transactional data requirements and create solutions that ensure data consistency, integrity, and availability.
- Optimize database designs and recommend best practices and technology stacks.
- Oversee the management of entire transactional databases, including modernization and de-duplication initiatives.

Data Lake Architecture:
- Design and implement data lakes that consolidate data from disparate sources into a unified, scalable storage solution.
- Architect and deploy cloud-based or on-premises data lake infrastructure.
- Ensure self-service capabilities across the data engineering space for the business.
- Work closely with Data Engineers, Product Owners, and Business teams.

Data Integration & Governance:
- Understand ingestion and orchestration strategies.
- Implement data sharing and data exchange, and assess data sensitivity and criticality to recommend appropriate designs.
- Basic understanding of data governance practices.

Innovation:
- Evaluate and implement new technologies, tools, and frameworks to improve data accessibility, performance, and scalability.
- Stay up to date with industry trends and best practices to continuously innovate and enhance the data architecture strategy.

Location: Pune | Brand: Merkle | Time Type: Full time | Contract Type: Permanent

Posted 4 days ago


10.0 - 14.0 years

45 - 55 Lacs

Bengaluru

Work from Office


As a Senior Engineering Manager - Myntra Data Platform, you will oversee the technical aspects of the data platform, driving innovation and ensuring efficient data management processes. Your role will have a significant impact on the organization's data strategy and overall business objectives.

Roles and Responsibilities:
- Lead and mentor a team of engineers to deliver high-quality data solutions.
- Develop and execute strategies for data platform scalability and performance optimization.
- Collaborate with cross-functional teams to align data platform initiatives with business goals.
- Define and implement best practices for data governance, security, and compliance.
- Drive continuous improvement through innovation and technological advancement.
- Monitor and analyze data platform metrics to identify areas for enhancement.
- Ensure seamless integration of new data sources and technologies into the platform.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10-14 years of experience in engineering roles with a focus on data management and analysis.
- Proven experience in leading high-performing engineering teams.
- Strong proficiency in data architecture, ETL processes, and database technologies.
- Excellent communication and collaboration skills to work effectively with stakeholders.
- Relevant certifications in data management or related fields are a plus.

Who are we? Myntra is India's leading fashion and lifestyle platform, where technology meets creativity. As pioneers in fashion e-commerce, we've always believed in disrupting the ordinary. We thrive on a shared passion for fashion, a drive to innovate and lead, and an environment that empowers each of us to pave our own way. We're bold in our thinking, agile in our execution, and collaborative in spirit. Here, we create MAGIC by inspiring vibrant and joyous self-expression and expanding fashion possibilities for India, while staying true to what we believe in.

We believe in taking bold bets and changing the fashion landscape of India. We are a company that is constantly evolving into newer and better forms, and we look for people who are ready to evolve with us. From our humble beginnings as a customization company in 2007 to being technology and fashion pioneers today, Myntra is going places, and we want you to take part in this journey with us. Working at Myntra is challenging but fun: we are a young and dynamic team, firm believers in meritocracy and equal opportunity, and we encourage intellectual curiosity and empower our teams with the right tools, space, and opportunities.

Posted 4 days ago


9.0 - 14.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Role & Responsibilities:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models in databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity-relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives, including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.

We are looking for a Data Modeler/Architect to design conceptual, logical, and physical data models, translating business needs into scalable models for analytics and operational systems. The candidate should be strong in normalization, denormalization, ERDs, and data governance practices; experience with star/snowflake schemas and medallion architecture is preferred. The role requires close collaboration with architects, engineers, and analysts.

Keywords: Data modelling, Normalization, Denormalization, Star and snowflake schemas, Medallion architecture, ERD, Logical data model, Physical data model, Conceptual data model
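The star/snowflake schemas this listing asks for pair a central fact table with surrounding dimension tables. A minimal, hypothetical sketch (table names and data are invented for illustration, and Python's built-in sqlite3 stands in for a real warehouse):

```python
import sqlite3

# Illustrative star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date VALUES (10, '2024-01-01'), (11, '2024-01-02');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# A typical analytical query joins the fact table to a dimension and aggregates.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 75.0), ('Widget', 150.0)]
```

A snowflake schema differs only in that dimensions are further normalized into sub-dimensions (e.g. product → category); the fact table is unchanged.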

Posted 4 days ago


8.0 - 12.0 years

10 - 14 Lacs

Chennai

Work from Office


Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.
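For context, a DAX measure is an aggregation that is re-evaluated under whatever filter context the report applies. A rough analogy in plain Python, with entirely hypothetical data (this is not DAX and not the Power BI API, just the underlying idea):

```python
# Hypothetical sales rows standing in for a Power BI data model table.
sales = [
    {"region": "North", "year": 2023, "amount": 120.0},
    {"region": "North", "year": 2024, "amount": 180.0},
    {"region": "South", "year": 2024, "amount": 90.0},
]

def measure_total_sales(rows, **filters):
    """Analogue of `Total Sales = SUM(Sales[Amount])` evaluated in a
    filter context; keyword arguments play the role of report filters."""
    return sum(r["amount"] for r in rows
               if all(r[k] == v for k, v in filters.items()))

print(measure_total_sales(sales))             # 390.0 (no filter)
print(measure_total_sales(sales, year=2024))  # 270.0 (CALCULATE-style filter)
```

In real DAX the filter context comes from slicers, visuals, and `CALCULATE` modifiers rather than explicit arguments, but the measure is the same idea: one aggregation, many contexts.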

Posted 4 days ago


5.0 - 7.0 years

10 - 14 Lacs

Kolkata

Work from Office


Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
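The knowledge graphs described here store facts as subject-predicate-object triples, which SPARQL then queries via graph patterns. A toy, pure-Python illustration of that triple model (entities and relations are invented; a real system would use RDF tooling and a triple store such as those named above):

```python
# A tiny in-memory "graph" of (subject, predicate, object) triples.
triples = {
    ("alice", "worksFor", "AcmeCorp"),
    ("AcmeCorp", "locatedIn", "Pune"),
    ("alice", "knows", "bob"),
}

def match(pattern):
    """SPARQL-style basic graph pattern match; None acts as a variable
    (like ?x in `SELECT ?p ?o WHERE { :alice ?p ?o }`)."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "What facts are asserted about alice?"
print(sorted(match(("alice", None, None))))
```

Real ontologies (BFO/CCO, OWL) add typed classes, property semantics, and inference on top of this triple substrate, but pattern matching over triples is the core query primitive.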

Posted 4 days ago


7.0 - 10.0 years

5 - 8 Lacs

Chennai

Work from Office


Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
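Among the must-have SQL skills above are window functions. A small, self-contained illustration using Python's built-in sqlite3 (schema and data are hypothetical; the Snowflake syntax for this particular query is essentially identical):

```python
import sqlite3

# Hypothetical orders table used to demonstrate a windowed running total.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_day INTEGER, amount REAL);
INSERT INTO orders VALUES
  ('a', 1, 10.0), ('a', 2, 20.0), ('b', 1, 5.0), ('b', 3, 15.0);
""")

# Running total per customer: SUM over a window partitioned by customer,
# ordered by day. Unlike GROUP BY, every input row survives the query.
rows = conn.execute("""
    SELECT customer, order_day,
           SUM(amount) OVER (PARTITION BY customer ORDER BY order_day)
    FROM orders ORDER BY customer, order_day
""").fetchall()
print(rows)  # [('a', 1, 10.0), ('a', 2, 30.0), ('b', 1, 5.0), ('b', 3, 20.0)]
```

(Requires SQLite 3.25+, which ships with all recent Python builds.)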

Posted 4 days ago


5.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office


Job Title: Sr. Data Engineer, Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.

Posted 4 days ago


8.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 4 days ago


8.0 - 12.0 years

10 - 14 Lacs

Kolkata

Work from Office


Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 4 days ago


7.0 - 9.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


About KPI Partners: KPI Partners is a leading provider of enterprise software solutions, enabling organizations to harness the power of data through innovation and advanced analytics. Our team is dedicated to delivering high-quality services and products that transform the way businesses operate.

We are seeking a seasoned Senior Report Developer - Tableau to drive and manage a strategic migration project from Tableau to Power BI. This role requires strong hands-on experience with Tableau dashboards, a deep understanding of Power BI, and the technical leadership to manage the end-to-end migration with minimal disruption to business users. You will serve as the bridge between business stakeholders, BI developers, and project managers, ensuring a seamless transition while maintaining data integrity, performance, and user experience.

Key Responsibilities:
- Lead the assessment, planning, and execution of the Tableau-to-Power BI migration.
- Analyze existing Tableau dashboards, data models, and data sources to define the migration scope and approach.
- Translate Tableau visualizations and logic (calculated fields, filters, LOD expressions, etc.) into Power BI equivalents.
- Oversee data model optimization, dashboard redesign, and user experience improvements during migration.
- Collaborate with business stakeholders to validate migrated reports and ensure alignment with reporting needs.
- Manage a team of BI developers and provide technical direction and code reviews.
- Ensure proper version control, documentation, and change management throughout the migration.
- Establish performance benchmarks and implement best practices for Power BI deployment.
- Conduct knowledge transfer and training sessions for end-users transitioning to Power BI.

Required Qualifications:
- 7+ years of experience in Business Intelligence and Analytics.
- 5+ years of hands-on experience with Tableau development (dashboards, data blending, LODs, parameters).
- 2+ years of experience with Power BI, including DAX, Power Query, and data modeling.
- Proven experience in BI migration projects (preferably Tableau to Power BI).
- Strong SQL skills and experience working with relational databases (SQL Server, Oracle, etc.).
- Solid understanding of data governance, security, and access controls.
- Excellent communication, leadership, and stakeholder management skills.

Why Join KPI Partners?
- Opportunity to work with a talented and passionate team in a fast-paced environment.
- Competitive salary and benefits package.
- Continuous learning and professional development opportunities.
- A collaborative and inclusive workplace culture that values diversity and innovation.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Join us at KPI Partners and help us unlock the power of data for our clients!

Posted 4 days ago


12.0 - 15.0 years

16 - 20 Lacs

Chennai

Work from Office


Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization.
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization.
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions.
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage.
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions.
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements.
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.

Posted 4 days ago


12.0 - 15.0 years

15 - 19 Lacs

Chennai

Work from Office


Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization.
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization.
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions.
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage.
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions.
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements.
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.

Posted 4 days ago


3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office


Duration: 6 Months
Notice Period: within 15 days or immediate joiner
Experience: 3-6 Years

About The Role:
As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets. This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data. You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization. This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting data analytics and reporting infrastructure for business growth.

Note: Must be ready for a face-to-face interview in Bangalore (last round). Should be working with Azure as the cloud technology.

Key Responsibilities:

ETL Development:
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.

Data Pipeline Architecture:
- Architect, build, and maintain scalable and high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis.

Data Modeling:
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.

Data Quality and Governance:
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.

Documentation and Communication:
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business teams.

Collaboration:
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.

Essential Requirements

Basic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Familiarity with big data technologies like Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Excellent problem-solving, analytical, and communication skills.
- Proven experience as a Data Engineer with expertise in ETL techniques (minimum years).
- 3-6 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Deep understanding of data warehousing concepts.

Preferred Qualifications:
- Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.
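The extract-validate-transform cycle this role describes can be sketched minimally in plain Python. This is an illustration only, not the employer's codebase: the field names and validation rules are invented for the example.

```python
import csv
import io

# Hypothetical required fields for a market-data feed (an assumption for illustration).
REQUIRED_FIELDS = {"ticker", "date", "close"}

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def validate(rows: list[dict]) -> list[dict]:
    """Validate: keep only rows where every required field is present and non-empty."""
    return [r for r in rows if all(r.get(f) for f in REQUIRED_FIELDS)]

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast prices to float and normalise ticker case."""
    return [
        {"ticker": r["ticker"].upper(), "date": r["date"], "close": float(r["close"])}
        for r in rows
    ]

def run_pipeline(raw_csv: str) -> list[dict]:
    return transform(validate(extract(raw_csv)))

raw = "ticker,date,close\naapl,2024-01-02,185.6\n,2024-01-02,\n"
print(run_pipeline(raw))  # the malformed second row is dropped
```

In a production Azure setting the same three stages would typically run inside an orchestrated service (e.g. Data Factory or Databricks jobs) rather than a single script, but the quality-gate pattern is the same.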

Posted 4 days ago

Apply

4.0 - 6.0 years

6 - 11 Lacs

Noida

Work from Office

Naukri logo

Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives, and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.

Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing on-premise relational databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI, and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI).
- Strong knowledge of and experience in providing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge of and experience in best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.

Posted 4 days ago

Apply

3.0 - 8.0 years

9 - 18 Lacs

Bengaluru

Work from Office

Naukri logo

3+ years of experience in data governance projects.
* Import technical metadata from different sources into the EDC environment.
* Ensure population of data lineage between tables and fields.
* Create custom lineage.
* Should have knowledge of the various APIs required to extract details from EDC.
* Should be able to analyse IICS mappings and create lineage mappings accordingly.
* Should have knowledge of SQL and be able to analyse view definitions.
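Analysing a view definition for lineage, as the last point describes, amounts to mapping a view back to the tables it reads from. A regex-based sketch of the idea follows (a simplification: real catalog tools such as Informatica EDC parse SQL fully, and the view text here is invented):

```python
import re

def view_lineage(view_ddl: str) -> dict:
    """Extract a rough table-level lineage map from a CREATE VIEW statement.
    Captures the view name plus the tables referenced in FROM/JOIN clauses."""
    name = re.search(r"CREATE\s+VIEW\s+(\w+)", view_ddl, re.IGNORECASE)
    sources = re.findall(r"(?:FROM|JOIN)\s+(\w+)", view_ddl, re.IGNORECASE)
    return {"view": name.group(1) if name else None,
            "sources": sorted(set(sources))}

ddl = """
CREATE VIEW customer_orders AS
SELECT c.id, o.total
FROM customers c
JOIN orders o ON o.customer_id = c.id
"""
print(view_lineage(ddl))  # {'view': 'customer_orders', 'sources': ['customers', 'orders']}
```

The resulting view-to-source mapping is the raw material for the "custom lineage" links an EDC analyst loads between tables and fields.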

Posted 4 days ago

Apply

3.0 - 6.0 years

27 - 42 Lacs

Chennai

Work from Office

Naukri logo

1. Job Title: GeoSpatial Sr. Analyst MAP
2. Job Summary: The GeoSpatial Sr. Analyst MAP will play a crucial role in analyzing and interpreting geospatial data to support strategic decision-making. With a focus on Networx technologies, the analyst will ensure the effective use of geospatial data in various projects. The role requires a blend of technical expertise and analytical skills to drive impactful outcomes in a hybrid work model.
3. Experience: 3-6 years
4. Required Skills:
Technical Skills: Networx - Pricer, Networx - Facets Pricer, Networx - Modeler, NetworX
Domain Skills:
5. Nice to have skills:
Domain Skills:
6. Technology: Custom Service
7. Shift: Day
8. Responsibilities:
- Analyze geospatial data using advanced Networx tools to support strategic business decisions.
- Collaborate with cross-functional teams to integrate geospatial insights into project planning and execution.
- Develop and maintain geospatial databases to ensure data accuracy and accessibility.
- Provide detailed reports and visualizations to communicate geospatial findings to stakeholders.
- Utilize Networx - Pricer and Networx - Facets Pricer to optimize pricing strategies and models.
- Implement Networx - Modeler to simulate and predict geospatial trends and patterns.
- Ensure compliance with data governance and security protocols in all geospatial analyses.
- Conduct regular audits of geospatial data to maintain high-quality standards.
- Support the development of geospatial strategies that align with organizational goals.
- Train team members on the effective use of geospatial tools and technologies.
- Monitor industry trends to keep the organization at the forefront of geospatial innovation.
- Facilitate workshops and presentations to share geospatial insights with internal and external audiences.
- Contribute to the continuous improvement of geospatial processes and methodologies.
Qualifications:
- Possess a strong background in geospatial analysis with experience in Networx technologies.
- Demonstrate proficiency in Networx - Pricer, Networx - Facets Pricer, and Networx - Modeler.
- Exhibit excellent analytical and problem-solving skills.
- Have a minimum of 3 years of experience in a similar role.
- Show ability to work effectively in a hybrid work environment.
- Display strong communication skills to convey complex geospatial concepts.
- Be detail-oriented with a focus on data accuracy and integrity.
9. Job Location:
Primary Location: INTNCHNA16 (ITIND COG KITS Campus (CKC) SDB2&3 SEZ)
Alternate Location: NA
Alternate Location 1: NA
10. Job Type: Business Associate [60CW00]
11. Demand Requires Travel?: No
12. Certifications Required: N/A

Posted 4 days ago

Apply

12.0 - 17.0 years

5 - 9 Lacs

Mumbai

Work from Office

Naukri logo

Project Role: Business Function Implement Practitioner
Project Role Description: Support the implementation of activities for a specific business function to improve performance for that function end to end. Activities include analyzing and designing/re-designing business processes and/or defining parts of an organization.
Must have skills: SAP Master Data Governance MDG Tool
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: MBA

Summary: As a Business Function Implement Practitioner, you will support the implementation of activities for a specific business function to improve performance end to end. You will be involved in analyzing and designing/re-designing business processes and defining parts of an organization in Mumbai.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the implementation of the SAP Master Data Governance MDG Tool.
- Provide expertise in optimizing business processes.
- Contribute to the strategic direction of the organization.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP Master Data Governance MDG Tool.
- Strong understanding of data governance principles.
- Experience in implementing SAP MDG solutions.
- Knowledge of SAP ERP systems.
- Familiarity with data modeling and data management best practices.

Additional Information:
- The candidate should have a minimum of 12 years of experience with the SAP Master Data Governance MDG Tool.
- This position is based at our Mumbai office.
- An MBA degree is required.

Qualification: MBA

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular knowledge sharing sessions within the team
- Stay updated on the latest industry trends and technologies

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Ab Initio
- Strong understanding of ETL processes
- Experience in data warehousing concepts
- Hands-on experience with data integration tools
- Knowledge of data quality and data governance principles

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years full time education

Posted 4 days ago

Apply

5.0 - 7.0 years

11 - 15 Lacs

Coimbatore

Work from Office

Naukri logo

Mandatory Skills: Apache Spark, Hive, Hadoop, Scala, Databricks

The Role:
- Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights.
- Constructing infrastructure for efficient ETL processes from various sources and storage systems.
- Leading the implementation of algorithms and prototypes to transform raw data into useful information.
- Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
- Creating innovative data validation methods and data analysis tools.
- Ensuring compliance with data governance and security policies.
- Interpreting data trends and patterns to establish operational alerts.
- Developing analytical tools, programs, and reporting mechanisms.
- Conducting complex data analysis and presenting results effectively.
- Preparing data for prescriptive and predictive modeling.
- Continuously exploring opportunities to enhance data quality and reliability.
- Applying strong programming and problem-solving skills to develop scalable solutions.

Requirements:
- Experience with Big Data technologies (Hadoop, Spark, NiFi, Impala).
- 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
- High proficiency in Scala/Java and Spark for applied large-scale data processing.
- Expertise with big data technologies, including Spark, Data Lake, and Hive.
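"Interpreting data trends and patterns to establish operational alerts" typically means flagging metric values that deviate sharply from recent history. A minimal sketch of one common approach, a rolling z-score check (plain Python rather than Spark so it stays self-contained; the window size, threshold, and series are invented for illustration):

```python
from statistics import mean, stdev

def alert_on_anomaly(values, window=5, z_threshold=3.0):
    """Flag indices where a value deviates more than z_threshold
    standard deviations from the mean of the preceding window."""
    alerts = []
    for i in range(window, len(values)):
        base = values[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and abs(values[i] - mu) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# A steady metric with one spike at index 7.
series = [10, 11, 10, 12, 11, 10, 11, 95, 11, 10]
print(alert_on_anomaly(series))  # [7]
```

In a Spark pipeline the same logic would run over a windowed aggregation instead of a Python list, but the alert condition is identical.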

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Reltio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will collaborate with key stakeholders, data owners, and architects to model existing and new data, ensuring data integrity and accuracy.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data models for current and future data needs.
- Collaborate with business representatives to understand data requirements.
- Implement data modeling best practices to ensure data quality and consistency.
- Provide data modeling expertise and guidance to the team.
- Contribute to data governance initiatives and compliance efforts.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Reltio.
- Strong understanding of data modeling concepts and techniques.
- Experience with data modeling tools and techniques.
- Knowledge of data governance principles and practices.
- Experience in data analysis and interpretation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Reltio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 4 days ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Kolkata

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Informatica MDM
- Strong understanding of data management principles and practices
- Experience in designing and implementing data integration solutions
- Knowledge of data quality and data governance concepts
- Experience with data modeling and database design
- Good To Have Skills: Experience with ETL tools such as Informatica PowerCenter
- Experience with data migration and data synchronization
- Familiarity with master data management best practices

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica MDM
- This position is based in Kolkata
- 15 years of full-time education is required

Qualification: 15 years full time education

Posted 4 days ago

Apply

7.0 - 11.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Skill required: Delivery - HR Analytics
Designation: I&F Decision Sci Practitioner Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture:
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do?
Data & AI: A set of tasks to provide insights about the effectiveness of HR processes, procedures, and policies; help make data-driven decisions based on the information collected; and help HR move from an operational to a tactical or strategic partner.

What are we looking for?
- SAP SuccessFactors
- SAP SuccessFactors Reporting
- Microsoft Excel
- Data Analysis
- Structured Query Language (SQL)
- Written and verbal communication
- Ability to manage multiple stakeholders
- Hands-on experience with troubleshooting
- Problem-solving skills
- Strong analytical skills
- People management
- A collaborative and customer-focused mindset
- Ability to manage multiple priorities and deadlines
- Excellent communication and presentation skills
- Strong attention to detail and accuracy
- Experience with predictive analytics and statistical modeling techniques
- Familiarity with workforce planning and compensation analysis
- Knowledge of compliance standards and HR data governance

Roles and Responsibilities:
- Collaboration & Stakeholder Engagement: Partner with HR teams (Talent Acquisition, L&D, Compensation & Benefits) to understand their data needs. Provide training to HR staff on how to interpret and use analytics tools and reports effectively.
- Compliance & Data Governance: Ensure HR data reporting complies with company policies and legal regulations (e.g., GDPR, EEOC). Establish best practices for data governance, accuracy, and security.
- Predictive Analytics & Workforce Planning: Use statistical methods to predict workforce trends such as attrition and hiring needs. Support workforce planning and budgeting processes with data-driven insights.
- HR Systems Integration: Collaborate with IT, HR, and external vendors to ensure seamless data integration across HRIS, payroll, and other HR platforms. Conduct audits to ensure the accuracy and integrity of HR data.
- Data Visualization: Design and maintain interactive dashboards using tools like Power BI, Tableau, or Excel for HR teams and leadership. Simplify complex data into user-friendly formats for non-technical audiences.
- HR Metrics & KPIs: Define and monitor key HR performance indicators (KPIs) aligned with organizational goals. Identify trends, risks, and opportunities within HR metrics.
- Data Reporting & Analysis: Collect, analyze, and interpret HR data from multiple sources (HRIS, ATS, payroll systems, etc.). Develop and maintain standardized and ad-hoc HR reports and dashboards to track metrics such as turnover, hiring, diversity, engagement, and performance. Deliver actionable insights to HR leadership and business units.

Qualification: Any Graduation
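An HR KPI such as the turnover rate tracked above reduces to simple arithmetic once the inputs are defined. A minimal sketch using the common average-headcount convention (the headcount figures are invented for illustration):

```python
def turnover_rate(separations: int, start_headcount: int, end_headcount: int) -> float:
    """Annual turnover rate: separations divided by average headcount
    over the period, expressed as a percentage."""
    avg_headcount = (start_headcount + end_headcount) / 2
    return round(100 * separations / avg_headcount, 1)

# e.g. 18 leavers in a year, with headcount moving from 120 to 130.
print(turnover_rate(18, 120, 130))  # 14.4
```

Real HR analytics work mostly lies in sourcing the inputs consistently across HRIS, ATS, and payroll systems; the KPI formula itself stays this simple.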

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Chennai

Work from Office

Naukri logo

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular team meetings to discuss progress and challenges
- Stay updated on industry trends and best practices

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Ab Initio
- Strong understanding of ETL processes
- Experience with data integration and data warehousing
- Knowledge of data quality and data governance principles
- Hands-on experience with Ab Initio GDE and EME tools

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Chennai office
- 15 years of full-time education is required

Qualification: 15 years full time education

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Azure Databricks, PySpark, Core Java
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and streamline processes.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the development and implementation of new applications
- Conduct code reviews and ensure coding standards are met
- Stay updated on industry trends and best practices

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform
- Good To Have Skills: Experience with PySpark
- Strong understanding of data engineering concepts
- Experience in building and optimizing data pipelines
- Knowledge of cloud platforms like Microsoft Azure
- Familiarity with data governance and security practices

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years full time education

Posted 4 days ago

Apply

8.0 - 10.0 years

12 - 15 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Naukri logo

We are seeking an SAP MDG Consultant with 8-10 years of experience in SAP MDG (Master Data Governance). The consultant should have a strong techno-functional background with expertise in MDG data model builds and the Business Partner, Finance, and MM domains. The role involves implementing MDG BRF+, managing mass changes, and understanding the Data Replication Framework. Knowledge of data distribution using BD64, Partner Profiles, and RFCs is required. Exposure to ALE IDoc for master data and debugging skills in SAP (especially ABAP) are a big plus. The candidate should be comfortable with remote work and be willing to travel to Manila in January to run onboarding sessions for the new support team.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad, Remote

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

