
1406 Data Governance Jobs - Page 23

JobPe aggregates results for easy access; applications are submitted directly on the original job portal.

7.0 - 10.0 years

8 - 14 Lacs

Nagpur

Remote


Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
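The SQL window-function expertise the posting asks for can be illustrated with a small sketch. This is not part of the posting: it emulates `SUM(amount) OVER (PARTITION BY region ORDER BY month)` in plain Python, over hypothetical sales rows.

```python
from collections import defaultdict

def running_total_per_partition(rows, partition_key, order_key, value_key):
    """Emulate SQL SUM(value) OVER (PARTITION BY ... ORDER BY ...):
    each row gains a cumulative sum computed within its partition."""
    ordered = sorted(rows, key=lambda r: (r[partition_key], r[order_key]))
    totals = defaultdict(float)
    out = []
    for r in ordered:
        totals[r[partition_key]] += r[value_key]
        out.append({**r, "running_total": totals[r[partition_key]]})
    return out

# Hypothetical rows: region is the partition, month the ordering column.
sales = [
    {"region": "west", "month": 1, "amount": 10.0},
    {"region": "west", "month": 2, "amount": 5.0},
    {"region": "east", "month": 1, "amount": 7.0},
]
result = running_total_per_partition(sales, "region", "month", "amount")
# The second "west" row accumulates 10.0 + 5.0.
```

In a warehouse such as Snowflake the database computes this directly; the sketch only shows the partition-then-order semantics behind the syntax.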

Posted 2 weeks ago


6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Job Opening: Senior Data Engineer (Remote, Contract, 6 Months)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in data engineering.
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
- Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance.
- Agile, SDLC, containerization (Docker), clean coding practices.

Good-to-Have Skills:
- Event Hubs, Logic Apps.
- Power BI.
- Strong logic building and a competitive programming background.

Mode: Remote | Duration: 6 Months
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
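"Clean, organized, reusable" pipeline code usually means composing small, named transformation steps rather than one monolithic script. As a hedged illustration (plain Python standing in for PySpark DataFrame chains, with made-up step names), the pattern looks like:

```python
from typing import Callable, Iterable

Row = dict
Transform = Callable[[Iterable[Row]], Iterable[Row]]

def pipeline(*steps: Transform) -> Transform:
    """Compose transformation steps into one reusable pipeline,
    mirroring how chained DataFrame transforms are organized."""
    def run(rows: Iterable[Row]) -> Iterable[Row]:
        for step in steps:
            rows = step(rows)
        return rows
    return run

def drop_nulls(field: str) -> Transform:
    # Keep only rows where the field is present (analogous to a filter step).
    return lambda rows: (r for r in rows if r.get(field) is not None)

def rename(old: str, new: str) -> Transform:
    # Rename a column on every row (analogous to withColumnRenamed).
    return lambda rows: (
        {**{k: v for k, v in r.items() if k != old}, new: r.get(old)}
        for r in rows
    )

# Hypothetical ETL: drop records with no id, then rename "amt" to "amount".
etl = pipeline(drop_nulls("id"), rename("amt", "amount"))
cleaned = list(etl([{"id": 1, "amt": 9}, {"id": None, "amt": 3}]))
```

Each step is independently testable and reusable across pipelines, which is the property interviewers for roles like this tend to probe.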

Posted 2 weeks ago


7.0 - 10.0 years

8 - 14 Lacs

Ahmedabad

Remote


Employment Type: Contract (Remote). The role description is identical to the Data Engineer / Data Modeler (Snowflake, DBT, GCP) listing above; this opening is based in Ahmedabad.

Posted 2 weeks ago


1.0 - 3.0 years

5 - 8 Lacs

Bengaluru

Work from Office


The opportunity
At Hitachi Energy, we are building a future-ready data ecosystem. As a Data Governance Specialist, you will be a key enabler in shaping and operationalizing our enterprise-wide data governance framework. You will focus on the implementation and evolution of our Data Catalog, Metadata Management, and Data Compliance initiatives, ensuring our data assets are trusted, discoverable, and aligned with business value. This role is ideal for early-career professionals with a can-do mindset and a passion for making things happen. You will work in a dynamic, cross-functional environment that values curiosity, ownership, and ethical leadership.

How you'll make an impact
Data Catalog & Compliance Implementation (learning by doing):
- Define and maintain the roadmap for the Enterprise Data Catalog and Data Supermarket.
- Configure and execute deployment of cataloging tools (e.g., metadata management, lineage, glossary).
- Ensure alignment with DAMA-DMBOK principles.
Governance Framework Execution:
- Collaborate with Data Owners, Stewards, and Custodians to define and enforce data policies, standards, and the RACI model.
- Support the Data Governance Council and contribute to the development of governance artifacts (e.g., roles, regulations, KPIs).
Data Quality Stewardship:
- Partner with domain experts to drive data profiling, cleansing, and validation initiatives.
- Monitor data quality metrics and support remediation efforts across domains.
Stakeholder Engagement & Enablement:
- Provide training and support to business users on catalog usage and governance practices.
- Act as a liaison between business and IT to ensure data needs are met and governance is embedded in operations.
Innovation & Continuous Improvement:
- Stay current with industry trends and tool capabilities (e.g., Databricks, SAP MDG).
- Propose enhancements to governance processes and tooling based on user feedback and analytics.

Your background
- Bachelor's degree in Information Systems, Data Science, Business Informatics, or a related field.
- 1-3 years of experience in data governance, data management, or analytics roles.
- Familiarity with the DAMA-DMBOK2 framework and data governance tools (e.g., SAP MDG, DataSphere, Business Warehouse, Data Intelligence, Informatica ETL).
- Strong communication and collaboration skills; ability to work across business and technical teams.
- Proactive, solution-oriented, and eager to learn; ready to make it happen.
- Comfort with autonomy and ambiguity is a competitive advantage.
- Preference for candidates with a new-technology focus and a non-stop learning, embrace-the-challenge attitude.
- CDMP certification is preferred.

More about us
When joining us you may expect:
- A purpose-driven role in a global energy leader committed to sustainability and digital transformation.
- Mentorship and development opportunities within a diverse and inclusive team.
- Cutting-edge initiatives and technologies.
- A culture that values integrity, curiosity, and collaboration, aligned with Hitachi Energy's Leadership Pillars: Lead with Purpose, Create Customer Value, Drive Results, Build Collaboration, Develop Self & Others.

Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.

Posted 2 weeks ago


9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid


JD: Senior Snowflake Data Architect
Designs, implements, and optimizes data solutions within the Snowflake cloud data platform, ensuring data security, governance, and performance, while collaborating with cross-functional teams and providing technical leadership. The data architect's responsibilities include determining a data strategy, understanding data management technologies, overseeing the data inventory, and maintaining a finger on the pulse of the organization's data management systems.

Posted 2 weeks ago


4.0 - 8.0 years

15 - 20 Lacs

Hyderabad, Gurugram

Work from Office


- Hands-on experience with the OneTrust platform, including modules for Privacy Management, Data Governance, and Risk & Compliance.
- Ability to troubleshoot and support OneTrust workflows, integrations, and automation rules in a production environment.
- Experience managing incident response workflows, data subject access requests (DSARs), and third-party risk assessments using OneTrust.
- Proficiency in analyzing logs, audit trails, and system alerts generated by OneTrust to identify and resolve operational issues.
- Understanding of OneTrust API integrations and ability to support connected systems (e.g., CRM, ticketing tools, cloud platforms).
- Exposure to OneTrust AI features such as Copilot or Agentic AI for automating compliance and governance tasks is a plus.
- Strong collaboration skills to work with compliance, legal, and IT teams to ensure platform stability and regulatory alignment.
- Familiarity and working experience with e-commerce projects.
- Working knowledge of the ServiceNow ITSM tool.
- Knowledge of production support processes and procedures.
- Ability to demonstrate functional and technical architecture knowledge, and to correlate between the two from past experience.
- Good exposure to ITIL processes such as Incident Management, Problem Management, and Knowledge Management.

Posted 2 weeks ago


13.0 - 20.0 years

25 - 40 Lacs

Chennai, Bengaluru

Work from Office


Position Overview
We are looking for a highly experienced and versatile Solution Architect (Data) to lead the solution design and delivery of next-generation data solutions for our BFS clients. The ideal candidate will have a strong background in data architecture and engineering, deep domain expertise in financial services, and hands-on experience with cloud-native data platforms and modern data analytics tools. The role will require architecting solutions across Retail, Corporate, Wealth, and Capital Markets, as well as Payments, Lending, and Onboarding journeys. Data analytics experience and exposure to the data regulatory domain will be a distinct advantage, as will hands-on experience enabling data solutions with AI and Gen AI.

Key Responsibilities
- Design and implement end-to-end data solutions for BFS clients, covering data engineering and analytics involving modern data stacks and concepts.
- Architect cloud-native data platforms using AWS, Azure, and GCP (certifications preferred).
- Build and maintain data models aligned with Open Banking, Open Finance, SCA, AISP, and PISP requirements.
- Enrich solution design by incorporating industry-standard data architectures using frameworks such as BIAN, and lead data standardization programs for banks.
- Enrich solution architecture by enabling the AI and Gen AI paradigm for data engineering, analytics, and data regulatory work.
- Deliver data solutions in domains like Core Banking, Payments, Lending, Customer Onboarding, Wealth, and Capital Markets.
- Collaborate with business and technology stakeholders to gather requirements and translate them into scalable data architectures.
- Design solutions and, if needed, be hands-on in developing lab-class Proof-of-Concepts (POCs) showcasing data-driven capabilities.
- Lead and contribute to RFX responses for banking and financial services clients and regulatory bodies across the UK and EMEA regions.
- Provide architectural leadership in data initiatives related to regulatory compliance and risk analytics; familiarity and working experience with regulatory software and platforms such as SAS, NICE Actimize, and Wolters Kluwer is preferred.

Required Skills & Experience
- 12-18 years of experience in IT with a focus on data solution architecture in the BFS domain.
- Strong delivery and development experience in the Retail, Corporate, Wealth, and Capital Markets banking domains.
- Deep understanding of data standards such as BIAN and experience implementing them in banking projects.
- Expertise in cloud platforms (AWS, Azure, GCP) and leveraging native services for data processing, storage, and analytics.
- Strong experience in building data models and data solutions for Open Banking, Open Finance, and regulatory needs including SCA, AISP, and PISP.
- Proficiency in data engineering pipelines and real-time/batch data processing.
- Experience in designing enterprise data lakes and data warehouses, and implementing data mesh and data lineage frameworks.
- Hands-on experience in developing rapid POCs and accelerators.

Primary Technical Skills
- Cloud platforms: AWS, Azure, GCP (certification preferred)
- Big data technologies: Hadoop, Spark, Databricks, Delta Lake
- Programming languages: Python, Scala, SQL
- Data engineering & pipelines: Apache Airflow, Kafka, Glue, Data Factory
- Data warehousing: Snowflake, Redshift, BigQuery, Synapse
- Visualization: Power BI, Tableau, Looker
- Data governance: data lineage, data cataloging, master data management
- Architecture concepts: data mesh, data fabric, event-driven architecture

Posted 2 weeks ago


4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Overview
The Senior Data Analyst serves as a subject matter expert who can lead efforts to analyze data with the goal of delivering insights that will influence our products and customers. This position reports to the Data Analytics Manager and works closely with members of our product and marketing teams, data engineers, and members of our Customer Success organization supporting client outreach efforts. The chief functions of this role are finding and sharing data-driven insights that deliver value to less technical audiences, and instilling best practices for analytics in the rest of the team.

Responsibilities
- Perform various data analysis functions to analyze data from a variety of sources, including external labor market data and research and internal datasets from our platforms.
- Incorporate information from a variety of systems to produce comprehensive and compelling narratives for thought-leadership initiatives and customer engagements.
- Demonstrate critical thinking: identify the story in context using multiple datasets and present results. A strong proficiency in data storytelling will be critical to success in this role.
- Understand principles of quality data visualization and apply them in Tableau to create and maintain custom dashboards for consumption by other employees.
- Find and investigate data quality issues, identify root causes, and recommend remedies to be implemented by the data scientists and engineers.
- Liaise with teams around our business to understand their problems, determine how our team can help, then use our database to produce the content they need.
- Identify data mapping and enrichment requirements. Familiarity with SQL, especially the logic behind different types of data joins and writing efficient queries, will be necessary.
- Consistently ensure that business is always conducted with integrity and that behavior aligns with iCIMS policies, procedures, and core competencies.

Additional Job Responsibilities:
- Produce and adapt data visualizations in response to business requests for internal and external use.
- Show good judgement in prioritizing their own commitments and those of the larger team, while demonstrating initiative and appropriate urgency when needed.
- Mentor junior team members in best practices for analytics, data visualization, and data storytelling; exemplify these standards and guide teammates in following them.
- Think creatively to produce unique, actionable insights from complex datasets, which can deliver value to our business and to our customers.

Qualifications
- 5-10 years of professional experience working in an analytics capacity.
- Excellent communication skills, especially with regard to data storytelling: finding insights from complex datasets and sharing those findings with key stakeholders.
- Strong data analytics and visualization skills.
- Expertise in Tableau Desktop (Tableau Server and Prep are preferable), producing clear and informative graphs and dashboards.
- Proficiency in SQL and either Python or R to extract and prepare data for analysis.
- Advanced knowledge of Excel (pivot tables, VLOOKUPs, IF statements).
- Familiarity with data guardrails to ensure compliance with applicable data governance regulations and privacy laws (e.g., GDPR).
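The "logic behind different types of data joins" that the posting calls out can be made concrete with a small sketch. This is an illustration, not part of the posting: a plain-Python analogue of a SQL LEFT JOIN over hypothetical customer and order rows.

```python
def left_join(left, right, key):
    """Plain-Python analogue of a SQL LEFT JOIN on a single key:
    every left row is kept; matching right rows contribute their columns.
    (An INNER JOIN would instead drop unmatched left rows.)"""
    index = {}
    for r in right:
        index.setdefault(r[key], []).append(r)
    joined = []
    for row in left:
        # [{}] as the default keeps unmatched left rows in the output;
        # in this sketch they simply lack the right-side columns.
        for match in index.get(row[key], [{}]):
            joined.append({**match, **row})  # left values win on key clashes
    return joined

# Hypothetical tables: one customer has an order, one does not.
customers = [{"cid": 1, "name": "Ann"}, {"cid": 2, "name": "Bo"}]
orders = [{"cid": 1, "total": 50}]
rows = left_join(customers, orders, "cid")
```

Building the lookup index first, then probing it, is also the shape of a database hash join, which is why indexed join keys make queries efficient.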

Posted 2 weeks ago


2.0 - 6.0 years

8 - 15 Lacs

Navi Mumbai

Work from Office


Overview
GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our client’s success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy and financial performance. That’s just some of what we do in our quest to build a beautiful company, enjoy the journey and make a difference. GEP is a place where individuality is prized and talent respected. We’re focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievements. We’re a learning organization, actively looking for people to help shape, grow and continually improve us. Are you one of us?

GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristics protected by law. We are committed to hiring and valuing a globally diverse work team. For more information please visit us on GEP.com or check us out on LinkedIn.com.

Responsibilities
Job Title: Sr. Associate – Sales Operations & Data Analytics
Office Location: Navi Mumbai
Experience: 3+ years
Shift Timing: 12:30 PM – 9:30 PM IST / 2:00 PM – 11:00 PM IST

Role Summary: We are looking for a results-driven Sr. Associate with strong experience in Sales Operations, Data Analytics, and Power BI to support our global sales team. This role will play a key part in driving data governance, managing CRM systems, and creating impactful dashboards and executive presentations.

Key Responsibilities:
- Manage and maintain data accuracy within the CRM system.
- Support software sales teams with actionable insights through data governance and analytics.
- Develop and maintain interactive dashboards and reports using Power BI.
- Automate data flows and reports using Power Automate and Power Query.
- Build high-quality leadership decks and sales performance presentations using PowerPoint.
- Collaborate with cross-functional teams to ensure alignment on sales metrics and KPIs.
- Identify process improvement opportunities through analytics and reporting.

Qualifications
Required Qualifications & Skills:
- Minimum 3 years of experience in Sales Operations and Data Analytics.
- Strong hands-on experience with Advanced Excel, Power BI, Power Automate, and Power Query.
- Proficiency in creating visually compelling dashboards and reports.
- Advanced skills in Microsoft PowerPoint for executive-level presentations.
- Prior experience working with CRM systems (e.g., Salesforce, HubSpot) preferred.
- Strong attention to detail, a problem-solving mindset, and the ability to work independently in a fast-paced environment.

Preferred Traits:
- Excellent communication and collaboration skills.
- Comfortable working in overlapping time zones with international teams.
- Self-starter with a passion for using data to drive decisions.

Posted 2 weeks ago


5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office


We are seeking a seasoned Senior Data Engineer to join our Marketing Data Platform team. This role is pivotal in designing, building, and optimizing scalable data pipelines and infrastructure that support our marketing analytics and customer engagement strategies. The ideal candidate will have extensive experience with big data technologies, cloud platforms, and a strong understanding of marketing data dynamics.

Data Pipeline Development & Optimization
- Design, develop, and maintain robust ETL/ELT pipelines using Apache PySpark on GCP services like Dataproc and Cloud Composer.
- Ensure data pipelines are scalable, efficient, and reliable to handle large volumes of marketing data.

Data Warehousing & Modeling
- Implement and manage data warehousing solutions using BigQuery, ensuring optimal performance and cost-efficiency.
- Develop and maintain data models that support marketing analytics and reporting needs.

Collaboration & Stakeholder Engagement
- Work closely with marketing analysts, data scientists, and cross-functional teams to understand data requirements and deliver solutions that drive business insights.
- Translate complex business requirements into technical specifications and data architecture.

Data Quality & Governance
- Implement data quality checks and monitoring to ensure the accuracy and integrity of marketing data.
- Adhere to data governance policies and ensure compliance with data privacy regulations.

Continuous Improvement & Innovation
- Stay abreast of emerging technologies and industry trends in data engineering and marketing analytics.
- Propose and implement improvements to existing data processes and infrastructure.

Years of Experience: 5 years in the data engineering space
Education Qualification & Certifications: B.Tech or MCA
Experience: Proven experience with Apache PySpark, GCP (including Dataproc, BigQuery, Cloud Composer), and data pipeline orchestration.
Technical Skills: Proficiency in SQL and Python. Experience with data modeling, ETL/ELT processes, and data warehousing concepts.
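The "data quality checks and monitoring" responsibility above has a simple basic shape: named predicate rules applied to each record, with failures collected for reporting. The sketch below is illustrative only; the rows and rule names are hypothetical, not from the posting.

```python
def run_quality_checks(rows, checks):
    """Apply named predicate checks to each row and report failures:
    the minimal shape of a data-quality gate in a pipeline."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in checks.items():
            if not predicate(row):
                failures.append((i, name))
    return failures

# Hypothetical marketing-event rows and rules.
events = [
    {"user_id": "u1", "channel": "email", "spend": 12.5},
    {"user_id": None, "channel": "ads", "spend": -3.0},
]
checks = {
    "user_id_present": lambda r: r["user_id"] is not None,
    "spend_non_negative": lambda r: r["spend"] >= 0,
}
issues = run_quality_checks(events, checks)
```

In practice the same rule-per-check structure is what tools layered over pipelines express declaratively; a failing check would block the load or raise an alert rather than just return a list.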

Posted 2 weeks ago


3.0 - 6.0 years

8 - 12 Lacs

Hyderabad

Work from Office


Job Description
As a Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet has enough flexibility to adapt to new business use cases: a tough but rewarding challenge!

Responsibilities
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale.
- Lead Data Engineers to define, build, and maintain the Data Platform.
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources.
- Migrate the existing data store from Azure Synapse to Azure Fabric.
- Implement data governance and access control.
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards.
- Present technical solutions, capabilities, considerations, and features in business terms.
- Effectively communicate status, issues, and risks in a precise and timely manner.
- Further develop critical initiatives, such as Data Discovery, Data Lineage, and Data Quality.
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations.
- Build data systems, pipelines, analytical tools, and programs.
- Conduct complex data analysis and report on results.

Qualifications
- 3+ years of experience as a data engineer or in a similar role with Azure Synapse and ADF, or relevant experience with Azure Fabric.
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field.

Posted 2 weeks ago


7.0 - 12.0 years

10 - 15 Lacs

Hyderabad

Work from Office


Job Description
As a Sr Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet has enough flexibility to adapt to new business use cases: a tough but rewarding challenge!

Responsibilities
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale.
- Lead Data Engineers to define, build, and maintain the Data Platform.
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources.
- Migrate the existing data store from Azure Synapse to Azure Fabric.
- Implement data governance and access control.
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards.
- Present technical solutions, capabilities, considerations, and features in business terms.
- Effectively communicate status, issues, and risks in a precise and timely manner.
- Further develop critical initiatives, such as Data Discovery, Data Lineage, and Data Quality.
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations.
- Build data systems, pipelines, analytical tools, and programs.
- Conduct complex data analysis and report on results.

Qualifications
- 7+ years of experience as a data engineer or in a similar role with Azure Synapse and ADF, or relevant experience with Azure Fabric.
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field.

Posted 2 weeks ago


5.0 - 7.0 years

8 - 12 Lacs

Kochi, Bengaluru

Work from Office


Overview
As an MDM Technical Delivery Manager, you will be responsible for leading and overseeing the end-to-end delivery of Master Data Management (MDM) solutions. You will collaborate with cross-functional teams to drive technical implementation, ensure data governance, and align with business objectives. Your expertise in MDM platforms, integration strategies, and project execution will be key to delivering high-quality solutions.

Key Responsibilities
- Oversee a team of experienced professionals, fostering collaboration and high performance.
- Guide and mentor team members, supporting their job performance and career growth.
- Lead the technical delivery of MDM implementations, ensuring successful project execution.
- Define MDM architecture, strategy, and integration frameworks with enterprise systems.
- Collaborate with business stakeholders to understand data requirements and align solutions.
- Oversee data governance, quality, and compliance with regulatory standards.
- Manage MDM development teams, ensuring adherence to best practices and standards.
- Optimize data models, workflows, and processes for efficient MDM operations.
- Drive continuous improvements in MDM technologies, methodologies, and performance.
- Communicate project updates, risks, and resolutions to leadership and stakeholders.

Required Qualifications
- Bachelor's degree in Computer Engineering, Computer Science, or a related field.
- 5-7+ years of experience in software development and data management.
- 5+ years of expertise in MDM implementation, with hands-on experience in Reltio, Databricks, Azure, Oracle, and Snowflake.
- Strong background in integration design and development.
- Strong expertise in data integration design, ETL processes, and API development.
- At least 2+ years in an MDM Technical Lead and Delivery role.
- Proven track record in leading MDM projects and cross-functional teams.
- Solid understanding of diverse data sets, sources, and country-specific data models.
- Experience in life sciences MDM implementations; experience in the life sciences, healthcare, or pharmaceutical industries is a plus.
- Excellent communication, leadership, and problem-solving skills.

Posted 2 weeks ago


3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Location(s): Quay Building, 8th Floor, Bagmane Tech Park, Bengaluru, IN
Line of Business: Technology Services Group (TSG)
Job Category: Engineering & Technology
Experience Level: Experienced Hire

At Moody's, we unite the brightest minds to turn today's risks into tomorrow's opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are, with the freedom to exchange ideas, think innovatively, and listen to each other and customers in meaningful ways. If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity.

Position Overview
We are looking for a skilled Configuration Management Data Analyst with expertise in ServiceNow to manage, analyze, and optimize configuration data across our organization. The ideal candidate will play a critical role in maintaining the accuracy and integrity of our Configuration Management within ServiceNow, ensuring alignment with ITIL best practices and supporting business decision-making. This role requires a strong background in ServiceNow, analytical capabilities, and a collaborative approach to working with cross-functional teams.

Key Responsibilities
1. Configuration Management Administration and Maintenance
- Maintain and enhance the ServiceNow CMDB to ensure data accuracy, completeness, and compliance with organizational standards.
- Regularly audit configuration data to identify inconsistencies, address gaps, and enforce data governance policies.
- Design and implement automated workflows within ServiceNow to streamline data updates and ensure real-time accuracy.
2. Data Analysis and Reporting
- Analyze configuration data stored in ServiceNow to identify trends, risks, and opportunities for optimization.
- Create and maintain dashboards, reports, and KPIs within ServiceNow to provide actionable insights to stakeholders.
- Provide data-driven recommendations to improve IT infrastructure and configuration management processes.
3. Collaboration and Process Improvement
- Work closely with IT, operations, and engineering teams to ensure the proper integration of configuration management processes with business objectives.
- Act as a subject matter expert for CMDB best practices and ServiceNow capabilities, providing training and support to teams as needed.
4. ServiceNow Development and Optimization
- Collaborate with ServiceNow developers to customize CMDB modules, workflows, and scripts based on organizational needs.
- Stay up-to-date with ServiceNow platform updates, features, and releases to identify opportunities for improved functionality.
- Troubleshoot and resolve technical issues related to ServiceNow CMDB operations.

Experience and Qualifications
- 3+ years of experience in configuration management, data analysis, or CMDB administration, preferably in financial services.
- Hands-on experience with ServiceNow, including CMDB module administration and customization.
- Strong understanding of ITIL principles, particularly Configuration Management.
- Proficiency in creating dashboards, reports, and workflows within ServiceNow.
- Bachelor's degree in Information Technology, Computer Science, or a related field.

Posted 2 weeks ago

3.0 - 10.0 years

10 - 11 Lacs

Gurugram

Work from Office


NAB is looking for an Analyst to join our dynamic team and embark on a rewarding career journey.
- Collect, interpret, and analyze data to help the organization make informed business decisions
- Create reports, dashboards, and visual presentations to communicate insights clearly to stakeholders
- Identify trends, patterns, and discrepancies in large datasets using statistical tools and software
- Collaborate with departments to define metrics and KPIs for ongoing monitoring
- Recommend improvements to processes, strategies, and performance based on findings
- Ensure data accuracy, maintain databases, and follow data governance and compliance standards
- Continuously research industry trends to support strategic planning and optimization efforts

Posted 2 weeks ago

1.0 - 13.0 years

13 - 14 Lacs

Pune

Work from Office


Design, develop, and maintain scalable data solutions using Starburst. Collaborate with cross-functional teams to integrate Starburst with existing data sources and tools. Optimize query performance and ensure data security and compliance. Implement monitoring and alerting systems for data platform health. Stay updated with the latest developments in data engineering and analytics.

Skills

Must have
- Bachelor's or Master's degree in a related technical field, or equivalent professional experience.
- Prior experience as a Software Engineer applying new engineering principles to improve existing systems, including leading complex, well-defined projects.
- Strong knowledge of big-data languages, including SQL, Hive, Spark/PySpark, Presto, and Python.
- Strong knowledge of big-data platforms such as the Apache Hadoop ecosystem, AWS EMR, Qubole, or Trino/Starburst.
- Good knowledge of and experience with cloud platforms such as AWS, GCP, or Azure.
- Continuous learner with the ability to apply previous experience and knowledge to quickly master new technologies.
- Demonstrates the ability to select among available technologies to implement and solve for a need.
- Able to understand and design moderately complex systems.
- Understanding of testing and monitoring tools; ability to test, debug, and fix issues within established SLAs.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Understanding of data governance and compliance standards.

Nice to have
- Data Architecture & Engineering: design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business Intelligence & Data Visualization: create insightful Power BI dashboards to help drive business decisions.

Other Languages: English: C1 Advanced
Seniority: Senior
Req. VR-114886 — Starburst Engineer, Pune

Posted 2 weeks ago

5.0 - 10.0 years

13 - 15 Lacs

Pune

Work from Office


Design, build, and manage data pipelines using Azure Data Integration Services (Azure Databricks, ADF, Azure Functions). Collaborate closely with the security team to develop robust data solutions that support our security initiatives. Implement, monitor, and optimize data processes, ensuring adherence to security and data governance best practices. Troubleshoot and resolve data-related issues, ensuring data quality and accessibility. Develop strategies for acquiring new data and integrating it into our existing architecture. Document procedures and workflows associated with data pipelines, contributing to best practices. Share knowledge about the latest Azure Data Integration Services trends and techniques. Implement and manage CI/CD pipelines to automate data and UI test cases and integrate testing with development pipelines. Conduct regular reviews of the system, identify possible security risks, and implement preventive measures.

Skills

Must have
- Excellent command of English.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data integration and pipeline development using Azure Data Integration Services, including Azure Data Factory and Azure Databricks.
- Hands-on experience with Python and Spark.
- Strong understanding of security principles in the context of data integration.
- Proven experience with SQL and other data query languages; ability to write, debug, and optimize data transformations and datasets.
- Extensive experience in designing and implementing ETL solutions using Azure Databricks, Azure Data Factory, or similar technologies.
- Familiarity with automated testing frameworks using Squash.

Nice to have
- Data Architecture & Engineering: design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business Intelligence & Data Visualization: create insightful Power BI dashboards to help drive business decisions.

Other Languages: English: C1 Advanced
Seniority: Senior
Req. VR-114884 — Databricks Developer, Pune

Posted 2 weeks ago

8.0 - 13.0 years

15 - 19 Lacs

Gurugram

Work from Office


Every day, tens of millions of people come to Roblox to explore, create, play, learn, and connect with friends in 3D immersive digital experiences, all created by our global community of developers and creators. At Roblox, we're building the tools and platform that empower our community to bring any experience they can imagine to life. Our vision is to reimagine the way people come together, from anywhere in the world and on any device. We're on a mission to connect a billion people with optimism and civility, and we're looking for amazing talent to help us get there. A career at Roblox means you'll be working to shape the future of human interaction, solving unique technical challenges at scale, and helping to create safer, more civil shared experiences for everyone.

About the role: The Roblox Operating System (ROS) team is responsible for the foundational technology and services that power all experiences on Roblox. This critical team ensures a seamless, performant, and reliable platform for our global community of users and developers. You will be the first Product Manager hire for our India office, reporting to Theresa Johnson, the Head of Product for ROS. You will play a pivotal role in building and enhancing our data analytics capabilities within the Roblox operating system, collaborating closely with the India-based Data Engineering team, which includes an Engineering Manager, three engineers, and multiple data scientists. This is a full-time onsite role based out of our Gurugram office. Shift time: 2:00 PM - 10:30 PM IST (cabs will be provided).

You will:
- Collaborate with data engineering and product engineering teams in India to build integrated analytics tooling.
- Develop cross-functional data visualization and reporting capabilities.
- Implement advanced insights-extraction methodologies.
- Develop self-service data exploration tools.
- Integrate data analytics capabilities into the Roblox operating system.
- Ensure seamless data flow across organizational platforms.
- Implement cutting-edge data infrastructure solutions.
- Build a scalable data registry that will allow us to understand, register, classify, and govern data across all of ROS. This will involve partnering with data engineers to build and maintain robust data pipelines integrating diverse sources like HR systems (Workday, Greenhouse), collaboration tools (Slack, Zoom), business applications (Pigment, Zendesk), and internal Roblox applications.
- Partner with Data Scientists to process and transform data into actionable insights, developing systems that generate builder development signals and promote positive behaviors.
- Contribute to achieving key outcomes such as reducing data access request resolution time by 60%, increasing self-service data exploration adoption by 75%, and achieving 99.9% data pipeline reliability.

You have:
- A Bachelor's degree or equivalent experience in Computer Science, Computer Engineering, or a similar technical field.
- 8+ years of product management experience, with a focus on data platforms, analytics, or developer tools.
- A strong understanding of data infrastructure, data warehousing, and ETL processes, including experience with data governance tools focusing on discovery, cataloging, metadata management, classification, and quality assurance.
- Proven ability to work autonomously and define product scope in ambiguous environments.
- Experience collaborating with engineering and data science teams to deliver impactful data products.
- Excellent communication and interpersonal skills, with the ability to articulate complex technical concepts to diverse audiences.

You are:
- Someone with strong product intuition about what we should be doing, rather than just following instructions.
- Highly organized, with a strong sense of urgency.
- A collaborative team player who can navigate cross-functional partnerships effectively.
- Adaptable and comfortable working in a fast-paced, evolving environment.
- A strategic thinker with a bias for action and a focus on delivering measurable results.

Roles based in our San Mateo, CA headquarters are in-office Tuesday, Wednesday, and Thursday, with optional in-office on Monday and Friday (unless otherwise noted).

Posted 2 weeks ago

5.0 - 12.0 years

18 - 20 Lacs

Bengaluru

Work from Office


Core Competences — Required and Desired Attributes:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in Azure Data Factory, Azure Databricks and Unity Catalog, Azure SQL Database, and other Azure data services.
- Strong programming skills in SQL, Python, and PySpark.
- Experience in the Asset Management domain is preferable.
- Strong proficiency in data analysis and data modelling, with the ability to extract insights from complex data sets.
- Hands-on experience in Power BI, including creating custom visuals, DAX expressions, and data modelling.
- Familiarity with Azure Analysis Services, data modelling techniques, and optimization.
- Experience with data quality and data governance frameworks, with the ability to debug, fine-tune, and optimise large-scale data processing jobs.
- Strong analytical and problem-solving skills, with a keen eye for detail.
- Excellent communication and interpersonal skills, with the ability to work collaboratively in a team environment.
- Proactive and self-motivated, with the ability to manage multiple tasks and deliver high-quality results within deadlines.

Posted 2 weeks ago

2.0 - 5.0 years

10 - 14 Lacs

Pune

Work from Office


What you'll do
- Work independently within Data and Analytics with limited design help from your manager or senior associates.
- Leverage coding best practices and advanced techniques to ensure efficient execution of code against large datasets, ensuring code is repeatable and scalable.
- Run, create, and optimize standard processes to ensure metrics, reports, and insights are delivered consistently to stakeholders with minimal manual intervention.
- Leverage knowledge of data structures to prepare data for ingestion efforts and analysis, assembling data from disparate sources for the creation of insights; accurately integrate new and complex data sources.
- Integrate Equifax, customer, and third-party data to solve internal or customer analytical problems of moderate complexity and report findings to managers and stakeholders.
- Review code output for anomalies, perform analysis to determine the cause, and work with Data, Analytics, Product, and Technology counterparts to implement corrective measures.
- Communicate the impact and importance of findings on the business (either Equifax or an external customer) and recommend an appropriate course of action; understand the concepts of quantitative and qualitative data and how to relate them to the customer to show the value of the analysis.
- Ensure proper use of Equifax data assets by working closely with data governance and compliance professionals.

What experience you need
- BS degree in a STEM major or equivalent discipline
- 2-5 years of experience in a related analyst role
- Cloud certification strongly preferred
- Technical capabilities including SQL, BigQuery, R, Python, MS Excel / Google Sheets, Tableau, Looker
- Experience working as a team and collaborating with others on producing descriptive and diagnostic analysis

What could set you apart
- Cloud certification such as GCP strongly preferred
- Self-starter
- Excellent communicator / client facing
- Ability to work in a fast-paced environment
- Flexibility to work across A/NZ time zones based on project needs

Posted 2 weeks ago

5.0 - 8.0 years

7 - 11 Lacs

Hyderabad

Work from Office


We are looking forward to hiring Microsoft Fabric professionals in the following areas:

Experience: 5-8 years
- Experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure Storage Services, Azure SQL, ETL, Azure Cosmos DB, Event Hub, Azure Data Catalog, Azure Functions, and Azure Purview.
- Create pipelines, datasets, dataflows, and integration runtimes; monitor pipelines and trigger runs.
- Extract, transform, and load data from source systems and process the data in Azure Databricks.
- Prepare DB design documents for user stories based on client requirements.
- Work with the development team to create database structures and queries, and use triggers.
- Extensive experience in Microsoft cloud solutions, i.e., designing, developing, and testing technologies.
- Create SQL scripts to perform complex queries.
- Create Synapse pipelines to migrate data from Gen2 to Azure SQL; build data migration pipelines to the Azure cloud (Azure SQL).
- Database migration from on-prem SQL Server to the Azure dev environment using Azure DMS and Data Migration Assistant.
- Lift and shift of the development server to the production server.
- Data governance in Azure.
- Data migration pipelines that can migrate on-prem SQL Server data to the Azure cloud (Azure SQL and Cosmos DB).
- Experience in using Azure Data Catalog.
- Experience in big-data batch processing, interactive processing, and real-time processing solutions.

Certifications: Mandatory

Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.

Posted 2 weeks ago

12.0 - 14.0 years

11 - 16 Lacs

Hyderabad

Work from Office


We are looking forward to hiring Microsoft Fabric professionals in the following areas:

Experience: 12-14 years
- 12+ years of experience in Microsoft Azure data engineering for analytical projects.
- Proven expertise in designing, developing, and deploying high-volume, end-to-end ETL pipelines for complex models, including batch and real-time data integration frameworks using Azure, Microsoft Fabric, and Databricks.
- Extensive hands-on experience with Azure Data Factory, Databricks (with Unity Catalog), Azure Functions, Synapse Analytics, Data Lake, Delta Lake, and Azure SQL Database for managing and processing large-scale data integrations.
- Experience in Databricks cluster optimization and workflow management to ensure cost-effective and high-performance processing.
- Sound knowledge of data modelling, data governance, data quality management, and data modernization processes.
- Develop architecture blueprints and technical design documentation for Azure-based data solutions.
- Provide technical leadership and guidance on cloud architecture best practices, ensuring scalable and secure solutions.
- Keep abreast of emerging Azure technologies and recommend enhancements to existing systems.
- Lead proofs of concept (PoCs) and adopt agile delivery methodologies for solution development and delivery.

Certifications: Mandatory

Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.

Posted 2 weeks ago

2.0 - 7.0 years

4 - 9 Lacs

Gurugram

Work from Office


Senior Specialist, BA/DA — Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained dynamic advantage. Our Innovation, Data & Analytics function is focused on driving innovation through optimizing how we leverage data to drive strategy and differentiate ourselves from the competition. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Senior Specialist for our Data Sourcing & Solutions team. The role sits across the Innovation, Data & Analytics Department to ensure customer requirements are properly captured and transformed into actionable data specifications. Success in the role will require a focus on proactive management of the sourcing and management of data from source through usage.

What you'll be DOING
What will your essential responsibilities include?
- Identify, evaluate, and acquire various data sources that align with customer needs. This may involve collaborating with business stakeholders, third-party vendors, and source system teams.
- Design and implement data integration strategies to combine diverse datasets from internal and external sources, including accountability for documenting data requirements for ETL processes, APIs, and data pipelines.
- Develop data solutions to address specific business challenges. This might involve creating custom data models that provide actionable insights and integrate with existing data assets.
- Oversee the organization and management of data within databases, ensuring data security, integrity, and accessibility.
- Work in an Agile framework by defining and prioritizing the product backlog, collaborating with agile teams to deliver business goals and customer needs.
- Work closely with cross-functional teams, including Data Engineers, Data Science, Data Management, Data Governance, Data Quality, BI Solutions, and stakeholders.
- Implement measures to maintain data accuracy, consistency, and completeness; perform data validation and cleansing as needed.
- Adhere to data governance standards, ensuring compliance with regulations and internal policies related to data usage and privacy.
- Use data technologies such as SQL, Azure cloud technologies, and Databricks to analyze and produce data insights; stay updated with emerging technologies in data management.
- Develop expertise in the insurance domain to better understand the context of the data.
- Identify data-related issues, troubleshoot problems, and recommend solutions to enhance data sourcing and integration processes.
- Provide guidance and mentorship to junior analysts and team members, fostering a culture of continuous learning and improvement.
- Translate complex technical concepts into understandable insights for non-technical stakeholders to drive data-informed decision-making.
- Explore innovative approaches to data acquisition, integration, and solution development that can lead to improved efficiency and effectiveness. This role is pivotal in ensuring that the organization's data ecosystem is robust, well integrated, and capable of providing accurate and actionable insights, helping us become a data-driven organization.
- Instill a customer-first attitude, prioritizing service for our business stakeholders above all else.

You will report to the Lead Specialist, Data Sourcing & Solutions.

What you will BRING
We're looking for someone who has these abilities and skills:

Required Skills and Abilities:
- Extensive experience in a data role (business analyst, data analyst, analytics), preferably in the insurance industry and within a data division.
- Excellent presentation, communication (oral and written), and relationship-building skills, across all levels of management and customer interaction.
- Excellent SQL knowledge, exposure to Azure cloud technologies (including Databricks), and the technical ability to query AXA XL data sources to understand our data.
- Deep insurance experience in data, underwriting, claims, and/or operations, including influencing, collaborating, and leading efforts in complex, disparate, and inter-related teams with competing priorities.
- Passion for data and experience working within a data-driven organization.
- Ability to integrate internal data with external industry data to deliver holistic solutions, and to work with unstructured data to unlock information needed by the business to create unique products for the insurance industry.
- Excellent exploratory analysis skills and high intellectual curiosity.
- Exceptional organizational skills and attention to detail.
- Effective conceptual thinking that connects dots, combined with critical thinking and analytical skills.
- Ability to take ownership, work under pressure, and meet deadlines.
- Ability to work with team members across the globe and across departments.

Desired Skills and Abilities:
- Builds trust and rapport within and across groups.
- Applies in-depth knowledge of the business and specialized areas to solve business problems and understand integration challenges and long-term impact creatively and strategically.
- Ability to manage the data needs of individual projects while understanding the broader enterprise data perspective.
- Expected to recommend innovation and improvement to policies, procedures, resource deployment, and core activities.

Posted 2 weeks ago

4.0 - 10.0 years

6 - 12 Lacs

Chennai, Bengaluru

Work from Office


Data Engineer (AWS) - Neoware Technology Solutions Private Limited

Requirements
- 4-10 years of hands-on experience in designing, developing, and implementing data engineering solutions.
- Strong SQL development skills, including performance tuning and query optimization.
- Good understanding of data concepts.
- Proficiency in Python and a solid understanding of programming concepts.
- Hands-on experience with PySpark or Spark Scala for building data pipelines.
- Understanding of streaming data pipelines for near-real-time analytics.
- Experience with Azure services including Data Factory, Functions, Databricks, Synapse Analytics, Event Hub, Stream Analytics, and Data Lake Storage.
- Familiarity with at least one NoSQL database.
- Knowledge of modern data architecture patterns and industry trends in data engineering.
- Understanding of data governance concepts for data platforms and analytical solutions.
- Experience with Git for source code version control.
- Experience with DevOps processes, including implementing CI/CD pipelines for data engineering solutions.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork skills.

Preferred
- Azure certifications related to Data Engineering are highly preferred.
- Experience with Amazon AppFlow, EKS, API Gateway, and NoSQL database services.
- Strong understanding of and experience with BI/visualization tools like Power BI.

Chennai, Bangalore | Full time | 4+ years
Other positions: Principal Architect (Data and Cloud), Development

Posted 2 weeks ago

5.0 - 7.0 years

7 - 9 Lacs

Kochi, Bengaluru

Work from Office


Overview
As an MDM Technical Delivery Manager, you will be responsible for leading and overseeing the end-to-end delivery of Master Data Management (MDM) solutions. You will collaborate with cross-functional teams to drive technical implementation, ensure data governance, and align with business objectives. Your expertise in MDM platforms, integration strategies, and project execution will be key to delivering high-quality solutions.

Key Responsibilities
- Oversee a team of experienced professionals, fostering collaboration and high performance.
- Guide and mentor team members, supporting their job performance and career growth.
- Lead the technical delivery of MDM implementations, ensuring successful project execution.
- Define MDM architecture, strategy, and integration frameworks with enterprise systems.
- Collaborate with business stakeholders to understand data requirements and align solutions.
- Oversee data governance, quality, and compliance with regulatory standards.
- Manage MDM development teams, ensuring adherence to best practices and standards.
- Optimize data models, workflows, and processes for efficient MDM operations.
- Drive continuous improvements in MDM technologies, methodologies, and performance.
- Communicate project updates, risks, and resolutions to leadership and stakeholders.

Required Qualifications
- Bachelor's degree in Computer Engineering, Computer Science, or a related field.
- 5-7+ years of experience in software development and data management.
- 5+ years of expertise in MDM implementation, with hands-on experience in Reltio, Databricks, Azure, Oracle, and Snowflake.
- Strong background in integration design and development, including data integration design, ETL processes, and API development.
- At least 2+ years in an MDM Technical Lead and Delivery role.
- Proven track record of leading MDM projects and cross-functional teams.
- Solid understanding of diverse data sets, sources, and country-specific data models.
- Experience in life sciences MDM implementations; experience in life sciences, healthcare, or pharmaceutical industries is a plus.
- Excellent communication, leadership, and problem-solving skills.

We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com

Posted 2 weeks ago