
1386 Data Governance Jobs - Page 22

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

12.0 - 15.0 years

10 - 14 Lacs

Navi Mumbai

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP Information Lifecycle Management (ILM)
Good-to-Have Skills: NA
Minimum Experience: 12 year(s)
Educational Qualification: BE

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that solutions are effectively implemented across multiple teams, while maintaining a focus on quality and efficiency in application delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic goals.

Professional & Technical Skills:
- Proficiency in SAP Information Lifecycle Management (ILM).
- Strong understanding of application design and architecture principles.
- Experience with project management methodologies.
- Ability to analyze and optimize application performance.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP Information Lifecycle Management (ILM).
- This position is based at our Mumbai office.
- A BE is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP Master Data Governance (MDG) Tool
Good-to-Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs. Your role will require you to facilitate communication between stakeholders and the development team, ensuring that all parties are informed and engaged throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles and practices.
- Experience with application design and development methodologies.
- Ability to analyze and optimize application performance.
- Familiarity with integration processes and tools related to SAP.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the SAP Master Data Governance (MDG) Tool.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-Have Skills: Informatica MDM
Good-to-Have Skills: NA
Minimum Experience: 2 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Proficiency in Informatica MDM.
- Strong understanding of data integration processes and methodologies.
- Experience with data quality management and data governance practices.
- Familiarity with database management systems and data modeling techniques.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 6 Lacs

Bengaluru

Work from Office


We are seeking an MIS Analyst to support Retail Operations by managing and analyzing sales data, generating performance reports, and delivering business insights. The role involves preparing MIS dashboards and working with data pipelines, BI tools, and analytics.

Required Candidate Profile:
- 2-5 years of experience in MIS reporting, sales analysis, or business intelligence in a retail or FMCG environment.
- Hands-on experience with BI tools such as Power BI, Tableau, or Looker.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Hyderabad

Work from Office


We are hiring a Data Engineer for a US-based IT company located in Hyderabad. Candidates with a minimum of 5 years of experience in data engineering can apply. This is a 1-year contract role.

Job Title: Data Engineer
Location: Hyderabad
CTC: Up to 20 LPA
Experience: 5+ years

Job Overview: We are looking for a seasoned Senior Data Engineer with deep hands-on experience in Talend and IBM DataStage to join our growing enterprise data team. This role will focus on designing and optimizing complex data integration solutions that support enterprise-wide analytics, reporting, and compliance initiatives. In this senior-level position, you will collaborate with data architects, analysts, and key stakeholders to facilitate large-scale data movement, enhance data quality, and uphold governance and security protocols.

Key Responsibilities:
- Develop, maintain, and enhance scalable ETL pipelines using Talend and IBM DataStage.
- Partner with data architects and analysts to deliver efficient and reliable data integration solutions.
- Review and optimize existing ETL workflows for performance, scalability, and reliability.
- Consolidate data from multiple sources, both structured and unstructured, into data lakes and enterprise platforms.
- Implement rigorous data validation and quality assurance procedures to ensure data accuracy and integrity.
- Adhere to best practices for ETL development, including source control and automated deployment.
- Maintain clear and comprehensive documentation of data processes, mappings, and transformation rules.
- Support enterprise initiatives around data migration, modernization, and cloud transformation.
- Mentor junior engineers and participate in code reviews and team learning sessions.

Required Qualifications:
- Minimum 5 years of experience in data engineering or ETL development.
- Proficient with Talend (Open Studio and/or Talend Cloud) and IBM DataStage.
- Strong skills in SQL, data profiling, and performance tuning.
- Experience handling large datasets and complex data workflows.
- Solid understanding of data warehousing, data modeling, and data lake architecture.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Strong analytical and troubleshooting skills.
- Effective verbal and written communication, with strong documentation habits.

Preferred Qualifications:
- Prior experience in banking or financial services.
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of data governance tools (e.g., Collibra, Alation).
- Awareness of data privacy regulations (e.g., GDPR, CCPA).
- Experience working in Agile/Scrum environments.

For further assistance, contact/WhatsApp 9354909518 or write to priya@gist.org.in.
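The data validation and quality assurance step this posting describes can be illustrated with a minimal sketch in plain Python (field names and rules are hypothetical; in practice such checks would run inside Talend or DataStage components):

```python
# Minimal data-validation sketch: check each record against simple quality
# rules before loading, splitting the batch into valid and rejected rows.
# Field names and the rule set are made up for illustration.
def validate(record):
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    if record.get("amount") is not None and record["amount"] < 0:
        errors.append("negative amount")
    if record.get("currency") not in {"INR", "USD", "EUR"}:
        errors.append("unknown currency")
    return errors

rows = [
    {"customer_id": "C1", "amount": 120.0, "currency": "INR"},
    {"customer_id": "",   "amount": -5.0,  "currency": "XYZ"},
]

# Valid rows proceed to the load step; rejected rows are logged with reasons.
valid = [r for r in rows if not validate(r)]
rejected = {r["customer_id"] or "<blank>": validate(r) for r in rows if validate(r)}
print(len(valid), rejected)
```

A real pipeline would typically route the rejected rows to a quarantine table for remediation rather than dropping them.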

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Nagpur

Remote


Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
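The window-function expertise this posting asks for can be sketched with Python's built-in sqlite3 module (SQLite 3.25+ supports window functions; the table and column names here are made up for illustration, and Snowflake's syntax for this query is essentially the same):

```python
# Window-function sketch: rank each order within its customer by amount,
# without collapsing rows the way GROUP BY would.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES
        ('A', 1, 50.0), ('A', 2, 80.0), ('B', 3, 30.0), ('B', 4, 20.0);
""")
rows = conn.execute("""
    SELECT customer, order_id, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
""").fetchall()
print(rows)
```

Each customer's largest order gets rank 1 within its own partition, while every input row is preserved in the output.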

Posted 2 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Job Opening: Senior Data Engineer (Remote, Contract, 6 Months)

We are hiring a Senior Data Engineer for a 6-month remote contract position (experience: 6-8 years). The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in data engineering.
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
- Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance.
- Agile, SDLC, containerization (Docker), clean coding practices.

Good-to-Have Skills:
- Event Hubs, Logic Apps.
- Power BI.
- Strong logic building and a competitive programming background.

Mode: Remote
Duration: 6 months
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Ahmedabad

Remote


Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

5 - 8 Lacs

Bengaluru

Work from Office


The opportunity

At Hitachi Energy, we are building a future-ready data ecosystem. As a Data Governance Specialist, you will be a key enabler in shaping and operationalizing our enterprise-wide data governance framework. You will focus on the implementation and evolution of our Data Catalog, Metadata Management, and Data Compliance initiatives, ensuring our data assets are trusted, discoverable, and aligned with business value. This role is ideal for early-career professionals with a can-do mindset and a passion for making things happen. You will work in a dynamic, cross-functional environment that values curiosity, ownership, and ethical leadership.

How you'll make an impact

Data Catalog & Compliance Implementation (learning by doing):
- Define and maintain the roadmap for the Enterprise Data Catalog and Data Supermarket.
- Configure and execute deployment of cataloging tools (e.g., metadata management, lineage, glossary).
- Ensure alignment with DAMA-DMBOK principles.

Governance Framework Execution:
- Collaborate with Data Owners, Stewards, and Custodians to define and enforce data policies, standards, and the RACI model.
- Support the Data Governance Council and contribute to the development of governance artifacts (e.g., roles, regulations, KPIs).

Data Quality Stewardship:
- Partner with domain experts to drive data profiling, cleansing, and validation initiatives.
- Monitor data quality metrics and support remediation efforts across domains.

Stakeholder Engagement & Enablement:
- Provide training and support to business users on catalog usage and governance practices.
- Act as a liaison between business and IT to ensure data needs are met and governance is embedded in operations.

Innovation & Continuous Improvement:
- Stay current with industry trends and tool capabilities (e.g., Databricks, SAP MDG).
- Propose enhancements to governance processes and tooling based on user feedback and analytics.

Your background
- Bachelor's degree in Information Systems, Data Science, Business Informatics, or a related field.
- 1-3 years of experience in data governance, data management, or analytics roles.
- Familiarity with the DAMA-DMBOK2 framework and data governance tools (e.g., SAP MDG, Datasphere, Business Warehouse, Data Intelligence, Informatica ETL).
- Strong communication and collaboration skills; ability to work across business and technical teams.
- Proactive, solution-oriented, and eager to learn, ready to make it happen.
- Comfort with autonomy and ambiguity is a competitive advantage.
- Preference for candidates with a new-technology focus, a non-stop learning mindset, and an embrace-the-challenge attitude.
- CDMP certification is a preferred attribute.

More about us

When joining us you may expect:
- A purpose-driven role in a global energy leader committed to sustainability and digital transformation.
- Mentorship and development opportunities within a diverse and inclusive team.
- Initiatives and cutting-edge technologies.
- The opportunity to create customer value.
- A culture that values integrity, curiosity, and collaboration, aligned with Hitachi Energy's Leadership Pillars: Lead with Purpose, Drive Results, Build Collaboration, Develop Self & Others.

Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid


JD: Senior Snowflake Data Architect

Designs, implements, and optimizes data solutions within the Snowflake cloud data platform, ensuring data security, governance, and performance, while collaborating with cross-functional teams and providing technical leadership. The data architect's responsibilities include determining a data strategy, understanding data management technologies, overseeing the data inventory, and maintaining a finger on the pulse of the organization's data management systems.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

15 - 20 Lacs

Hyderabad, Gurugram

Work from Office


- Hands-on experience with the OneTrust platform, including modules for Privacy Management, Data Governance, and Risk & Compliance.
- Ability to troubleshoot and support OneTrust workflows, integrations, and automation rules in a production environment.
- Experience managing incident response workflows, data subject access requests (DSARs), and third-party risk assessments using OneTrust.
- Proficiency in analyzing logs, audit trails, and system alerts generated by OneTrust to identify and resolve operational issues.
- Understanding of OneTrust API integrations and ability to support connected systems (e.g., CRM, ticketing tools, cloud platforms).
- Exposure to OneTrust AI features such as Copilot or Agentic AI for automating compliance and governance tasks is a plus.
- Strong collaboration skills to work with compliance, legal, and IT teams to ensure platform stability and regulatory alignment.
- Familiarity and working experience with e-commerce projects.
- Working knowledge of the ServiceNow ITSM tool.
- Knowledge of production support processes and procedures.
- Ability to demonstrate functional and technical architecture knowledge, and to correlate between the two from past experience.
- Good exposure to ITIL processes such as Incident Management, Problem Management, and Knowledge Management.

Posted 2 weeks ago

Apply

13.0 - 20.0 years

25 - 40 Lacs

Chennai, Bengaluru

Work from Office


Position Overview

We are looking for a highly experienced and versatile Solution Architect - Data to lead the solution design and delivery of next-generation data solutions for our BFS clients. The ideal candidate will have a strong background in data architecture and engineering, deep domain expertise in financial services, and hands-on experience with cloud-native data platforms and modern data analytics tools. The role will require architecting solutions across Retail, Corporate, Wealth, and Capital Markets, as well as Payments, Lending, and Onboarding journeys. Experience in data analytics and exposure to the data regulatory domain will be a distinct advantage, as will hands-on experience enabling data solutions with AI and Gen AI.

Key Responsibilities:
- Design and implement end-to-end data solutions for BFS clients, covering data engineering and analytics involving modern data stacks and concepts.
- Architect cloud-native data platforms using AWS, Azure, and GCP (certifications preferred).
- Build and maintain data models aligned with Open Banking, Open Finance, SCA, AISP, and PISP requirements.
- Enrich solution design by incorporating industry-standard data architectures using frameworks such as BIAN, and lead data standardization programs for banks.
- Enrich solution architecture by enabling the AI and Gen AI paradigm for data engineering, analytics, and data regulatory work.
- Deliver data solutions in domains such as Core Banking, Payments, Lending, Customer Onboarding, Wealth, and Capital Markets.
- Collaborate with business and technology stakeholders to gather requirements and translate them into scalable data architectures.
- Lead solution design and, where needed, be hands-on in developing Proof-of-Concepts (POCs) showcasing data-driven capabilities.
- Lead and contribute to RFX responses for banking and financial services clients and regulatory bodies across the UK and EMEA regions.
- Provide architectural leadership in data initiatives related to regulatory compliance and risk analytics; familiarity and working experience with regulatory software and platforms such as SAS, NICE Actimize, and Wolters Kluwer is preferred.

Required Skills & Experience:
- 12-18 years of experience in IT with a focus on data solution architecture in the BFS domain.
- Strong delivery and development experience in the Retail, Corporate, Wealth, and Capital Markets banking domains.
- Deep understanding of data standards such as BIAN and experience implementing them in banking projects.
- Expertise in cloud platforms (AWS, Azure, GCP) and leveraging native services for data processing, storage, and analytics.
- Strong experience in building data models and data solutions for Open Banking, Open Finance, and regulatory needs including SCA, AISP, and PISP.
- Proficiency in data engineering pipelines and real-time/batch data processing.
- Experience in designing enterprise data lakes and data warehouses, and implementing data mesh and data lineage frameworks.
- Hands-on experience in developing rapid POCs and accelerators.

Primary Technical Skills:
- Cloud platforms: AWS, Azure, GCP (certifications preferred)
- Big data technologies: Hadoop, Spark, Databricks, Delta Lake
- Programming languages: Python, Scala, SQL
- Data engineering & pipelines: Apache Airflow, Kafka, Glue, Data Factory
- Data warehousing: Snowflake, Redshift, BigQuery, Synapse
- Visualization: Power BI, Tableau, Looker
- Data governance: data lineage, data cataloging, master data management
- Architecture concepts: data mesh, data fabric, event-driven architecture

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Overview

The Senior Data Analyst serves as a subject matter expert who can lead efforts to analyze data with the goal of delivering insights that will influence our products and customers. This position reports to the Data Analytics Manager and works closely with members of our product and marketing teams, data engineers, and members of our Customer Success organization supporting client outreach efforts. The chief functions of this role are finding and sharing data-driven insights that deliver value to less technical audiences, and instilling best practices for analytics in the rest of the team.

Responsibilities:
- Perform various data analysis functions to analyze data from a variety of sources, including external labor market data and research, and internal datasets from our platforms.
- Incorporate information from a variety of systems to produce comprehensive and compelling narratives for thought-leadership initiatives and customer engagements.
- Demonstrate critical thinking: identify the story in context using multiple datasets, and present results. A strong proficiency in data storytelling will be critical to success in this role.
- Understand principles of quality data visualization and apply them in Tableau to create and maintain custom dashboards for consumption by other employees.
- Find and investigate data quality issues, identify root causes, and recommend remedies to be implemented by the data scientists and engineers.
- Liaise with teams around our business to understand their problems, determine how our team can help, then use our database to produce the content they need.
- Identify data mapping and enrichment requirements. Familiarity with SQL, especially the logic behind different types of data joins and writing efficient queries, will be necessary.
- Consistently ensure that business is always conducted with integrity and that behavior aligns with iCIMS policies, procedures, and core competencies.

Additional Job Responsibilities:
- Produce and adapt data visualizations in response to business requests for internal and external use.
- Show good judgement in prioritizing their own commitments and those of the larger team, while demonstrating initiative and appropriate urgency when needed.
- Mentor junior team members in best practices for analytics, data visualization, and data storytelling; exemplify these standards and guide teammates in following them.
- Think creatively to produce unique, actionable insights from complex datasets, which can deliver value to our business and to our customers.

Qualifications:
- 5-10 years of professional experience working in an analytics capacity.
- Excellent communication skills, especially with regard to data storytelling: finding insights from complex datasets and sharing those findings with key stakeholders.
- Strong data analytics and visualization skills.
- Expertise in Tableau Desktop (Tableau Server and Prep are preferable), producing clear and informative graphs and dashboards.
- Proficiency in SQL and either Python or R to extract and prepare data for analysis.
- Advanced knowledge of Excel (pivot tables, VLOOKUPs, IF statements).
- Familiarity with data guardrails to ensure compliance with applicable data governance regulations and privacy laws (e.g., GDPR).
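The "logic behind different types of data joins" this posting mentions can be shown with a tiny sketch using Python's built-in sqlite3 module (the table and column names are made up for illustration):

```python
# Join-type sketch: an INNER JOIN keeps only matching rows, while a
# LEFT JOIN also keeps unmatched rows from the left table, with NULLs
# (None in Python) filling the right-hand columns.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE candidates (id INTEGER, name TEXT);
    CREATE TABLE applications (candidate_id INTEGER, job TEXT);
    INSERT INTO candidates VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO applications VALUES (1, 'Data Analyst');
""")
inner = conn.execute("""
    SELECT c.name, a.job FROM candidates c
    JOIN applications a ON a.candidate_id = c.id
""").fetchall()
left = conn.execute("""
    SELECT c.name, a.job FROM candidates c
    LEFT JOIN applications a ON a.candidate_id = c.id
""").fetchall()
print(inner)  # only the candidate with a matching application
print(left)   # unmatched candidates appear with None for job
```

Choosing the wrong join type is a common source of silently dropped rows, which is why the distinction matters for efficient, correct reporting queries.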

Posted 2 weeks ago

Apply

2.0 - 6.0 years

8 - 15 Lacs

Navi Mumbai

Work from Office


Overview

GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our clients' success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy and financial performance. That's just some of what we do in our quest to build a beautiful company, enjoy the journey and make a difference. GEP is a place where individuality is prized and talent respected. We're focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievements. We're a learning organization, actively looking for people to help shape, grow and continually improve us. Are you one of us?

GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristics protected by law. We are committed to hiring and valuing a globally diverse work team. For more information, please visit us on GEP.com or check us out on LinkedIn.com.

Job Title: Sr. Associate - Sales Operations & Data Analytics
Office Location: Navi Mumbai
Experience: 3+ years
Shift Timing: 12:30 PM - 9:30 PM IST / 2:00 PM - 11:00 PM IST

Role Summary: We are looking for a results-driven Sr. Associate with strong experience in Sales Operations, Data Analytics, and Power BI to support our global sales team. This role will play a key part in driving data governance, managing CRM systems, and creating impactful dashboards and executive presentations.

Key Responsibilities:
- Manage and maintain data accuracy within the CRM system.
- Support software sales teams with actionable insights through data governance and analytics.
- Develop and maintain interactive dashboards and reports using Power BI.
- Automate data flows and reports using Power Automate and Power Query.
- Build high-quality leadership decks and sales performance presentations using PowerPoint.
- Collaborate with cross-functional teams to ensure alignment on sales metrics and KPIs.
- Identify process improvement opportunities through analytics and reporting.

Required Qualifications & Skills:
- Minimum 3 years of experience in Sales Operations and Data Analytics.
- Strong hands-on experience with Advanced Excel, Power BI, Power Automate, and Power Query.
- Proficiency in creating visually compelling dashboards and reports.
- Advanced skills in Microsoft PowerPoint for executive-level presentations.
- Prior experience working with CRM systems (e.g., Salesforce, HubSpot) preferred.
- Strong attention to detail, a problem-solving mindset, and the ability to work independently in a fast-paced environment.

Preferred Traits:
- Excellent communication and collaboration skills.
- Comfortable working in overlapping time zones with international teams.
- Self-starter with a passion for using data to drive decisions.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking a seasoned Senior Data Engineer to join our Marketing Data Platform team. This role is pivotal in designing, building, and optimizing scalable data pipelines and infrastructure that support our marketing analytics and customer engagement strategies. The ideal candidate will have extensive experience with big data technologies, cloud platforms, and a strong understanding of marketing data dynamics.

Data Pipeline Development & Optimization
- Design, develop, and maintain robust ETL/ELT pipelines using Apache PySpark on GCP services like Dataproc and Cloud Composer.
- Ensure data pipelines are scalable, efficient, and reliable to handle large volumes of marketing data.

Data Warehousing & Modeling
- Implement and manage data warehousing solutions using BigQuery, ensuring optimal performance and cost-efficiency.
- Develop and maintain data models that support marketing analytics and reporting needs.

Collaboration & Stakeholder Engagement
- Work closely with marketing analysts, data scientists, and cross-functional teams to understand data requirements and deliver solutions that drive business insights.
- Translate complex business requirements into technical specifications and data architecture.

Data Quality & Governance
- Implement data quality checks and monitoring to ensure the accuracy and integrity of marketing data.
- Adhere to data governance policies and ensure compliance with data privacy regulations.

Continuous Improvement & Innovation
- Stay abreast of emerging technologies and industry trends in data engineering and marketing analytics.
- Propose and implement improvements to existing data processes and infrastructure.

Years of Experience: 5 years in the data engineering space
Education Qualification & Certifications: B.Tech or MCA
Experience: Proven experience with Apache PySpark, GCP (including Dataproc, BigQuery, Cloud Composer), and data pipeline orchestration.
Technical Skills: Proficiency in SQL and Python.
Experience with data modeling, ETL/ELT processes, and data warehousing concepts.
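The data quality checks this role calls for can start as simple batch assertions over each pipeline run. The stdlib Python sketch below stands in for logic that would normally run as PySpark or BigQuery aggregates; the field names and the 5% null-rate threshold are assumptions for illustration:

```python
def run_quality_checks(rows, required_fields, max_null_rate=0.05):
    """Run basic quality checks on one batch of pipeline rows.

    Returns a dict mapping check name -> passed (bool):
    - "non_empty": the batch contains at least one row
    - "null_rate:<field>": the share of null/empty values in <field>
      stays at or below max_null_rate
    """
    results = {"non_empty": len(rows) > 0}
    for f in required_fields:
        nulls = sum(1 for r in rows if r.get(f) in (None, ""))
        rate = nulls / len(rows) if rows else 1.0
        results[f"null_rate:{f}"] = rate <= max_null_rate
    return results
```

A monitoring job would alert when any check returns False, rather than letting bad data flow downstream into marketing dashboards.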

Posted 2 weeks ago

Apply

3.0 - 6.0 years

8 - 12 Lacs

Hyderabad

Work from Office


Job Description

As a Data Engineer, your role is to spearhead the data engineering team and elevate it to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet has enough flexibility to adapt to new business use cases: a tough but rewarding challenge!

Responsibilities
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build and maintain the Data Platform
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives, such as Data Discovery, Data Lineage and Data Quality
- Lead the team and mentor junior resources
- Help your team members grow in their role and achieve their career aspirations
- Build data systems, pipelines, analytical tools and programs
- Conduct complex data analysis and report on results

Qualifications
- 3+ years of experience as a data engineer or similar role in Azure Synapse or ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field

Posted 2 weeks ago

Apply

7.0 - 12.0 years

10 - 15 Lacs

Hyderabad

Work from Office


Job Description

As a Sr Data Engineer, your role is to spearhead the data engineering team and elevate it to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet has enough flexibility to adapt to new business use cases: a tough but rewarding challenge!

Responsibilities
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build and maintain the Data Platform
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives, such as Data Discovery, Data Lineage and Data Quality
- Lead the team and mentor junior resources
- Help your team members grow in their role and achieve their career aspirations
- Build data systems, pipelines, analytical tools and programs
- Conduct complex data analysis and report on results

Qualifications
- 7+ years of experience as a data engineer or similar role in Azure Synapse or ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field

Posted 2 weeks ago

Apply

5.0 - 7.0 years

8 - 12 Lacs

Kochi, Bengaluru

Work from Office


Overview

As an MDM Technical Delivery Manager, you will be responsible for leading and overseeing the end-to-end delivery of Master Data Management (MDM) solutions. You will collaborate with cross-functional teams to drive technical implementation, ensure data governance, and align with business objectives. Your expertise in MDM platforms, integration strategies, and project execution will be key to delivering high-quality solutions.

Key Responsibilities
- Oversee a team of experienced professionals, fostering collaboration and high performance.
- Guide and mentor team members, supporting their job performance and career growth.
- Lead the technical delivery of MDM implementations, ensuring successful project execution.
- Define MDM architecture, strategy, and integration frameworks with enterprise systems.
- Collaborate with business stakeholders to understand data requirements and align solutions.
- Oversee data governance, quality, and compliance with regulatory standards.
- Manage MDM development teams, ensuring adherence to best practices and standards.
- Optimize data models, workflows, and processes for efficient MDM operations.
- Drive continuous improvements in MDM technologies, methodologies, and performance.
- Communicate project updates, risks, and resolutions to leadership and stakeholders.

Required Qualifications
- Bachelor's degree in Computer Engineering, Computer Science, or a related field.
- 5-7+ years of experience in software development and data management.
- 5+ years of expertise in MDM implementation, with hands-on experience in Reltio, Databricks, Azure, Oracle, and Snowflake.
- Strong background in integration design and development, including data integration design, ETL processes, and API development.
- At least 2+ years in an MDM Technical Lead and Delivery role.
- Proven track record in leading MDM projects and cross-functional teams.
- Solid understanding of diverse data sets, sources, and country-specific data models.
- Experience in life sciences MDM implementations; broader experience in the life sciences, healthcare, or pharmaceutical industries is a plus.
- Excellent communication, leadership, and problem-solving skills.
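At the core of most MDM implementations is record matching: deciding whether an incoming record refers to the same real-world entity as an existing master record. A minimal sketch using Python's stdlib difflib as a stand-in for the fuzzy matching an MDM platform such as Reltio provides natively; the field names and the 0.85 threshold are illustrative assumptions:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case- and whitespace-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_match_candidates(incoming, master, threshold=0.85):
    """Pair each incoming record with master records whose names are
    similar enough to plausibly be the same entity.

    Returns (incoming_name, master_name, score) tuples for a data
    steward to review before any merge happens.
    """
    candidates = []
    for rec in incoming:
        for m in master:
            score = similarity(rec["name"], m["name"])
            if score >= threshold:
                candidates.append((rec["name"], m["name"], round(score, 2)))
    return candidates
```

Real matching engines combine many attributes (address, identifiers, country-specific rules) with survivorship logic; this sketch only shows the candidate-generation step.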

Posted 2 weeks ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Location(s): Quay Building 8th Floor, Bagmane Tech Park, Bengaluru, IN
Line of Business: Technology Services Group (TSG)
Job Category: Engineering & Technology
Experience Level: Experienced Hire

At Moody's, we unite the brightest minds to turn today's risks into tomorrow's opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are, with the freedom to exchange ideas, think innovatively, and listen to each other and customers in meaningful ways. If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity.

Position Overview

We are looking for a skilled Configuration Management Data Analyst with expertise in ServiceNow to manage, analyze, and optimize configuration data across our organization. The ideal candidate will play a critical role in maintaining the accuracy and integrity of our Configuration Management within ServiceNow, ensuring alignment with ITIL best practices and supporting business decision-making. This role requires a strong background in ServiceNow, analytical capabilities, and a collaborative approach to working with cross-functional teams.

Key Responsibilities

1. Configuration Management Administration and Maintenance
- Maintain and enhance the ServiceNow CMDB to ensure data accuracy, completeness, and compliance with organizational standards.
- Regularly audit configuration data to identify inconsistencies, address gaps, and enforce data governance policies.
- Design and implement automated workflows within ServiceNow to streamline data updates and ensure real-time accuracy.

2. Data Analysis and Reporting
- Analyze configuration data stored in ServiceNow to identify trends, risks, and opportunities for optimization.
- Create and maintain dashboards, reports, and KPIs within ServiceNow to provide actionable insights to stakeholders.
- Provide data-driven recommendations to improve IT infrastructure and configuration management processes.

3. Collaboration and Process Improvement
- Work closely with IT, operations, and engineering teams to ensure the proper integration of configuration management processes with business objectives.
- Act as a subject matter expert for CMDB best practices and ServiceNow capabilities, providing training and support to teams as needed.

4. ServiceNow Development and Optimization
- Collaborate with ServiceNow developers to customize CMDB modules, workflows, and scripts based on organizational needs.
- Stay up to date with ServiceNow platform updates, features, and releases to identify opportunities for improved functionality.
- Troubleshoot and resolve technical issues related to ServiceNow CMDB operations.

Experience and Qualifications
- 3+ years of experience in configuration management, data analysis, or CMDB administration, preferably in financial services.
- Hands-on experience with ServiceNow, including CMDB module administration and customization.
- Strong understanding of ITIL principles, particularly Configuration Management.
- Proficiency in creating dashboards, reports, and workflows within ServiceNow.
- Bachelor's degree in Information Technology, Computer Science, or a related field.
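CMDB audits of the kind this role performs typically screen for missing ownership and duplicate configuration items before deeper reconciliation. A minimal Python sketch of that first pass; the CI field names are illustrative, not ServiceNow's actual CMDB schema:

```python
def audit_cmdb(cis):
    """Flag common CMDB hygiene issues in a list of CI dicts.

    Checks performed:
    - missing owner: the CI has no assigned owner
    - duplicate name: two CIs share a name (case-insensitive),
      which usually indicates a failed reconciliation
    Returns a list of (ci_name, issue) tuples for review.
    """
    issues = []
    by_name = {}
    for ci in cis:
        if not ci.get("owner"):
            issues.append((ci["name"], "missing owner"))
        by_name.setdefault(ci["name"].lower(), []).append(ci)
    for name, group in by_name.items():
        if len(group) > 1:
            issues.append((name, "duplicate name"))
    return issues
```

Inside ServiceNow the equivalent checks would run as scheduled data certification or health jobs; expressing them as code makes the audit criteria explicit and repeatable.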

Posted 2 weeks ago

Apply

3.0 - 10.0 years

10 - 11 Lacs

Gurugram

Work from Office


NAB is looking for an Analyst to join our dynamic team and embark on a rewarding career journey.
- Collect, interpret, and analyze data to help the organization make informed business decisions
- Create reports, dashboards, and visual presentations to communicate insights clearly to stakeholders
- Identify trends, patterns, and discrepancies in large datasets using statistical tools and software
- Collaborate with departments to define metrics and KPIs for ongoing monitoring
- Recommend improvements to processes, strategies, and performance based on findings
- Ensure data accuracy, maintain databases, and follow data governance and compliance standards
- Continuously research industry trends to support strategic planning and optimization efforts

Posted 2 weeks ago

Apply

1.0 - 13.0 years

13 - 14 Lacs

Pune

Work from Office


Design, develop, and maintain scalable data solutions using Starburst. Collaborate with cross-functional teams to integrate Starburst with existing data sources and tools. Optimize query performance and ensure data security and compliance. Implement monitoring and alerting systems for data platform health. Stay updated with the latest developments in data engineering and analytics.

Skills

Must have
- Bachelor's or Master's degree in a related technical field, or equivalent related professional experience.
- Prior experience as a Software Engineer applying engineering principles to improve existing systems, including leading complex, well-defined projects.
- Strong knowledge of big data languages, including SQL, Hive, Spark/PySpark, Presto and Python.
- Strong knowledge of big data platforms, such as the Apache Hadoop ecosystem, AWS EMR, Qubole, or Trino/Starburst.
- Good knowledge and experience in cloud platforms such as AWS, GCP, or Azure.
- Continuous learner with the ability to apply previous experience and knowledge to quickly master new technologies.
- Demonstrates the ability to select among available technologies to implement and solve for a need.
- Able to understand and design moderately complex systems.
- Understanding of testing and monitoring tools.
- Ability to test, debug, and fix issues within established SLAs.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Understanding of data governance and compliance standards.

Nice to have
- Data Architecture & Engineering: Design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business Intelligence & Data Visualization: Create insightful Power BI dashboards to help drive business decisions.

Other Languages: English C1 (Advanced)
Seniority: Senior
Req. VR-114886 | Data Science | BCM Industry | 05/06/2025

Posted 2 weeks ago

Apply

5.0 - 10.0 years

13 - 15 Lacs

Pune

Work from Office


Design, build, and manage data pipelines using Azure Data Integration Services (Azure Databricks, ADF, Azure Functions). Collaborate closely with the security team to develop robust data solutions that support our security initiatives. Implement, monitor, and optimize data processes, ensuring adherence to security and data governance best practices. Troubleshoot and resolve data-related issues, ensuring data quality and accessibility. Develop strategies for data acquisition and integration of new data into our existing architecture. Document procedures and workflows associated with data pipelines, contributing to best practices. Share knowledge about the latest Azure Data Integration Services trends and techniques. Implement and manage CI/CD pipelines to automate data and UI test cases and integrate testing with development pipelines. Conduct regular reviews of the system, identify possible security risks, and implement preventive measures.

Skills

Must have
- Excellent command of English
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data integration and pipeline development using Azure Data Integration Services, including Azure Data Factory and Azure Databricks.
- Hands-on experience with Python and Spark.
- Strong understanding of security principles in the context of data integration.
- Proven experience with SQL and other data query languages.
- Ability to write, debug, and optimize data transformations and datasets.
- Extensive experience in designing and implementing ETL solutions using Azure Databricks, Azure Data Factory, or similar technologies.
- Familiarity with automated testing frameworks using Squash.

Nice to have
Data Architecture & Engineering: Design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
Business Intelligence & Data Visualization: Create insightful Power BI dashboards to help drive business decisions.

Other Languages: English C1 (Advanced)
Seniority: Senior

Req. VR-114884 | Data Science | BCM Industry | 05/06/2025

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 19 Lacs

Gurugram

Work from Office


Every day, tens of millions of people come to Roblox to explore, create, play, learn, and connect with friends in 3D immersive digital experiences, all created by our global community of developers and creators. At Roblox, we're building the tools and platform that empower our community to bring any experience that they can imagine to life. Our vision is to reimagine the way people come together, from anywhere in the world, and on any device. We're on a mission to connect a billion people with optimism and civility, and we're looking for amazing talent to help us get there. A career at Roblox means you'll be working to shape the future of human interaction, solving unique technical challenges at scale, and helping to create safer, more civil shared experiences for everyone.

About the role:

The Roblox Operating System (ROS) team is responsible for the foundational technology and services that power all experiences on Roblox. This critical team ensures a seamless, performant, and reliable platform for our global community of users and developers. You will be the first Product Manager hire for our India office, reporting to Theresa Johnson, the Head of Product for ROS. You will play a pivotal role in building and enhancing our data analytics capabilities within the Roblox operating system, collaborating closely with the India-based Data Engineering team, which includes an Engineering Manager, three engineers, and multiple data scientists. This is a full-time onsite role based out of our Gurugram office.

Shift Time: 2:00 PM - 10:30 PM IST (cabs will be provided)

You will:
- Collaborate with data engineering and product engineering teams in India to build integrated analytics tooling.
- Develop cross-functional data visualization and reporting capabilities.
- Implement advanced insights extraction methodologies.
- Develop self-service data exploration tools.
- Integrate data analytics capabilities into the Roblox operating system.
- Ensure seamless data flow across organizational platforms.
- Implement cutting-edge data infrastructure solutions.
- Build a scalable data registry that will allow us to understand, register, classify and govern data across all of ROS. This will involve partnering with data engineers to build and maintain robust data pipelines integrating diverse sources like HR systems (Workday, Greenhouse), collaboration tools (Slack, Zoom), business applications (Pigment, Zendesk), and internal Roblox applications.
- Partner with Data Scientists to process and transform data into actionable insights, developing systems that generate builder development signals and promote positive behaviors.
- Contribute to achieving key outcomes such as reducing data access request resolution time by 60%, increasing self-service data exploration adoption by 75%, and achieving 99.9% data pipeline reliability.

You have:
- A Bachelor's degree or equivalent experience in Computer Science, Computer Engineering, or a similar technical field.
- 8+ years of product management experience, with a focus on data platforms, analytics, or developer tools.
- Strong understanding of data infrastructure, data warehousing, and ETL processes, including experience with data governance tools focusing on discovery, cataloging, metadata management, classification, and quality assurance.
- Proven ability to work autonomously and define product scope in ambiguous environments.
- Experience collaborating with engineering and data science teams to deliver impactful data products.
- Excellent communication and interpersonal skills, with the ability to articulate complex technical concepts to diverse audiences.

You are:
- Someone with strong product intuition for what we should be doing, rather than just following instructions.
- Highly organized, with a strong sense of urgency.
- A collaborative team player who can navigate cross-functional partnerships effectively.
- Adaptable and comfortable working in a fast-paced, evolving environment.
- A strategic thinker with a bias for action and a focus on delivering measurable results.

Roles that are based in our San Mateo, CA headquarters are in-office Tuesday, Wednesday, and Thursday, with optional in-office on Monday and Friday (unless otherwise noted).
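A data registry of the kind this role would own boils down to cataloging each dataset with its source system and sensitivity classification so that governance policies can be applied uniformly. A hedged Python sketch of the data structure; the field names and classification levels are illustrative assumptions, not Roblox's actual design:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """One row in a data registry: where a dataset comes from and how
    it is classified for governance. All fields are illustrative."""
    name: str
    source_system: str   # e.g. "Workday", "Zendesk"
    classification: str  # e.g. "public", "internal", "restricted"
    owner: str
    tags: list = field(default_factory=list)

class DataRegistry:
    """Minimal in-memory registry keyed by dataset name."""

    def __init__(self):
        self._entries = {}

    def register(self, entry: DatasetEntry):
        self._entries[entry.name] = entry

    def by_classification(self, level: str):
        """All registered datasets at a given sensitivity level,
        e.g. to audit who can access 'restricted' data."""
        return [e for e in self._entries.values()
                if e.classification == level]
```

A production registry would persist this catalog, sync it from pipeline metadata, and drive access-control decisions; the sketch only shows the core register/classify/query shape.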

Posted 2 weeks ago

Apply

5.0 - 12.0 years

18 - 20 Lacs

Bengaluru

Work from Office


Core Competences: Required and Desired Attributes
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in Azure Data Factory, Azure Databricks and Unity Catalog, Azure SQL Database, and other Azure data services.
- Strong programming skills in SQL, Python and PySpark.
- Experience in the Asset Management domain would be preferable.
- Strong proficiency in data analysis and data modelling, with the ability to extract insights from complex data sets.
- Hands-on experience in Power BI, including creating custom visuals, DAX expressions, and data modelling.
- Familiarity with Azure Analysis Services, data modelling techniques, and optimization.
- Experience with data quality and data governance frameworks, with an ability to debug, fine-tune and optimise large-scale data processing jobs.
- Strong analytical and problem-solving skills, with a keen eye for detail.
- Excellent communication and interpersonal skills, with the ability to work collaboratively in a team environment.
- Proactive and self-motivated, with the ability to manage multiple tasks and deliver high-quality results within deadlines.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

10 - 14 Lacs

Pune

Work from Office


What you'll do
- Work independently within Data and Analytics, with limited design help from your manager or senior associates
- Leverage coding best practices and advanced techniques to ensure efficient execution of code against large datasets, ensuring code is repeatable and scalable
- Run, create and optimize standard processes to ensure metrics, reports and insights are delivered consistently to stakeholders with minimal manual intervention
- Leverage knowledge of data structures to prepare data for ingestion efforts and analysis, assembling data from disparate data sources for the creation of insights; accurately integrate new and complex data sources
- Integrate Equifax, customer and third-party data to solve internal or customer analytical problems of moderate complexity and report findings to managers and stakeholders
- Review output of code for anomalies, perform analysis to determine cause, and work with Data, Analytics, Product and Technology counterparts to implement corrective measures
- Communicate the impact and importance of findings on the business (either Equifax or an external customer) and recommend an appropriate course of action
- Understand the concepts of quantitative and qualitative data and how to relate them to the customer to show the value of analysis
- Ensure proper use of Equifax data assets by working closely with data governance and compliance professionals

What experience you need
- BS degree in a STEM major or equivalent discipline
- 2-5 years of experience in a related analyst role
- Cloud certification strongly preferred
- Technical capabilities including SQL, BigQuery, R, Python, MS Excel / Google Sheets, Tableau, Looker
- Experience working as a team and collaborating with others on producing descriptive and diagnostic analysis

What could set you apart
- Cloud certification such as GCP strongly preferred
- Self-starter
- Excellent communicator / client-facing
- Ability to work in a fast-paced environment
- Flexibility to work across A/NZ time zones based on project needs
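Reviewing code output for anomalies, as this role requires, often begins with a simple z-score screen over a metric series before any root-cause analysis. A stdlib Python sketch; the threshold value is an illustrative choice, not a universal standard:

```python
import statistics

def flag_anomalies(values, z_threshold=3.0):
    """Return indices of values whose z-score exceeds the threshold.

    A quick first-pass check an analyst might run over a metric
    series (e.g. daily report row counts) to spot outliers worth
    investigating. Returns [] for constant or too-short series.
    """
    if len(values) < 2:
        return []
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]
```

This flags candidates only; deciding whether an outlier is a genuine data issue or a real business event is the analysis step the posting describes.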

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies