2484 Data Quality Jobs - Page 32

JobPe aggregates job listings for easy access; applications are completed directly on the original job portal.

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Overview:
- Drive the data quality (DQ) agenda within the PGT deployment.
- Define KPIs and metrics in alignment with the global governance lead.
- Lead data migration planning, execution, and communication.
- Own end-to-end delivery of MDG workflow functionalities.
- Integrate with the global MDG team to adopt the PGT design.
- Lead end-user training and change management.

Responsibilities:
- Drive continuous improvement of data governance and data maintenance processes for implementing countries/entities.
- Create and align data standards for master, supplemental, and transactional data objects; drive adoption of data standards and design principles to improve data consistency and bring efficiency to the migration process.
- Ensure data standards and key decisions are properly documented (e.g., KDDs, DDs, metadata, DQ rules, CRDs).
- Build capability within PepsiCo to drive a cost-efficient delivery model by reducing the delivery work for external partners.
- Provide clear documentation to project teams, management, and executive leadership across multiple geographies, supported by excellent oral and written communication skills.

Qualifications:
- Bachelor's degree required.
- 10+ years of experience with data, conversions, and interfaces.
- Demonstrated ability to communicate effectively with all levels of the organization.
- Ability to work flexible hours based on varying business requirements.
- Solves highly complex problems within the work team.
- Ability to adapt quickly to changes in timelines and sequences; manages deadline pressure, ambiguity, and change.
- Data-driven mindset with a structured analytical approach; comfortable manipulating and profiling large volumes of data (demonstrated by skills in SQL, Excel, and Access) and with data management tools such as WinShuttle, MDM/MDG, and workflow.
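As a hedged illustration of the profiling skills this role calls out (manipulating and profiling large volumes of data with SQL or Excel), here is a minimal Python sketch using pandas; the column names and records are invented, not taken from the posting:

import pandas as pd

# Hypothetical master-data extract; column names are illustrative only.
df = pd.DataFrame({
    "material_id": ["M001", "M002", "M002", None],
    "plant": ["HYD1", "HYD1", "HYD2", "HYD1"],
    "net_weight": [12.5, 0.0, 7.3, 5.1],
})

# Basic completeness and uniqueness profiling of the kind a DQ lead
# would run before a migration load.
profile = {
    "rows": len(df),
    "null_material_ids": int(df["material_id"].isna().sum()),
    "duplicate_material_ids": int(df["material_id"].dropna().duplicated().sum()),
    "zero_weight_records": int((df["net_weight"] == 0).sum()),
}
print(profile)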

Posted 2 weeks ago

Apply

9.0 - 13.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Data Quality Lead by Domain

Role Description: You will play a key role in the implementation and adoption of the data governance framework that will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. The role leverages state-of-the-art technologies, including generative AI, machine learning, and integrated data, and draws on domain, technical, and business process expertise to provide exceptional support for Amgen's data governance framework. You will work closely with business stakeholders and data analysts to ensure implementation and adoption, and collaborate with the Product Owner and other Business Analysts to ensure operational support and excellence from the team.

Roles & Responsibilities:
- Implement the data governance and data management framework for a given domain of expertise (Research, Development, Supply Chain, etc.).
- Operationalize the enterprise data governance framework and align the broader stakeholder community with its data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management.
- Work with Enterprise MDM and Reference Data teams to enforce standards and data reusability.
- Drive cross-functional alignment in your domain(s) of expertise to ensure adherence to data governance principles.
- Develop and implement data quality frameworks; design and enforce standards for data quality and governance to ensure consistent, accurate, and reliable data.
- Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, and data quality best practices and tools for assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies for the assigned domains.
- Collaborate with stakeholders; work closely with data stewards, analysts, IT, and business units to understand data requirements and address quality concerns.
- Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric), define the specifications shaping the development and implementation of data foundations.
- Build strong relationships with key business leaders and partners to ensure their needs are being met.
- Lead data quality initiatives aimed at improving data quality, including data cleansing, enrichment, and validation processes.

Must-Have Functional Skills:
- Technical skills with knowledge of pharma processes, with specialization in a domain (e.g., Research, Clinical Trials, Commercial).
- In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, and data protection.
- In-depth experience with the data product development life cycle, including enabling data dictionaries and a business glossary to increase data product reusability and data literacy.
- Customer-focused, with excellent written and verbal communication skills; able to work confidently with internal Amgen business stakeholders and external service partners on business process and technology topics.
- In-depth experience working with or supporting systems used for the data governance framework, e.g., Collibra or Alation.
- Excellent problem-solving skills and committed attention to detail in finding solutions.
- Proficiency in data analysis and quality tools (e.g., SQL, Excel, Python, or SAS).

Good-to-Have Functional Skills:
- Experience working with data governance councils or forums.
- Experience with Agile software development methodologies (Scrum).
- 3-5 years of experience in data quality management, data governance, or related roles.

Soft Skills:
- Highly organized and able to work under minimal supervision.
- Excellent analytical and assessment skills.
- Ability to work effectively with global, virtual teams.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ambitious to further develop their skills and career.
- Ability to build business relationships and understand end-to-end data use and needs.
- Excellent interpersonal skills (team player); people management skills in either a matrix or direct-line function.
- Strong verbal and written communication skills; good presentation and public speaking skills.
- High degree of initiative and self-motivation.
- Strong attention to detail, quality, time management, and customer focus.

Basic Qualifications: Any degree and 9-13 years of experience.
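The data quality frameworks this role asks the candidate to develop typically start as declarative rules evaluated against each record. A minimal sketch in plain Python, with the rule names and fields invented for illustration:

# Hypothetical DQ rules: each maps a rule name to a predicate over a record.
rules = {
    "material_id_present": lambda r: bool(r.get("material_id")),
    "status_is_known": lambda r: r.get("status") in {"active", "inactive"},
    "weight_positive": lambda r: isinstance(r.get("net_weight"), (int, float)) and r["net_weight"] > 0,
}

records = [
    {"material_id": "M001", "status": "active", "net_weight": 12.5},
    {"material_id": "", "status": "retired", "net_weight": -1},
]

# Evaluate every rule against every record and collect failures,
# the raw input for a data quality scorecard.
failures = [
    (i, name)
    for i, rec in enumerate(records)
    for name, check in rules.items()
    if not check(rec)
]
print(failures)  # [(1, 'material_id_present'), (1, 'status_is_known'), (1, 'weight_positive')]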

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Kolkata

Work from Office

Job Title: Data Engineer / Data Modeler. Location: Remote (India). Employment Type: Contract (Remote). Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps practices for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
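This Data Engineer / Data Modeler description repeats across several locations on this page, so the SQL skills it stresses (window functions, query optimization) are sketched once here. The snippet uses Python's built-in sqlite3 module (which needs SQLite 3.25+ for window functions) purely as a stand-in for Snowflake or BigQuery; the table and data are invented:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('acme', '2024-01-05', 120.0),
  ('acme', '2024-02-11', 80.0),
  ('zenith', '2024-01-20', 200.0);
""")

# Rank each customer's orders by recency with a window function --
# the same ROW_NUMBER pattern works in Snowflake and BigQuery.
rows = conn.execute("""
SELECT customer, order_date, amount,
       ROW_NUMBER() OVER (
         PARTITION BY customer ORDER BY order_date DESC
       ) AS rn
FROM orders
""").fetchall()
for r in rows:
    print(r)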

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Noida

Work from Office

Job Title: Data Engineer / Data Modeler. Location: Remote (India). Employment Type: Contract (Remote). Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps practices for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Surat

Work from Office

Job Title: Data Engineer / Data Modeler. Location: Remote (India). Employment Type: Contract (Remote). Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps practices for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

4.0 - 5.0 years

9 - 10 Lacs

Pune

Work from Office

Senior Database Administrator

Job Summary: We are looking for a seasoned Senior Database Administrator (DBA) with strong SQL Server expertise to join our IoT software team. In this role, you will manage, optimize, and scale databases that support massive volumes of machine-generated and telemetry data from connected devices. The ideal candidate is passionate about data performance, experienced in high-ingestion environments, and comfortable working in fast-paced, innovation-driven teams. This is a critical role that directly supports our mission to deliver real-time insights and intelligent automation through connected devices and edge-to-cloud platforms.

Duties and Responsibilities:
- Architect, deploy, and maintain SQL Server databases optimized for high-throughput, real-time IoT data ingestion and analytics.
- Perform advanced query tuning, index optimization, and partitioning to support time-series data and sensor data warehousing.
- Manage and optimize ETL pipelines and data flows from IoT gateways, edge devices, and cloud platforms.
- Collaborate with data engineers, DevOps, and software developers to design scalable, high-performance data structures.
- Ensure data quality, availability, and integrity across distributed and hybrid (on-premise and cloud) environments.
- Implement high availability (HA) and disaster recovery (DR) strategies, including Always On Availability Groups, log shipping, and database mirroring.
- Use monitoring tools to proactively address latency, I/O performance issues, and system health concerns.
- Secure databases and ensure compliance with industry standards and regulations, including device data protection.
- Maintain documentation, automate maintenance tasks, and continuously optimize database operations.
- Develop proofs of concept to rapidly validate design ideas.
- Other duties as required.

Requirements (mandatory):
- BA, BSc, or MSc in Computer Science, Information Systems, or another related technical discipline.
- 5+ years of hands-on experience as a SQL Server DBA, ideally in IoT, telemetry, or other data-intensive environments.
- Deep understanding of SQL Server internals, query optimization, and performance tuning.
- Expertise with high-volume, time-series, and sensor data ingestion and processing.
- Strong T-SQL scripting skills, including stored procedures and advanced query constructs.
- Experience with partitioning strategies, data archiving, and storage optimization for large datasets.
- Familiarity with edge computing concepts, streaming (e.g., RabbitMQ), or telemetry processing.
- Knowledge of cloud platforms such as Azure (especially Azure SQL and IoT Hub).
- Experience in HA/DR design, including clustering, replication, and Always On.

Preferred:
- Microsoft Certified: Azure Database Administrator Associate or equivalent.
- Exposure to time-series databases (e.g., InfluxDB, TimescaleDB) or hybrid data architectures.
- Experience integrating SQL Server with real-time data processing frameworks and IoT platforms.

Working Conditions/Other: Normal office environment; geographically distributed (virtual) team; extensive use of a computer to complete assignments. The ability to multi-task in a fast-paced environment with multiple deadlines is essential.
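A miniature of the time-series themes above (an index aligned to per-device time-range scans, plus hourly rollups) can be sketched with Python's built-in sqlite3; the production target in this role is SQL Server, and the schema below is invented:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE telemetry (
  device_id TEXT NOT NULL,
  ts        TEXT NOT NULL,   -- ISO-8601 timestamp
  reading   REAL NOT NULL
);
-- Index chosen for the dominant access path: per-device time-range scans.
CREATE INDEX ix_telemetry_device_ts ON telemetry (device_id, ts);
""")

conn.executemany(
    "INSERT INTO telemetry VALUES (?, ?, ?)",
    [("d1", "2024-01-01T00:05:00", 21.5),
     ("d1", "2024-01-01T00:35:00", 22.1),
     ("d2", "2024-01-01T00:10:00", 19.8)],
)

# Hourly rollup per device -- the GROUP BY shape a DBA would
# materialize for dashboards over high-volume sensor data.
for row in conn.execute("""
SELECT device_id, substr(ts, 1, 13) AS hour,
       COUNT(*) AS samples, AVG(reading) AS avg_reading
FROM telemetry
GROUP BY device_id, hour
ORDER BY device_id, hour
"""):
    print(row)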

Posted 2 weeks ago

Apply

4.0 - 10.0 years

18 - 20 Lacs

Pune

Work from Office

Senior Software Engineer

Job Brief: The ideal candidate will have a strong technical background, excellent problem-solving skills, and the ability to work collaboratively. This role involves designing, developing, and maintaining cloud-based applications and cloud management tools, such as self-service and cost management tools, ensuring they are secure, scalable, and efficient.

Key Responsibilities (this list may not include all duties that may be assigned):
- Design, develop, and maintain a self-service cloud cost optimization portal.
- Provide technical support and guidance to self-service users, resolving issues related to data quality, system access, and report generation.
- Develop and maintain documentation for system configurations, processes, and standard operating procedures.
- Implement and manage CI/CD pipelines.
- Ensure the security and compliance of cloud-based applications.
- Collaborate with IT and development teams to identify and implement optimal cloud solutions.
- Monitor and optimize the performance of cloud applications.
- Troubleshoot and resolve issues related to cloud infrastructure.
- Apply FinOps principles to manage and optimize cloud costs.
- Stay current with the latest cloud technologies and best practices.

Requirements:
- Five or more years of experience as a software engineer or in a similar role.
- Experience with containerization technologies such as Docker and Kubernetes.
- Proficiency in full-stack web development (back end: Python; front end: React.js or Angular).
- Familiarity with infrastructure-as-code tools such as Terraform or CloudFormation.
- Proficiency in data transformation, preparation, modeling, and visualization practices.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Bachelor's degree in computer science, engineering, or a related field.

Preferred Qualifications:
- Relevant certifications (e.g., DevOps Engineer/Architect, PCEP, full-stack development certifications, Cloud Architect/Engineer/Administrator).
- Proficiency with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with DevOps practices and tools.
- Experience with business intelligence tools such as Power BI.
- Knowledge of FinOps principles and practices.
- Experience developing cost optimization strategies for cloud environments.
- Knowledge of networking, security, and database management.
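For a flavor of the FinOps-style reporting such a cost optimization portal would surface, here is a minimal sketch that aggregates spend per service from a billing export; the CSV layout and field names are invented, and real cloud billing exports are far richer:

import csv
import io
from collections import defaultdict

# Hypothetical billing export; real providers emit much richer schemas.
billing_csv = """service,region,cost_usd
compute,us-east1,102.40
storage,us-east1,12.75
compute,europe-west1,98.10
"""

spend = defaultdict(float)
for row in csv.DictReader(io.StringIO(billing_csv)):
    spend[row["service"]] += float(row["cost_usd"])

# Rank services by spend -- the starting point for cost-optimization work.
for service, total in sorted(spend.items(), key=lambda kv: -kv[1]):
    print(f"{service}: ${total:,.2f}")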

Posted 2 weeks ago

Apply

5.0 - 12.0 years

16 - 18 Lacs

Bengaluru

Work from Office

Are you looking for an exciting opportunity to join a dynamic and growing team in a fast-paced and challenging area? This is a unique opportunity to work in the Product Owner team and partner with the business. The Data Product Owner for Data Modernization is responsible for executing critical data management activities that support strategic business and product objectives, process design, advanced analytics, and reporting, with a focus on the Work Capabilities data domain. The role involves collaborating with multiple stakeholders to ensure data is well understood, documented, and effectively utilized across the organization. As a data owner in the work capabilities team, you will support the strategic direction led by the Work Capabilities Data Domain Owner to manage data quality, governance, and risk, while fostering strong relationships with data delivery partners and consumers.

Job Responsibilities:
- Implement strategic plans to deliver data solutions that effectively support business operations and strategic objectives, ensuring alignment across the organization.
- Manage discovery efforts and market research to uncover customer solutions and integrate them into the product roadmap.
- Own, maintain, and develop a product backlog that enables development to support the overall strategic roadmap and value proposition.
- Define, describe, and register data products and offerings, leveraging strategic data dictionary tools.
- Work closely with the Product Owner and other product leadership to understand overall product priorities and champion data needs.
- Ensure data is described, available, and accessible to consumers where needed.
- Partner regularly with the Data Domain Owner to stay aligned with evolving strategies for the data domain.
- Communicate regular updates and provide feedback on the effectiveness of data strategies and execution.
- Lead both deep delivery work and level-up discussions to ascertain, formulate, and communicate a clear strategic plan and approach.
- Independently and proactively assess data to ensure requirements are clear and forward-thinking as the platform is modernized.
- Navigate a complex web of products, teams, and infrastructure that carries legacy burdens.

Required Qualifications, Capabilities, and Skills:
- 5 years of experience in product management or a related role.
- In-depth understanding of data management principles, governance frameworks, and lifecycle management, including data protection, data quality, and data classification.
- Experience managing delivery across multiple workstreams with varying timelines, priorities, and complexities.
- Experience with Agile methodologies and tools (e.g., Scrum, JIRA).
- Technical understanding of data management and governance, cloud-based data platforms, or data architecture.
- Ability to influence a culture of data ownership and accountability across client domain functions, including sales, marketing, and client service.
- Ability to review and monitor monthly data risk metrics and drive remediation efforts to address any metric breaches.
- Familiarity with data modeling and BDM, and the ability to model decision models in Signavio.
- Hands-on experience with large-scale data analysis using Excel, Alteryx, etc. to provide actionable insights.
- Proficiency in the MS Office suite and in project management, governance, and collaboration tools, including JIRA, SharePoint, and Confluence.
- Experience with rules execution engines such as Drools; ability to run ad-hoc reporting and write SQL queries.

Posted 2 weeks ago

Apply

9.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

You will play a key role in the implementation and adoption of the data governance framework that will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. The role leverages state-of-the-art technologies, including generative AI, machine learning, and integrated data, and draws on domain, technical, and business process expertise to provide exceptional support for Amgen's data governance framework. You will work closely with business stakeholders and data analysts to ensure implementation and adoption, and collaborate with the Product Owner and other Business Analysts to ensure operational support and excellence from the team.

Roles & Responsibilities:
- Implement the data governance and data management framework for the Research domain of the biopharma lifecycle.
- Operationalize the enterprise data governance framework and align the broader stakeholder community with its data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management.
- Work with Enterprise MDM and Reference Data teams to enforce standards and data reusability.
- Drive cross-functional alignment in your domain(s) of expertise to ensure adherence to data governance principles.
- Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, and data harmonization for assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies for the assigned domains.
- Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric), define the specifications shaping the development and implementation of data foundations.
- Build strong relationships with key business leads and partners to ensure their needs are being met.

Must-Have Functional Skills:
- Technical skills with knowledge of pharma processes, with specialization in the Research domain of the biopharma lifecycle.
- In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, and data protection.
- In-depth experience with the data product development life cycle, including enabling data dictionaries and a business glossary to increase data product reusability and data literacy.
- Customer-focused, with excellent written and verbal communication skills; able to work confidently with internal Amgen business stakeholders and external service partners on business process and technology topics.
- In-depth experience working with or supporting systems used for the data governance framework, e.g., Collibra or Alation.
- Excellent problem-solving skills and committed attention to detail in finding solutions.

Good-to-Have Functional Skills:
- Experience working with data governance councils or forums.
- Experience with Agile software development methodologies (Scrum).
- Proficiency in data analysis and quality tools (e.g., SQL, Excel, Python, or SAS).

Soft Skills:
- Highly organized and able to work under minimal supervision.
- Excellent analytical and assessment skills.
- Ability to work effectively with global, virtual teams.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ambitious to further develop their skills and career.
- Ability to build business relationships and understand end-to-end data use and needs.
- Excellent interpersonal skills (team player); people management skills in either a matrix or direct-line function.
- Strong verbal and written communication skills; good presentation and public speaking skills.
- High degree of initiative and self-motivation.
- Strong attention to detail, quality, time management, and customer focus.

Basic Qualifications: Any degree and 9-13 years of experience.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Work from Office

You will play a key role in the implementation and adoption of the data governance framework that will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. The role leverages state-of-the-art technologies, including generative AI, machine learning, and integrated data, and draws on domain, technical, and business process expertise to provide exceptional support for Amgen's data governance framework. You will work closely with business stakeholders and data analysts to ensure implementation and adoption, and collaborate with the Product Owner and other Business Analysts to ensure operational support and excellence from the team.

Roles & Responsibilities:
- Implement the data governance and data management framework for a given domain of expertise (Research, Development, Supply Chain, etc.).
- Operationalize the enterprise data governance framework and align the broader stakeholder community with its data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management.
- Work with Enterprise MDM and Reference Data teams to enforce standards and data reusability.
- Drive cross-functional alignment in your domain(s) of expertise to ensure adherence to data governance principles.
- Create and maintain privacy policies and procedures to protect sensitive data and ensure compliance.
- Conduct regular privacy risk assessments and audits to identify and mitigate potential risks as required.
- Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, and data harmonization for assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies for the assigned domains, including GDPR, CCPA, and other relevant legislation.
- Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric), define the specifications shaping the development and implementation of data foundations.
- Build strong relationships with key business leads and partners to ensure their needs are being met.

Must-Have Functional Skills:
- Technical skills with knowledge of pharma processes, with specialization in a domain (e.g., Research, Clinical Trials, Commercial).
- In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, and data protection.
- Strong understanding of data protection laws and regulations, including GDPR, CCPA, and other relevant legislation.
- In-depth experience with the data product development life cycle, including enabling data dictionaries and a business glossary to increase data product reusability and data literacy.
- Customer-focused, with excellent written and verbal communication skills; able to work confidently with internal Amgen business stakeholders and external service partners on business process and technology topics.
- In-depth experience working with or supporting systems used for the data governance framework, e.g., Collibra or Alation.
- Excellent problem-solving skills and committed attention to detail in finding solutions.

Good-to-Have Functional Skills:
- Experience working with data governance councils or forums.
- Experience with Agile software development methodologies (Scrum).
- Proficiency in data analysis and quality tools (e.g., SQL, Excel, Python, or SAS).
- 3-5 years of experience in data privacy, compliance, or a related field.

Soft Skills:
- Integrity: commitment to maintaining the highest ethical standards and protecting confidential information.
- Adaptability: ability to adapt to changing regulations and emerging privacy challenges.
- Proactivity: self-motivated, with a proactive approach to identifying and addressing privacy issues.
- Leadership: strong leadership skills and the ability to influence and drive change within the organization.
- Highly organized and able to work under minimal supervision.
- Excellent analytical and assessment skills.
- Ability to work effectively with global, virtual teams.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ambitious to further develop their skills and career.
- Ability to build business relationships and understand end-to-end data use and needs.
- Excellent interpersonal skills (team player); people management skills in either a matrix or direct-line function.
- Strong verbal and written communication skills; good presentation and public speaking skills.
- High degree of initiative and self-motivation.
- Strong attention to detail, quality, time management, and customer focus.

Basic Qualifications: Any degree and 9-13 years of experience.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Create and execute test plans and UAT scripts for EDC systems (Veeva CDMS/Rave EDC). Ensure build quality, validate integrations, document results, and collaborate cross-functionally. Expertise in clinical data systems, Agile methodologies, and software testing.
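UAT scripts for an EDC build often reduce to scripted checks over exported form data. The following Python sketch is hypothetical only: the field names, statuses, and expected values are invented, and real Veeva CDMS or Rave validation would run against the vendor's own export formats:

# Hypothetical EDC export rows, as a UAT script might receive them.
export = [
    {"subject": "S-001", "visit": "SCREENING", "form": "DM", "status": "complete"},
    {"subject": "S-002", "visit": "SCREENING", "form": "DM", "status": "open"},
]

def check_all_screening_dm_complete(rows):
    """UAT check: every subject's screening demographics form is complete."""
    return [r["subject"] for r in rows
            if r["visit"] == "SCREENING" and r["form"] == "DM"
            and r["status"] != "complete"]

failures = check_all_screening_dm_complete(export)
print("PASS" if not failures else f"FAIL: incomplete DM forms for {failures}")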

Posted 2 weeks ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Who We Are: At Kyndryl, we design, build, manage, and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward, always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers, and our communities.

The Role: As a GCP Data Engineer at Kyndryl, you will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment using GCP data services. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs.

Responsibilities:
- Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Participate in architectural discussions, conduct system analysis, and propose optimal solutions that are scalable, future-proof, and aligned with business requirements.
- Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs.
- Design data models suitable for both transactional and big data environments, supporting machine learning workflows.
- Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services.
- Develop and maintain Python/PySpark code for data processing and integrate with GCP services for seamless data operations.
- Develop and optimize SQL queries for data analysis and reporting.
- Monitor and troubleshoot data pipeline issues to ensure timely resolution.
- Implement data governance and security best practices within GCP.
- Perform data quality checks and validation to ensure accuracy and consistency (a brief sketch follows this listing).
- Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines.
- Provide design expertise in Master Data Management (MDM), data quality, and metadata management.
- Provide technical support and guidance to junior data engineers and other team members.
- Participate in code reviews and contribute to continuous improvement of data engineering practices.
- Implement best practices for cost management and resource utilization within GCP.

If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential.

Your Future at Kyndryl: Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are: You're good at what you do and possess the required experience to prove it. Equally important, you have a growth mindset, keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.

Required Technical and Professional Experience:
- Bachelor's or master's degree in computer science, engineering, or a related field, with over 8 years of experience in data engineering.
- More than 3 years of experience with the GCP data ecosystem.
- Hands-on experience and strong proficiency in GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
- Excellent command of SQL, with the ability to write complex queries and perform advanced data transformation.
- Strong programming skills in PySpark and/or Python, specifically for building cloud-native data pipelines.
- Familiarity with GCP tools such as Looker, Airflow DAGs, Data Studio, and App Maker.
- Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
- Knowledge of data governance, security, and compliance best practices.
- Experience with private and public cloud architectures, their pros and cons, and migration considerations.
- Excellent problem-solving, analytical, and critical thinking skills.
- Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail.
- Communication skills: able to communicate with both technical and non-technical audiences and to derive technical requirements with stakeholders.
- Ability to work independently and in agile teams.

Preferred Technical and Professional Experience:
- GCP Data Engineer certification is highly preferred.
- Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization.
- Experience working as a data engineer and/or in cloud modernization.
- Knowledge of Databricks or Snowflake for data analytics.
- Experience with NoSQL databases.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Familiarity with BI dashboards; Google Data Studio is a plus.

Being You: Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you, and everyone next to you, the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect: With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter, wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred! If you know someone who works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
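The data quality checks mentioned in the responsibilities above can be sketched briefly in PySpark. This assumes a local pyspark installation; the columns and rules are invented for illustration, and a real pipeline would read from BigQuery or Cloud Storage rather than building the frame in memory:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()

# Hypothetical pipeline input.
df = spark.createDataFrame(
    [("evt-1", "2024-01-01", 10.0), ("evt-2", None, -3.5)],
    ["event_id", "event_date", "value"],
)

# Completeness and a simple range rule, computed in one pass.
report = df.agg(
    F.count("*").alias("rows"),
    F.sum(F.when(F.col("event_date").isNull(), 1).otherwise(0)).alias("null_dates"),
    F.sum(F.when(F.col("value") < 0, 1).otherwise(0)).alias("negative_values"),
).collect()[0]
print(report.asDict())

spark.stop()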

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Data Cleansing & Integration Project Delivery:
- Execute high-visibility data programs as assigned by the Data Cleansing Manager.
- Utilize SAP data load solutions such as SAP Migration Cockpit and LSMW for data loading and template creation.

FDO Data Change Management Methodology:
- Assist in defining data cleansing approaches using Mass Change functionality.
- Develop and prepare data cleansing strategies.

Data Cleansing & Integration Technical Guidance:
- Understand the SAP landscape and data flow to underlying and consuming systems to prevent data synchronization issues.

Data Quality:
- Collaborate with the Data Quality (DQ) team to define DQ rules and enhance visibility of existing data quality.

Data Governance:
- Work with the Data Governance (DG) team to ensure proper governance before implementing system changes.
- Conduct necessary data load testing in test systems.

Data Sourcing:
- Maintain and update the data catalogue/data dictionary, creating a defined list of data sources indicating the best versions (golden copies).

Data Ingestion:
- Collaborate with DG and project teams on data harmonization by integrating data from multiple sources.
- Develop sustainable integration routines and methods.

Qualifications:
- Experience: Minimum of 6 years in data-related disciplines such as data management, quality, and cleansing.
- Technical skills: Proven experience delivering data initiatives (cleansing, integration, migrations) using established technical data change methodologies; proficiency in handling large data sets with tools like Microsoft Excel and Power BI; experience with SAP native migration and cleansing tools such as SAP Migration Cockpit, LSMW, and MASS; knowledge of master data management in SAP MDG, SAP ECC, and associated data structures.
- Collaboration: Ability to work effectively with internal cross-functional teams.
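Before a Migration Cockpit or LSMW load, cleansing often happens as a spreadsheet-like pass over the extract. A minimal pandas sketch, with SAP-style vendor-master field names (LIFNR, NAME1) used only as illustrative stand-ins for a real extract:

import pandas as pd

# Hypothetical vendor master extract awaiting cleansing before load.
df = pd.DataFrame({
    "LIFNR": [" 100001", "100002", "100002", "100003 "],
    "NAME1": ["Acme Ltd ", "acme ltd", "Acme Ltd", "Zenith GmbH"],
})

# Typical pre-load cleansing: trim whitespace, then flag and drop
# exact duplicates on the key field.
df["LIFNR"] = df["LIFNR"].str.strip()
df["NAME1"] = df["NAME1"].str.strip()
dupes = df[df.duplicated(subset="LIFNR", keep=False)]
clean = df.drop_duplicates(subset="LIFNR", keep="first")

print("flagged for review:\n", dupes)
print("load-ready rows:", len(clean))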

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Kanpur

Remote

Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps practices for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Hyderabad

Remote

Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps practices for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.

As part of the Infosys testing team, your primary role would be to anchor testing requirements, develop test strategy, track and monitor project plans, and review test plans, test cases, and test scripts. You will develop project quality plans and validate defect-prevention plans to deliver effective testing solutions for clients. You will ensure the right test environment is available and provide review feedback on test data setup to ensure timely commencement of test execution. You will validate go-live activities, such as production verification, to ensure the application runs in the production environment without issues. In addition, you will mentor the team and provide regular feedback and coaching to help them continually improve their performance. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- End-to-end knowledge of and experience in testing.
- Extensive experience in test planning/test strategy and test estimates.
- Excellent communication and client-handling skills.
- Experience in one or more scripting languages and automation tools.
- Analytical, client-interfacing, and stakeholder management skills.
- Knowledge of SDLC and agile methodologies.
- Project and team management.

Technical and Professional Requirements:
Primary skills: Cloud testing->AWS Testing, Data Services->DWT (Data Warehouse Testing)/(ETL), Data Services->TDM (Test Data Management), Data Services->TDM (Test Data Management)->Delphix, Data Services->TDM (Test Data Management)->IBM Optim, Database->PL/SQL, Package testing->MDM, Python.
Desirable: Bigdata->Python.
Preferred skills: Technology->ETL & Data Quality->ETL & Data Quality - ALL.

Educational Requirements: Bachelor of Engineering.
Service Line: Infosys Quality Engineering.
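A staple of the ETL and data warehouse testing this role covers is source-to-target reconciliation. A minimal sketch with Python's built-in sqlite3, using invented tables to stand in for the source system and the warehouse target:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);   -- row 3 missing
""")

# Reconciliation checks: row counts and summed measures must match
# between source and target after the ETL run.
src_count, src_sum = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
tgt_count, tgt_sum = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()

print("count match:", src_count == tgt_count)   # False -> defect to raise
print("amount match:", src_sum == tgt_sum)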

Posted 2 weeks ago

Apply

9.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Role Description: You will play a key role in the implementation and adoption of the data governance framework that will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. The role leverages state-of-the-art technologies, including generative AI, machine learning, and integrated data, and draws on domain, technical, and business process expertise to provide exceptional support for Amgen's data governance framework. You will work closely with business stakeholders and data analysts to ensure implementation and adoption, and collaborate with the Product Owner and other Business Analysts to ensure operational support and excellence from the team.

Roles & Responsibilities:
- Implement the data governance and data management framework for the General and Administrative operations (G&A) domain of the biopharma lifecycle.
- Operationalize the enterprise data governance framework and align the broader stakeholder community with its data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management.
- Work with Enterprise MDM and Reference Data teams to enforce standards and data reusability.
- Drive cross-functional alignment in your domain(s) of expertise to ensure adherence to data governance principles.
- Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, and data harmonization for assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies for the assigned domains.
- Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric), define the specifications shaping the development and implementation of data foundations.
- Build strong relationships with key business leads and partners to ensure their needs are being met.

Must-Have Functional Skills:
- Technical skills with knowledge of pharma processes, with specialization in the General and Administrative operations (G&A) domain of the biopharma lifecycle.
- In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, and data protection.
- In-depth experience with the data product development life cycle, including enabling data dictionaries and a business glossary to increase data product reusability and data literacy.
- Customer-focused, with excellent written and verbal communication skills; able to work confidently with internal Amgen business stakeholders and external service partners on business process and technology topics.
- In-depth experience working with or supporting systems used for the data governance framework, e.g., Collibra or Alation.
- Excellent problem-solving skills and committed attention to detail in finding solutions.

Good-to-Have Functional Skills:
- Experience working with data governance councils or forums.
- Experience with Agile software development methodologies (Scrum).
- Proficiency in data analysis and quality tools (e.g., SQL, Excel, Python, or SAS).

Soft Skills:
- Highly organized and able to work under minimal supervision.
- Excellent analytical and assessment skills.
- Ability to work effectively with global, virtual teams.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ambitious to further develop their skills and career.
- Ability to build business relationships and understand end-to-end data use and needs.
- Excellent interpersonal skills (team player); people management skills in either a matrix or direct-line function.
- Strong verbal and written communication skills; good presentation and public speaking skills.
- High degree of initiative and self-motivation.
- Strong attention to detail, quality, time management, and customer focus.

Basic Qualifications: Any degree and 9-13 years of experience.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer / Data Modeler. Location: Remote (India). Employment Type: Contract (Remote). Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps practices for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Jaipur

Work from Office

Job Title: Data Engineer / Data Modeler. Location: Remote (India). Employment Type: Contract (Remote). Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps practices for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Chennai

Remote

Naukri logo

Job Title : Data Engineer / Data Modeler
Employment Type : Contract (Remote)
Experience Required : 7+ Years

Job Summary :
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Lucknow

Remote

Naukri logo

Job Title : Data Engineer / Data Modeler
Employment Type : Contract (Remote)
Experience Required : 7+ Years

Job Summary :
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

20 - 27 Lacs

Gurugram

Work from Office

Naukri logo

Transport Commercial / Data Analyst

The role requires strong multitasking skills to handle multiple opportunities at the same time, ensuring each bid preparation receives the necessary attention and meets its deadline. We are seeking motivated individuals to join the Transport Commercial team and help us drive growth within the market. This position involves a variety of tasks, working with the team to support performance measurement and reporting, data analysis, and market knowledge through CRM.

Key responsibilities include:
- Gathering and analyzing data from various sources such as CRM and additional databases to generate comprehensive reports and dashboards for the Transport Commercial team and the Transport Leadership Team.
- Working with our team and Opportunity Owners to support and develop the use of tools and processes for effective tendering, including the use of AI.
- Conducting market intelligence tasks to support business decisions by performing online market research and developing tools to optimize this research, turning data insights into effective business intelligence.
- Implementing automation processes to improve data quality and visualization through charts, views, and interactive dashboards, to support strategic planning and decision-making.
- Supporting live tenders as necessary, in particular with the preparation of governance and progress report documentation.
- Establishing and nurturing relationships with internal stakeholders.

Qualification
This role is ideal for an experienced graduate passionate about managing business operations and driving growth within the Transport Commercial sector. If you have a proactive mindset, strong analytical skills, and a keen interest in this field, we encourage you to apply.
- Bachelor's degree in Business Administration, Economics, Engineering, Data Science, or a related field.
- Demonstrated skills in business development software such as MS Office, Power BI, and CRM systems.
- Experience with scripting tools (e.g., SQL, Python) would be advantageous.
- Strong writing and presentation skills.
- Ability to multitask, managing multiple opportunities simultaneously while meeting deadlines.
- Excellent networking skills and a global mindset to establish and nurture relationships.
- Proactive approach and excellent collaboration skills.
- Experience in data analysis, market intelligence, and business decision support.
- Familiarity with automation processes and data visualization techniques.
- Previous experience in a similar role is preferred.
- Interest in business management and work-winning strategies.

Additional Information
What we can offer you:
- Investment in your development
- Leaders you can count on, guided by our Leadership Principles
- Be valued for the unique person you are
- Never be short of inspiration from colleagues, clients, and projects
- The long-term thinking of a foundation-owned company
- Work at the heart of sustainable change

Ramboll is a global architecture, engineering, and consultancy company. We believe that the purpose of sustainable change is to create a thriving world for both nature and people. So, that's where we start – and how we work. At Ramboll, our core strength is our people, and our history is rooted in a clear vision of how a responsible company should act. Being open and curious is a cornerstone of our culture. We embrace an inclusive mindset that looks for fresh, diverse, and innovative perspectives. We respect, embrace, and invite diversity in all forms to actively cultivate an environment where everyone can flourish and realise their full potential.

Ready to join us? Please submit your application, and be sure to include all relevant documents, including your CV and cover letter. Thank you for taking the time to apply! We look forward to receiving your application.

About Ramboll
Founded in Denmark, Ramboll is a foundation-owned people company. We have more than 18,000 experts working across our global operations in 35 countries. Our experts are leaders in their fields, developing and delivering innovative solutions in diverse markets including Buildings, Transport, Planning & Urban Design, Water, Environment & Health, Energy, and Management Consulting. We invite you to contribute to a more sustainable future working in an open, collaborative, and empowering company. Combining local experience with global knowledge, we together shape the societies of tomorrow.

Equality, diversity, and inclusion are at the heart of what we do. We believe in the strength of diversity and know that unique experiences and perspectives are vital for creating truly sustainable societies. Therefore, we are committed to providing an inclusive and supportive work environment where everyone can flourish and reach their potential. We welcome applications from candidates of all backgrounds and encourage you to contact our recruitment team to discuss any accommodations you need during the application process.
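
To make the "scripting tools" bullet above concrete, here is a small, purely hypothetical pandas example of the kind of CRM reporting automation the responsibilities describe; the file name and column names are assumptions, not Ramboll systems.

import pandas as pd

def pipeline_summary(csv_path: str) -> pd.DataFrame:
    """Aggregate opportunity count and value by market and stage."""
    df = pd.read_csv(csv_path)
    return (
        df.groupby(["market", "stage"])
          .agg(opportunities=("opportunity_id", "count"),
               total_value=("value_eur", "sum"))
          .reset_index()
          .sort_values("total_value", ascending=False)
    )

if __name__ == "__main__":
    # A CRM export dropped to CSV; in practice this might be scheduled
    # and the output fed into a Power BI dashboard.
    print(pipeline_summary("crm_opportunities.csv"))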

Posted 2 weeks ago

Apply

12.0 - 15.0 years

12 - 17 Lacs

Mumbai, Hyderabad

Work from Office

Naukri logo

We are seeking an experienced Product Manager - Data Management to lead the development and adoption of our 3rd party data platforms, including D&B and other similar platforms. The successful candidate will be responsible for driving the integration and utilization of 3rd party data across marketing campaigns, improving data quality and accuracy, and expanding the use cases and applications for 3rd party data.

About the Role
- Develop and execute a comprehensive strategy for 3rd party data platform adoption and expansion across the organization, with a focus on driving business outcomes and improving marketing effectiveness.
- Collaborate with marketing teams to integrate 3rd party data into their campaigns and workflows, and provide training and support to ensure effective use of the data.
- Develop and showcase compelling use cases that demonstrate the value of 3rd party data in improving marketing effectiveness, and measure the success of these use cases through metrics such as adoption rate, data quality, and marketing ROI.
- Develop and maintain a roadmap for 3rd party data platform adoption and expansion across the organization, with a focus on expanding use cases and applications for 3rd party data and developing new data-driven products and services.
- Monitor and measure the effectiveness of 3rd party data in driving business outcomes, and adjust the adoption strategy accordingly.
- Work with cross-functional teams to ensure data quality and governance, and develop and maintain relationships with 3rd party data vendors to ensure seamless data integration and delivery.
- Drive the development of new data-driven products and services that leverage 3rd party data, and collaborate with stakeholders to prioritize and develop these products and services.

Shift Timings: 2 PM to 11 PM (IST). Work from office for 2 days a week (mandatory).

About You
- 12+ years of experience in data management, product management, or a related field.
- Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field.
- Experience with data management tools such as data warehousing, ETL (Extract, Transform, Load), data governance, and data quality.
- Understanding of the Marketing domain and data platforms such as Treasure Data, Salesforce, Eloqua, 6Sense, Alteryx, Tableau, and Snowflake within a MarTech stack.
- Experience with machine learning and AI frameworks (e.g., TensorFlow, PyTorch).
- Expertise in SQL and Alteryx.
- Experience with data integration tools and technologies such as APIs, data pipelines, and data virtualization.
- Experience with data quality and validation tools and techniques such as data profiling, data cleansing, and data validation.
- Strong understanding of data modeling concepts, data architecture, and data governance.
- Excellent communication and collaboration skills.
- Ability to drive adoption and expansion of D&B data across the organization.
- Certifications in data management, data governance, or data science are nice to have.
- Experience with cloud-based data platforms (e.g., AWS, GCP, Azure) is nice to have.
- Knowledge of machine learning and AI concepts, including supervised and unsupervised learning, neural networks, and deep learning, is nice to have.

What's in it For You
- Hybrid Work Model: We have adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial well-being.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
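
As a purely illustrative example of the data-validation skills listed above, the sketch below computes a third-party enrichment match rate with pandas, the kind of "data quality" metric the role would track for D&B adoption. All file and column names, including the match file layout, are assumptions.

import pandas as pd

def enrichment_coverage(accounts_csv: str, enriched_csv: str) -> float:
    """Percent of internal accounts matched to a third-party DUNS number."""
    accounts = pd.read_csv(accounts_csv)   # internal account master
    enriched = pd.read_csv(enriched_csv)   # vendor match file
    merged = accounts.merge(
        enriched[["account_id", "duns_number"]],
        on="account_id", how="left",
    )
    # Rows with a DUNS number were successfully enriched.
    return round(merged["duns_number"].notna().mean() * 100, 1)

if __name__ == "__main__":
    rate = enrichment_coverage("accounts.csv", "dnb_matches.csv")
    print(f"D&B match rate: {rate}%")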

Posted 2 weeks ago

Apply

0.0 - 2.0 years

18 - 20 Lacs

Gurugram

Work from Office

Naukri logo

With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you will learn and grow as we help you create a career journey that is unique and meaningful to you, with benefits, programs, and flexibility that support you personally and professionally. At American Express, you will be recognized for your contributions, leadership, and impact; every colleague can share in the company's success. Together, we will win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we will do it with the utmost integrity, and in an environment where everyone is seen, heard, and feels like they belong. Join Team Amex and let us lead the way together.

Amex Flex: We back our colleagues with the support they need to thrive, professionally and personally. That is why we have Amex Flex, our enterprise working model, which provides greater flexibility to colleagues while ensuring we preserve the important aspects of our unique in-person culture. Depending on role and business needs, colleagues will work onsite in a hybrid model.

Business Overview:
The Credit and Fraud Risk (CFR) team helps drive profitable business growth by reducing the risk of fraud and maintaining industry-lowest credit loss rates. It utilizes an array of tools and ever-evolving technology to detect and combat fraud, minimize the disruption of good spending, and provide a world-class customer experience. The team leads efforts that leverage data and digital advancements to improve risk management, enable commerce, and drive innovation. A single decision can have many outcomes. And when that decision affects millions of card members and merchants, it needs to be the right one. That is where our Product teams come in. Product teams are the backbone of all financial services operations at American Express; their work impacts every aspect of the company. As a part of this team, you will work with the industry's best talent to create smart and innovative strategies that advance our market share and the way we do business. If you are interested in getting to know all areas of our business and can translate our business needs into remarkable solutions, you should consider a career in Product teams.

Job Responsibilities:
There is a diverse set of roles within the Product job family, with varying responsibilities and skill requirements. A brief description of the roles and skills is outlined below:
(1) Product Development: Develop next-generation software products and solutions to solve complex business problems using the latest tools and technologies. Collaborate with multiple business stakeholders, technology teams, and other product teams to build and iterate on products that directly impact millions of customers and prospects. Manage the implementation of critical products; drive global, reusable, and configurable design, rule authoring, testing, integration, and product launch using low-code tools. This cluster includes a diverse set of roles, with varying requirements on technical acumen from low-code tools to pro-code programming skills.
(2) Product Management: Solve complex business problems through ideation, development, and ownership of next-generation technology products and solutions that meet business objectives in a fast-changing, dynamic economic environment. Support execution of all product lifecycle processes including market research, competitive analysis, planning, positioning, roadmap development, requirements development, and product launch.
(3) Data Steward: Manage end-to-end ownership of enterprise data assets that are used in making business decisions for millions of customers and billions of transactions across the globe. Develop strong subject matter expertise on both internal and external data assets. Act as the custodian for data standardization, data governance, data quality, and data ownership, while ensuring compliance and security of the data. Build strong relationships, operate effectively within large cross-functional teams, and influence business stakeholders to drive change.
(4) Data Governance: Plan or facilitate the execution of data risk management and governance requirements to ensure compliance of CFR data with enterprise governance and data-related policies. Collaborate closely with policy owners, enterprise governance and product teams, CFR Data Stewards, and data custodians (and/or Operational Excellence teams) to execute requirements for managing data risk, and provide subject matter expertise for remediation of data risk issues. Demonstrate a deeper understanding of the evolving risk management space and bring external best practices in-house.

The selected candidate will be allocated to one of these roles depending on fitment and business needs. The role will entail some of the responsibilities below:
- Develop robust data management, data integration, and data quality processes by leveraging best-in-class technology.
- Innovate with a focus on developing newer and better approaches using big data technologies.
- Find innovative techniques to bring scale to critical initiatives and enhance productivity.
- Manage world-class data products by partnering with enterprise teams including Technology, Design, and End-Users to enable the building of new capabilities and modules and the maintenance of existing assets.

Qualifications and Skills Required:
- 0-2 years of relevant experience preferred.
- Strong analytical and problem-solving skills.
- Hands-on experience with big data and SQL preferred.
- Effective communication and interpersonal skills.
- Ability to work effectively in a team environment.
- Ability to learn quickly and work independently with complex, unstructured initiatives.
- Ability to challenge the status quo and drive innovation.
- Good programming skills; knowledge of GCP-native tools and other platforms preferred.
- Prior experience in product development, data analytics, governance, or stewardship is an added advantage.

Job Location: Hybrid (Gurgaon)
Timings: Mon-Wed: 1 - 9:30pm IST; Thu: 11 - 7:30pm IST; Fri: 8:30am - 5pm IST

Campus Benefits:
- Competitive base salaries
- Flexible work arrangements and schedules with hybrid options
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counselling support through our Healthy Minds program
- Career development and training opportunities

At American Express, you will be recognized for your contributions, leadership, and impact; every colleague can share in the company's success. Together, we will win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we will do it with the utmost integrity, and in an environment where everyone is seen, heard, and feels like they belong.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities
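
Purely as an illustration of the "big data and SQL" plus GCP-native skills the qualifications mention, here is a minimal completeness check run in BigQuery. It assumes the google-cloud-bigquery client library with application-default credentials; the project, dataset, table, and column names are placeholders, not American Express systems.

from google.cloud import bigquery

NULL_RATE_SQL = """
SELECT COUNTIF(merchant_id IS NULL) / COUNT(*) AS merchant_id_null_rate
FROM `my_project.cfr_demo.transactions`
"""

def merchant_id_null_rate() -> float:
    """Share of transaction rows missing a merchant identifier."""
    client = bigquery.Client()  # project inferred from the environment
    result = client.query(NULL_RATE_SQL).result()
    row = next(iter(result))
    return row["merchant_id_null_rate"]

if __name__ == "__main__":
    print(f"NULL rate for merchant_id: {merchant_id_null_rate():.2%}")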

Posted 2 weeks ago

Apply

10.0 - 18.0 years

8 - 12 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Stakeholder Collaboration & Business Engagement
- Participate and engage in Data Quality implementation discussions with business and technical product owners in procurement.
- Participate in architecture discussions around the Data Quality framework.
- Collaborate with business users from procurement, P2P finance, supply chain, etc., to capture business rules.
- Provide business requirements to the BODS / LSMW team to enable data loads.
- Review and track exceptions for data quality rules and incorporate the logic in the tool.
- Identify and investigate continuous improvement opportunities in the data standards, data quality rules, and data maintenance processes for each master data object by working with Business and other MDM teams as required.
- Capture sign-off from key stakeholders at critical stages of the project.

Data Quality Rules & Standards Definition
- Define and document data quality rules based on business requirements, functional specifications, and compliance needs.
- Derive technical definitions for agreed business rules / data quality rules to support tool development.
- Support rule development in the DQ tool.
- Review and validate rule output and inform the business.

Data Profiling, Cleansing & Monitoring
- Create and maintain data quality dashboards and reports to track improvements and errors.
- Review data profiling results for supplier master data and identify opportunities for data cleansing.
- Support the design and execution of the Data Quality Framework.
- Perform root cause analysis and recommend process changes and system controls.
- Perform pre-validation and post-validation as part of data cleansing execution.

Supplier/Vendor Master Data Management (SAP)
- Manage and maintain supplier master data in SAP where BODS cannot be used.
- Ensure data accuracy, consistency, and completeness across all supplier master records.
- Drive discussions on identifying obsolete records and define the deactivation criteria.

Data Migration, Integration & Tool Support
- Work closely with IT teams to support data migration, cleansing, and validation activities during SAP projects or enhancements.
- Perform process analysis and translate business issues and requirements into actionable plans to improve the master data process from a tool and programming perspective.
- Recommend system and process enhancements to improve data quality and governance practices.

Skills required
- 10+ years' experience with SAP vendor master objects, along with knowledge of SAP T-codes and fields.
- Hands-on experience in creating, maintaining, and validating supplier/vendor master data in SAP.
- Good understanding of the purchase order, invoice, and payment processes within P2P.
- Familiarity with industry-specific rules in procurement, P2P finance, and various vendor types.
- Knowledge of supplier/vendor lifecycle processes (creation, change, extension, obsolescence, deletion).
- Strong skills in data quality rule definition, documentation, and enforcement.
- Experience with data profiling, cleansing, and standardization techniques.
- Ability to perform root cause analysis of data issues and recommend remediation.
- Working knowledge of SAP BODS (BusinessObjects Data Services) for ETL, transformation, and data quality operations.
- Ability to provide functional input for BODS developers - mapping rules, logic, validation.
- Hands-on experience with SAP LSMW - recording, field mapping, conversion rules, batch input methods.
- Understanding of data load processes and best practices for data migration in SAP.
- Strong skills in data validation techniques - pre-load and post-load checks, reconciliation, exception reporting.
- Ability to interpret and analyze SAP data using SE16N, SQVI, and custom reports.
- Ability to document data standards, data definitions, and quality rules.
- Technical skills in exploring, analyzing, profiling, and manipulating data sets through Excel, Power BI, SAP IS, Tableau, etc.
- Good understanding of Ariba, SAP ECC, and SAP MDG.
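
To illustrate the pre-load/post-load reconciliation skills listed above, here is a small, hypothetical pandas sketch that compares a legacy vendor extract with a post-load SAP extract and reports exceptions. The file and field names are assumptions; in practice the post-load side might come from an SE16N export rather than a CSV.

import pandas as pd

KEY = "vendor_number"
CHECK_FIELDS = ["name", "country", "payment_terms"]

def reconcile(legacy_csv: str, sap_csv: str) -> pd.DataFrame:
    """Return one exception row per missing record or field mismatch."""
    legacy = pd.read_csv(legacy_csv, dtype=str)
    sap = pd.read_csv(sap_csv, dtype=str)
    merged = legacy.merge(sap, on=KEY, how="outer",
                          suffixes=("_legacy", "_sap"), indicator=True)
    exceptions = []
    # Records present on only one side: load failures or unexpected extras.
    for _, row in merged[merged["_merge"] != "both"].iterrows():
        exceptions.append({KEY: row[KEY], "issue": str(row["_merge"])})
    # Field-level mismatches on records present in both extracts.
    both = merged[merged["_merge"] == "both"]
    for field in CHECK_FIELDS:
        diff = both[both[f"{field}_legacy"] != both[f"{field}_sap"]]
        for _, row in diff.iterrows():
            exceptions.append({KEY: row[KEY], "issue": f"mismatch:{field}"})
    return pd.DataFrame(exceptions)

if __name__ == "__main__":
    report = reconcile("legacy_vendors.csv", "sap_vendors_postload.csv")
    report.to_csv("reconciliation_exceptions.csv", index=False)
    print(f"{len(report)} exceptions written")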

Posted 2 weeks ago

Apply