Home
Jobs

2452 Data Quality Jobs - Page 16

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

8 - 16 Lacs

Pune

Remote

Naukri logo

Rudder Analytics is looking for a Senior BI Developer (Tableau / Power BI) in Pune, with 4-8 years of experience. Please see details at https://shorturl.at/7O9fa for job code BI-SA-01. Required candidate profile: a knack for professional design layouts and visual storytelling; precision and attention to detail; ability to lead a team and manage projects independently.

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Gurugram

Work from Office


Not Applicable. Specialism: Risk. Summary: A career within Internal Audit services will provide you with an opportunity to gain an understanding of an organisation's objectives, regulatory and risk management environment, and the diverse needs of its critical stakeholders. We focus on helping organisations look deeper and see further, considering areas like culture and behaviours, to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to the organisation.

Why PwC: PricewaterhouseCoopers is a multinational professional services network of firms, operating as partnerships under the PwC brand. PwC ranks as the second-largest professional services network in the world and is considered one of the Big Four accounting firms, along with Deloitte, EY and KPMG. PwC offers a diverse and exciting approach to development which puts you in the driver's seat: driving your development and growth means you have the opportunity to learn from the colleagues and clients around you through on-the-job experiences.

Brief note on the requirement: Risk Assurance Services (RAS) is one of PwC's high-growth verticals. It supports clients in defining their strategy, formulating business objectives and managing performance while achieving a balance between risk and opportunity or return. Our services within the Risk Assurance practice cover the entire risk and controls spectrum across Internal Audit, Governance, Risk & Controls, Contract & Compliance, Data Analytics, etc.

Technical Skills: Experience in Internal Audit / process audit concepts and methodology; processes, sub-processes, and activities, as well as their relationships; proficiency in MS Office; Sarbanes-Oxley Act (SOX) / IFC reviews, SOPs; internal control concepts (e.g., preventive controls, detective controls, risk assessment, anti-fraud controls).

Soft Skills: Clarity of thought, articulation, and expression; takes ownership, sincere and focused on execution; confident, with good verbal communication skills; ability to organize, prioritize, and meet deadlines.

Mandatory skill sets: Internal Audit. Preferred skill sets: Internal Audit. Years of experience required: 7 to 12 years. Education qualification: MBA / M.Com / MCA / CA. Degrees/Field of Study required: Chartered Accountant Diploma, Master of Business Administration. Required Skills: Internal Auditing. Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Analytical Thinking, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Coaching and Feedback, Communication, Compliance Auditing, Corporate Governance, Creativity, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Financial Accounting {+ 29 more}. Travel Requirements: Up to 60%.

Posted 1 week ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Job Description: We are looking for an experienced Senior Data Engineer with a strong foundation in Python, SQL, and Spark, and hands-on expertise in AWS and Databricks. In this role, you will build and maintain scalable data pipelines and architecture to support analytics, data science, and business intelligence initiatives. You'll work closely with cross-functional teams to drive data reliability, quality, and performance.

Responsibilities: Design, develop, and optimize scalable data pipelines using Databricks and AWS services such as Glue, S3, Lambda, and EMR, along with Databricks notebooks, workflows, and jobs. Build a data lake in AWS Databricks. Build and maintain robust ETL/ELT workflows using Python and SQL to handle structured and semi-structured data. Develop distributed data processing solutions using Apache Spark or PySpark. Partner with data scientists and analysts to provide high-quality, accessible, and well-structured data. Ensure data quality, governance, security, and compliance across pipelines and data stores. Monitor, troubleshoot, and improve the performance of data systems and pipelines. Participate in code reviews and help establish engineering best practices. Mentor junior data engineers and support their technical development.

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of hands-on experience in data engineering, with at least 2 years working with AWS Databricks. Strong programming skills in Python for data processing and automation. Advanced proficiency in SQL for querying and transforming large datasets. Deep experience with Apache Spark/PySpark in a distributed computing environment. Solid understanding of data modelling, warehousing, and performance optimization techniques. Proficiency with AWS services such as Glue, S3, Lambda, and EMR. Experience with version control (Git or AWS CodeCommit). Experience with a workflow orchestrator such as Airflow or AWS Step Functions is a plus.
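The ETL/ELT responsibilities above come down to a transform step that validates and deduplicates incoming records before load. PySpark itself needs a cluster, so here is a minimal plain-Python sketch of that transform logic; the field names (`order_id`, `amount`) are hypothetical, chosen only to illustrate the pattern.

```python
def clean_records(rows):
    """Validate, normalize, and deduplicate raw records before load.

    Mirrors a typical ETL transform step: drop rows missing the key,
    coerce types, and keep only the latest version of each key.
    """
    latest = {}
    for row in rows:
        key = row.get("order_id")
        if key is None:          # reject rows without a primary key
            continue
        try:
            amount = float(row.get("amount", 0))
        except (TypeError, ValueError):
            continue             # reject unparseable amounts
        # later rows win, emulating an upsert on the key
        latest[key] = {"order_id": key, "amount": round(amount, 2)}
    return list(latest.values())

raw = [
    {"order_id": 1, "amount": "10.5"},
    {"order_id": None, "amount": "3"},   # dropped: no key
    {"order_id": 1, "amount": "12.0"},   # overwrites the first row
    {"order_id": 2, "amount": "oops"},   # dropped: bad amount
]
print(clean_records(raw))  # → [{'order_id': 1, 'amount': 12.0}]
```

In Spark the same logic would be a `dropna`/`cast` plus a window-ranked dedupe, but the quality-gate idea is identical.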

Posted 1 week ago

Apply

6.0 - 9.0 years

22 - 27 Lacs

Hyderabad

Work from Office


Job Description: We are seeking a highly skilled and motivated Senior Snowflake Developer to join our growing data engineering team. In this role, you will be responsible for building scalable and secure data pipelines and Snowflake-based architectures that power data analytics across the organization. You'll collaborate with business and technical stakeholders to design robust solutions in an AWS environment and play a key role in driving our data strategy forward.

Responsibilities: Design, develop, and maintain efficient and scalable Snowflake data warehouse solutions on AWS. Build robust ETL/ELT pipelines using SQL, Python, and AWS services (e.g., Glue, Lambda, S3). Collaborate with data analysts, engineers, and business teams to gather requirements and design data models aligned with business needs. Optimize Snowflake performance through best practices in clustering, partitioning, caching, and query tuning. Ensure data quality, accuracy, and completeness across data pipelines and warehouse processes. Maintain documentation and enforce best practices for data architecture, governance, and security. Continuously evaluate tools, technologies, and processes to improve system reliability, scalability, and performance. Ensure compliance with relevant data privacy and security regulations (e.g., GDPR, CCPA).

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 6 years of experience in data engineering, with at least 3 years of hands-on experience with Snowflake. Strong experience working with AWS services such as S3, Glue, Lambda, Redshift, and IAM. Proficient in SQL and Python for data transformation and scripting. Solid understanding of data modeling principles (star/snowflake schema, normalization/denormalization). Experience in performance tuning and Snowflake optimization techniques. Excellent problem-solving skills and ability to work independently or as part of a team.
Strong communication skills, both written and verbal.
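The ELT pipelines described above typically land staged rows into a warehouse table with a MERGE (upsert). Snowflake itself isn't available here, so this is a plain-Python sketch of the MERGE semantics such a pipeline relies on; the table and column names (`customer_id`, `city`) are hypothetical.

```python
def merge_upsert(target, staged, key="customer_id"):
    """Emulate SQL MERGE: update rows matched on `key`, insert the
    rest. `target` and `staged` are lists of dicts, standing in for
    the warehouse table and the staging table."""
    by_key = {row[key]: dict(row) for row in target}
    for row in staged:
        by_key.setdefault(row[key], {})
        by_key[row[key]].update(row)   # matched → update, else insert
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"customer_id": 1, "city": "Pune"},
          {"customer_id": 2, "city": "Kochi"}]
staged = [{"customer_id": 2, "city": "Chennai"},   # matched: update
          {"customer_id": 3, "city": "Mumbai"}]    # unmatched: insert
print(merge_upsert(target, staged))
```

In Snowflake SQL this is `MERGE INTO target USING staged ON … WHEN MATCHED THEN UPDATE … WHEN NOT MATCHED THEN INSERT …`; clustering the target on the join key is one of the tuning levers the posting mentions.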

Posted 1 week ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office


Key Responsibilities:

ETL Development and Maintenance: Design, develop, and implement ETL processes using SSIS to support data integration and warehousing requirements. Maintain and enhance existing ETL workflows to ensure data accuracy and integrity. Collaborate with data analysts, data architects, and other stakeholders to understand data requirements and translate them into technical specifications. Extract, transform, and load data from various source systems into the data warehouse. Perform data profiling, validation, and cleansing to ensure high data quality. Monitor ETL processes to ensure timely and accurate data loads. Write and optimize complex SQL queries to extract and manipulate data. Work with SQL Server to manage database objects, indexes, and performance tuning. Ensure data security and compliance with industry standards and regulations.

Business Intelligence and Reporting: Develop and maintain interactive dashboards and reports using Power BI or SSRS. Collaborate with business users to gather requirements and create visualizations that provide actionable insights. Integrate Power BI with other data sources and platforms for comprehensive reporting.

Scripting and Automation: Utilize Python for data manipulation, automation, and integration tasks. Develop scripts to automate repetitive tasks and improve efficiency.

Insurance Domain Expertise: Leverage knowledge of insurance industry processes and terminology to effectively manage and interpret insurance data. Work closely with business users and stakeholders within the insurance domain to understand their data needs and provide solutions.

Required Skills and Qualifications:

Technical Skills: Proficient in SQL and experienced with SQL Server. Strong experience with SSIS for ETL development and data integration. Proficiency in Python for data manipulation and scripting. Experience with Power BI/SSRS for developing interactive dashboards and reports. Knowledge of data warehousing concepts and best practices.

Domain Knowledge: Solid understanding of insurance industry processes, terminology, and data structures. Experience working with insurance-related data, such as policies, claims, underwriting, and actuarial data.

Additional Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities.
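The data profiling and validation step described above can be sketched in a few lines of Python. The claim fields below (`claim_id`, `paid_amount`) are hypothetical, chosen only to illustrate null-rate and range checks of the kind run before an SSIS load.

```python
def profile(rows, required=("claim_id",), numeric=("paid_amount",)):
    """Profile records as a pre-load quality gate: count rows with
    missing required fields and out-of-range numeric values."""
    issues = {"missing_key": 0, "negative_amount": 0}
    for row in rows:
        if any(row.get(f) is None for f in required):
            issues["missing_key"] += 1
        for f in numeric:
            v = row.get(f)
            if isinstance(v, (int, float)) and v < 0:
                issues["negative_amount"] += 1
    return issues

claims = [
    {"claim_id": "C1", "paid_amount": 1200.0},
    {"claim_id": None, "paid_amount": 80.0},   # missing key
    {"claim_id": "C3", "paid_amount": -5.0},   # negative payout
]
print(profile(claims))  # → {'missing_key': 1, 'negative_amount': 1}
```

A real pipeline would route the flagged rows to a reject table rather than just counting them, but the gate itself looks like this.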

Posted 1 week ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Pune

Work from Office


It's fun to work at a company where people truly believe in what they are doing!

Job Summary: The Operations Analyst role provides technical support for the full lifecycle of the electronic discovery reference model (EDRM), including ingestion of data, quality control, document production, and document review projects. The position requires attention to detail, multi-tasking, and analytical skills, as well as someone who works well in a team. The candidate must be able to work under the pressure of strict deadlines on multiple projects in a fast-paced environment.

Essential Job Responsibilities: Utilize proprietary and third-party eDiscovery software applications for electronic discovery and data recovery processes. Load, process, and search client data in many different file formats. Conduct relevant searches of electronic data using proprietary tools. Work closely with team members to troubleshoot data issues (prior to escalation to operations senior management and/or IT/Development), research software and/or techniques to solve problems, and carry out complex data analysis tasks. Provide end-user and technical documentation and training for supported applications. Communicate and collaborate with other company departments. Generate reports from various database platforms for senior management. Generate written status reports for clients, managers, and project managers. Work closely with internal departments to streamline processes and develop proprietary tools.

Qualifications & Certifications: A solid understanding of Windows and all MS Office applications is required. Basic UNIX skills and an understanding of hardware, networking, and delimited files would be an advantage. Experience with database applications and knowledge of litigation support software is desirable. Strong analytical and problem-solving skills are essential for this role. Demonstrated ability to work in a team environment, follow detailed instructions, and meet established deadlines. A self-starter with the ability to visualize data and software behavior and coordinate the two. Fluency in English (verbal and written) is required. Bachelor's degree or final-year student, preferably in a computer/technical or legal field, or an equivalent combination of education and/or experience, required.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
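Much of the day-to-day work described above amounts to loading delimited client data and running searches over it. A tiny dependency-free sketch, assuming a hypothetical pipe-delimited load file (real eDiscovery load files are far larger and often use formats such as Concordance DAT):

```python
import csv
import io

# Hypothetical pipe-delimited load file with one row per document.
raw = io.StringIO("doc_id|custodian|text\n"
                  "D001|smith|quarterly forecast attached\n"
                  "D002|jones|lunch on friday\n"
                  "D003|smith|revised forecast numbers\n")

# csv handles arbitrary single-character delimiters, not just commas.
rows = list(csv.DictReader(raw, delimiter="|"))

def search(rows, term):
    """Return doc_ids whose text contains the (lowercase) term."""
    return [r["doc_id"] for r in rows if term in r["text"].lower()]

print(search(rows, "forecast"))  # → ['D001', 'D003']
```

Production tools index the text rather than scanning it per query, but the load-then-search shape is the same.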

Posted 1 week ago

Apply

3.0 - 7.0 years

15 - 17 Lacs

Hyderabad, Bengaluru

Work from Office


Job Summary: We are looking for a detail-oriented and analytical BI Engineer to join our data team. The ideal candidate will have a strong background in SQL, data visualization tools such as Looker, Superset, Tableau, and MicroStrategy, and experience working with cloud platforms such as GCP and AWS. You will be responsible for transforming raw data into actionable insights and building scalable BI solutions to support data-driven decision-making across the organization.

Key Responsibilities: Design, develop, and maintain interactive dashboards and reports using Looker, Superset, Tableau, and MicroStrategy. Write complex and optimized SQL queries to extract, transform, and analyze data from various sources. Collaborate with stakeholders to gather requirements and translate business needs into technical specifications. Build and maintain data models and semantic layers to support self-service BI. Ensure data accuracy, consistency, and performance across all BI platforms. Work with data engineers and analysts to integrate data from multiple sources into cloud data warehouses (e.g., BigQuery, Redshift, Snowflake). Implement best practices for data visualization, dashboard design, and user experience. Monitor BI tools and troubleshoot issues related to data quality, performance, and access.

Required Skills: Proficiency in SQL and experience with large-scale data sets. Hands-on experience with at least two of the following BI tools: Looker, Superset, Tableau, MicroStrategy (MicroStrategy is a must). Strong understanding of data modeling, ETL processes, and data warehousing. Experience working with cloud platforms, especially Google Cloud Platform (GCP) and Amazon Web Services (AWS). Familiarity with version control systems (e.g., Git) and agile development practices. Excellent problem-solving skills and attention to detail.

Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience with scripting languages like Python or R for data analysis. Knowledge of data governance, security, and compliance in BI environments. Exposure to CI/CD pipelines and DevOps practices for BI deployments.

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TransUnion (TU) office location for a minimum of two days a week. TransUnion job title: Developer, Applications Development.

Posted 1 week ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Job Description: We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, and Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization.

Key Responsibilities: Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, and EMR, and Databricks notebooks, jobs, and workflows. Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability. Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data. Implement distributed data processing solutions using Apache Spark/PySpark for large-scale data transformation. Collaborate with cross-functional teams including data scientists, analysts, and product managers to ensure data is accurate, accessible, and well-structured. Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem. Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure. Conduct code reviews, define coding standards, and promote engineering excellence across the team. Mentor and guide junior data engineers, fostering a culture of technical growth and innovation.

Qualifications: 8+ years of experience in data engineering with proven leadership in managing data projects and teams. Expertise in Python, SQL, and Spark (PySpark), and experience with AWS and Databricks in production environments. Strong understanding of modern data architecture, distributed systems, and cloud-native solutions. Excellent problem-solving, communication, and collaboration skills. Prior experience mentoring team members and contributing to strategic technical decisions is highly desirable.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Kochi

Work from Office


As a Senior MDM Developer, you will play a critical role in designing, developing, and optimizing Master Data Management (MDM) solutions. You will work closely with business and technical teams to ensure data integrity, efficient integration, and compliance with enterprise standards. Your expertise in MDM platforms, data modeling, and integration technologies will be key to delivering high-quality solutions.

Key Responsibilities: Design, develop, and implement MDM solutions based on business requirements. Ensure data quality, consistency, and governance across multiple domains. Collaborate with architects and business analysts to define MDM strategies and best practices. Develop integrations between MDM platforms and enterprise applications using APIs and ETL tools. Optimize data models, workflows, and MDM performance for scalability and efficiency. Troubleshoot and resolve data-related issues, ensuring system reliability and integrity. Stay updated on emerging MDM technologies and trends to enhance technical capabilities.

Required Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience in MDM development and implementation. Hands-on experience with platforms such as Reltio, Informatica, Databricks, Azure, Oracle, and Snowflake. Strong expertise in data integration, ETL processes, and API development. Solid understanding of data governance, quality management, and compliance standards. Experience working with multiple data sources, country-specific data models, and life sciences MDM implementations. Excellent problem-solving skills and the ability to work in a fast-paced environment.

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide.
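Master data work of the kind described above usually hinges on matching duplicate records across sources and merging them into a "golden" record. A minimal sketch of deterministic match-and-survivorship logic, with hypothetical fields (`email` as the match key, first non-empty value wins):

```python
def match_and_merge(records):
    """Group records by a normalized match key (email) and merge
    each group with a simple survivorship rule: the first non-empty
    value per field survives into the golden record."""
    golden = {}
    for rec in records:
        key = rec.get("email", "").strip().lower()
        if not key:
            continue  # unmatched records would go to manual review
        merged = golden.setdefault(key, {})
        for field, value in rec.items():
            if value and not merged.get(field):
                merged[field] = value
    return golden

records = [
    {"email": "A@x.com", "name": "Dr. A", "phone": ""},
    {"email": "a@x.com", "name": "", "phone": "123"},  # same entity
    {"email": "b@x.com", "name": "B", "phone": ""},
]
result = match_and_merge(records)
print(len(result))  # → 2 golden records
```

Commercial MDM platforms layer probabilistic/fuzzy matching and configurable survivorship on top, but this is the core merge step.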

Posted 1 week ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office


Power BI Development (mandatory requirements): Design and develop interactive dashboards, KPIs, and reports using Power BI. Create data visualizations and reports based on business requirements. Optimize Power BI datasets and reports for performance. Publish reports to Power BI Service and manage user access and permissions. Collaborate with business stakeholders to gather requirements and translate them into effective dashboards.

Data Engineering: Develop and maintain robust ETL/ELT pipelines using tools like ADF, SSIS, or custom scripts (nice to have). Work with large datasets from multiple sources (SQL, APIs, cloud storage, etc.) (must have). Create and maintain data models and data warehouses (Azure Synapse, Snowflake, etc.) (nice to have). Implement data quality checks, logging, and performance monitoring (must have). Collaborate with data architects and analysts to define data standards and governance (must have).

Required Skills & Experience (must have): 4-5 years of professional experience in data engineering and Power BI development. Strong experience with SQL and relational databases (e.g., SQL Server, PostgreSQL, MySQL). Proficient in Power BI (DAX, Power Query, data modeling). Hands-on experience with Azure Data Factory, Synapse Analytics, or other cloud-based data tools. Knowledge of Python or other scripting languages for data processing (optional but preferred). Strong understanding of data warehousing concepts and dimensional modelling. Excellent problem-solving, communication, and documentation skills. Strong business analytics skills.
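The dimensional modelling the posting asks for means splitting flat source rows into dimension and fact tables before a BI tool like Power BI consumes them. A dependency-free sketch with hypothetical columns (`product`, `amount`):

```python
def to_star_schema(flat_rows):
    """Split flat sales rows into a product dimension and a fact
    table keyed by surrogate IDs — the basic star-schema step a
    warehouse load performs ahead of BI modelling."""
    dim_product = {}   # product name -> surrogate key
    facts = []
    for row in flat_rows:
        name = row["product"]
        if name not in dim_product:
            dim_product[name] = len(dim_product) + 1  # new surrogate
        facts.append({"product_key": dim_product[name],
                      "amount": row["amount"]})
    return dim_product, facts

flat = [{"product": "widget", "amount": 10},
        {"product": "gadget", "amount": 7},
        {"product": "widget", "amount": 3}]
dim, facts = to_star_schema(flat)
print(dim)    # → {'widget': 1, 'gadget': 2}
print(facts)
```

In Power BI, the `dim_product`-to-`facts` relationship on `product_key` is exactly the one-to-many relationship the data model would declare.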

Posted 1 week ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Pune, Chennai

Work from Office


Specialist, Data Management Engineer: We're seeking a future team member for the role of Specialist, Data Management Engineer to join our Data Management (Insight Investment) team. This role is located in Pune, MH (hybrid).

In this role, you'll make an impact in the following ways: Databricks/Snowflake experience, specifically using Unity Catalog and onboarding data. Experience organizing data products using a data-marketplace approach to describe and promote content. Familiarity with data management maturity frameworks such as EDM Council DCAM. Ability to act as a translator between business users (non-technical) and the data engineering/tech communities when using Databricks and Collibra. Ability to present data in high-impact interactive dashboards. Support adoption of a data and data-quality culture, using Databricks to embed best practices.

Asset Valuations is responsible for valuing OTC and secured finance assets. It ensures timely and accurate valuations; the team supports various stakeholders while maintaining strong controls and meeting regulatory standards. Third-party market data engagement management: working closely with vendor management to oversee the products and commercial relationships used to acquire market data. Data stewardship: implementing the framework for organizing processes, controls, and responsibilities for managing and governing data. Data quality management: responsible for the monitoring, remediation, and oversight of data quality exceptions supporting core data domains, including Client, Portfolio, Instruments, Positions, and Analytics. Data stewardship experience, e.g. maintaining data catalogues, classifications, data lineage, and logical (business-friendly) data modelling. Implement a collaborative data platform for a more streamlined way of moving, transforming, and analyzing data. Build and maintain relationships with key stakeholders.

To be successful in this role, we're seeking the following: B.Tech/BE/BS degree (stats, math, and engineering degrees are a plus). 5 to 10 years of experience working in data quality and data management. 3+ years of experience with Databricks, Snowflake, or Dataiku. Excellent interpersonal and client-facing skills. 2+ years of experience with SQL. Experience in the financial industry is preferred. Good knowledge of Excel.

America's Most Innovative Companies, Fortune, 2024. World's Most Admired Companies, Fortune, 2024. Human Rights Campaign Foundation Corporate Equality Index, 100% score, 2023-2024. Best Places to Work for Disability Inclusion, Disability:IN, 100% score, 2023-2024. Most Just Companies, Just Capital and CNBC, 2024. Dow Jones Sustainability Indices, top-performing company for sustainability, 2024. Bloomberg Gender-Equality Index (GEI), 2023.

Posted 1 week ago

Apply

4.0 - 9.0 years

14 - 18 Lacs

Bengaluru

Work from Office


Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate. Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination or harassment based on the above considerations.

Summary: Looking for an SAP Data Migration Consultant with extensive experience working with SAP Syniti to join a large global brand's S/4HANA programme. Proven experience working with data migration on S/4HANA. Extensive hands-on experience working with Syniti. Experience with Syniti ADMM, SDR, and Syniti DQ. End-to-end data migration experience: DMC, LSMW, BAPI, IDocs. Ability to lead cross-functional teams. Experience with deduplication, data quality, data reconciliation, and pre-load and post-load reports. Experience in managing SAP S/4HANA data migration, implementing an SAP ERP solution, and working with various SAP modules (FICO, MM, PP, PS, SD, SC, and EWM). Understanding of the SAP database schema and data loading concepts. Ability to design and implement an SAP technical solution and a data solution. Debugging and LTMOM code development experience. Strong knowledge of MS SQL Server programming, ETL tools (Syniti/BackOffice), and Master Data Management.

Responsibilities: Contribute to cross-functional teams to deliver large-scale SAP implementations. Collaborate with stakeholders to gather requirements and define project scope. Maintain high-quality standards throughout the project lifecycle. Analyze and resolve complex technical issues related to SAP S/4HANA and Syniti ADMM.

Mandatory skill sets: Minimum 4 years of experience working with Syniti ADMM and BODS. Strong oral and written communication skills. Flexibility to travel on-site for client meetings and project implementations. Preferred skill sets: Minimum 4 years of experience working with Syniti ADMM and BODS. Strong oral and written communication skills. Flexibility to travel on-site for client meetings and project implementations.

Years of experience required: 4 to 8 years. Education qualification: Graduate Engineer or Management Graduate. Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, Master of Engineering. Required Skills: SAP Advanced Data Migration and Management (ADMM). Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}.

Posted 1 week ago

Apply

4.0 - 9.0 years

14 - 18 Lacs

Bengaluru

Work from Office


Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate & Summary . In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decisionmaking for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC , you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purposeled and valuesdriven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC , we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. & Summary Looking for an SAP Data Migration Consultant with extensive experience working with SAP Syniti to join a large global brands S/4 HANA program. s Roles & Responsibilities Proven experience working with Data Migration on S/4 HANA. Extensive hands on experience working with Syniti. Experience with Syniti ADMM, SDR, and Syniti DQ. Endtoend Data Migration experience DMC, LSMW, BAPI, IDOCs. Ability to lead crossfunctional teams. 
Experience with deduplication, data quality, data reconciliation, and pre-load and post-load reports. Experience in managing SAP S/4 HANA data migration, implementing an SAP ERP solution, and working with various SAP modules (FICO, MM, PP, PS, SD, SC, and EWM). Understanding of SAP database schema and data-loading concepts. Ability to design and implement an SAP technical solution and a data solution. Debugging and LTMOM code development experience. Strong knowledge of MS SQL Server programming, ETL tools (Syniti/BackOffice), and Master Data Management. Responsibilities: Contribute to cross-functional teams to deliver large-scale SAP implementations. Collaborate with stakeholders to gather requirements and define project scope. Maintain high-quality standards throughout the project lifecycle. Analyze and resolve complex technical issues related to SAP S/4 HANA and Syniti ADMM. Mandatory skill sets: Minimum 4 years of experience working with Syniti ADMM and BODS. Strong oral and written communication skills. Flexibility to travel onsite for client meetings and project implementations. Preferred skill sets: Minimum 4 years of experience working with Syniti ADMM and BODS. Strong oral and written communication skills. 
Flexibility to travel onsite for client meetings and project implementations. Years of experience required: 4 to 8 years. Education qualification: Graduate Engineer or Management Graduate. Education Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering. Required Skills: SAP Advanced Data Migration and Management (ADMM), Syniti Data Replication. Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis
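The deduplication and pre-/post-load reconciliation reports this role mentions can be pictured with a small, tool-agnostic sketch. This is plain Python, not Syniti/ADMM-specific, and the object and field names (vendor records keyed on `vendor_id`) are hypothetical examples:

```python
# Minimal sketch of two data-migration checks: key-based deduplication
# and a pre-load vs post-load record reconciliation. Field names are
# illustrative only; real migration projects define these per object.

def deduplicate(records, key_fields):
    """Keep the first record seen for each business key."""
    seen, unique, duplicates = set(), [], []
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key in seen:
            duplicates.append(rec)
        else:
            seen.add(key)
            unique.append(rec)
    return unique, duplicates

def reconcile(source_records, loaded_records, key_fields):
    """Report how many source keys are missing after the load."""
    loaded_keys = {tuple(r[f] for f in key_fields) for r in loaded_records}
    missing = [r for r in source_records
               if tuple(r[f] for f in key_fields) not in loaded_keys]
    return {"source": len(source_records),
            "loaded": len(loaded_records),
            "missing": len(missing)}

vendors = [
    {"vendor_id": "V001", "name": "Acme"},
    {"vendor_id": "V002", "name": "Globex"},
    {"vendor_id": "V001", "name": "Acme GmbH"},  # duplicate business key
]
unique, dupes = deduplicate(vendors, ["vendor_id"])
print(len(unique), len(dupes))                     # 2 1
print(reconcile(unique, unique[:1], ["vendor_id"]))  # missing: 1
```

In a real migration the same comparison would typically be expressed as SQL over staging and target tables; the sketch only shows the reconciliation idea.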

Posted 1 week ago

Apply

4.0 - 8.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Design and implement highly available and scalable Azure cloud data solutions that meet business requirements. Collaborate with cross-functional teams to design and implement solutions that meet business requirements. Participate in the testing process through test review and analysis. Develop and maintain data lake and data warehouse schematics, layouts, architectures and relational/non-relational databases for data access and advanced analytics. Required Qualifications: Strong experience with Azure cloud technologies such as Azure Databricks, Lakehouse architecture, Data Lake, Cosmos DB, Event Hubs, Service Bus topics, Blob Storage, Azure Functions, AKS, and Key Vault. Experience with data transformation and manipulation using Azure Databricks or similar tools. Solid scripting experience with Python and PySpark. Experience with streaming data platforms such as Kafka. Good working knowledge of Azure infrastructure. Experience with data integrations via APIs. Experience working in an Agile development environment. 
Working knowledge of CI/CD tools and concepts: Azure Pipelines, GitHub Actions workflows. Architect data ingestion solutions such as ETL and ELT on a wide set of data storage solutions. Independent and able to manage and prioritize workload. Ability to guide or lead junior resources to get desired project results. Advanced troubleshooting skills to drive to root cause. Ability to manage ambiguity and solve undefined problems. Mandatory skill sets: Azure Data Services, including Azure Data Factory, Synapse Analytics, and Databricks. Proficiency in data transformation and ETL processes. Hands-on experience with Oracle to Azure Data Lake migrations. Preferred skill sets: Azure Solutions Architect Expert certification. Databricks certification. Years of experience required: 4 to 8 years. Education qualification: Bachelor's degree in Computer Science, Applied Mathematics, Data Science, or Machine Learning. Education Degrees/Field of Study required: Bachelor of Engineering. Required Skills: Microsoft Azure. Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis

Posted 1 week ago

Apply

5.0 - 6.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

We are looking to hire Microsoft Power BI professionals in the following areas: Senior Power BI Developer. JD as below (Power BI, SQL, T-SQL, any ETL tool): 5-6 years of IT experience in analysis, design, and development of Power BI dashboards and report publishing. Develop and maintain complex Power BI reports and dashboards. Create visualizations using various tools like tables, charts, maps, etc. Design and implement ETL processes using SSIS to extract data from multiple sources and load it into SQL Server databases. Write T-SQL code for stored procedures, functions, views, and triggers to support business intelligence requirements. Troubleshoot issues related to Power BI report performance, data quality, and security. Optimize Power BI reports for performance and scalability. Ensure proper configuration management and change controls. Write and optimize SQL queries for efficient data retrieval and analysis. Deliver robust solutions through query optimization, ensuring data quality. Should have experience in writing functions and stored procedures. Strong understanding of data warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling. Analyse and translate functional specifications/user stories into technical specifications. Good to have experience in design/development in any ETL tool. Good interpersonal skills; experience in handling communication and interactions between different teams. Good to have: Azure or AWS (any cloud exposure), ETL tool (any). Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and ethical corporate culture.
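The kind of reporting query this role describes can be sketched minimally with Python's built-in sqlite3 module. The actual role targets SQL Server, where this logic would typically live in a T-SQL stored procedure; the star-schema tables and column names here are hypothetical:

```python
import sqlite3

# Tiny star-schema fragment: one fact table and one dimension table,
# queried with the kind of aggregate join a BI dataset sits on top of.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Parameterized aggregate query: total sales per product above a threshold.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    HAVING SUM(f.amount) >= ?
    ORDER BY total DESC
""", (75.0,)).fetchall()

print(rows)  # [('Widget', 150.0), ('Gadget', 75.0)]
```

Parameterized queries like the `?` placeholder above are also the first step toward the query-optimization and security concerns the posting lists.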

Posted 1 week ago

Apply

1.0 - 6.0 years

5 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Are you eager to make a huge impact in a program which will help Amazon's Sellers grow? Are you ready to set up best-in-class Seller operations, define processes to drive Seller satisfaction, and work with internal teams to improve their business with Amazon? Amazon's Retail Business Services (RBS) team is looking for a dynamic and talented candidate to achieve business/operations goals for our Sellers. You will be responsible for root-causing data quality issues, researching Seller chargeback escalations, identifying selection gaps and helping expand selection from the Sellers, and ultimately developing business relationships with Sellers. This position offers an introduction to our online retail business and a broad training ground for future success. You should be an effective listener, communicator and problem-solver, able to balance the needs and requirements of both Amazon.com and strategic Sellers. You must be able to effectively drive operational metrics and exceed ambitious business goals by engaging with internal business and operations partners. Minimum 1 year of experience in managing small/medium-scale projects independently. Proven skill in identifying and fixing process gaps and improvement opportunities, and in using small-scale automation and technology to increase productivity or drive process simplification. Experience in providing support for data collection, preparing reports, and exercising push-back and realignment of expectations with multiple stakeholders. Work with the Sellers/internal teams to improve selection, identify and fix catalog defects, analyze profitability metrics, and help their business grow. Implement and track metrics for recording the success and quality of their products. Willingness to work in flexible shifts (including night shifts), weekends, and Indian holidays. A day in the life: Partnering with internal teams to manage the seller relationship by championing the seller's needs at Amazon. 
Build communication channels at all levels, set proper expectations, provide clear status communications, and manage towards a growth plan for the sellers. Build and execute a strategic account plan that delivers on key business opportunities and relevant KPIs for the sellers and Amazon. Work with internal Amazon teams and the seller to improve operational aspects of their business to provide a great consumer experience. Conduct deep analysis of seller issues and develop data-based recommendations and action plans to improve the seller experience. Bachelor's degree. Speak, write, and read fluently in English. Experience with Microsoft Office products and applications. Experience with Excel. Ability to drive process or procedure improvements.

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Noida

Work from Office

Naukri logo

Join us as Assistant Vice President – Data Analyst for the Financial Crime Operations Data Domain to implement data quality processes and procedures, ensuring that data is reliable and trustworthy, then extract actionable insights from it to help the organisation improve its operations and optimise resources. Accountabilities: Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification. Execution of data cleansing and transformation tasks to prepare data for analysis. Designing and building data pipelines to automate data movement and processing. Development and application of advanced analytical techniques, including machine learning and AI, to solve complex business problems. Documentation of data quality findings and recommendations for improvement. To be successful in this role, you should have: Experience in Data Management and Data Governance, including records management. Ability to review business processes through a data lens and identify critical upstream and downstream components, especially in a financial services organisation – understanding of models, EUDAs, etc. Strong understanding of Data Governance, Data Quality & Controls, Data Lineage, and Reference Data/Metadata Management, including relevant policies and frameworks. A clear understanding of the elements of an effective control environment, enterprise risk management framework, operational risk, or other principal risk frameworks. Experience of managing stakeholders directly and indirectly, across geographies and cultures. Strong understanding and practical exposure to the application of BCBS 239 principles and related frameworks. Commercially astute; demonstrates a consultative yet pragmatic approach with integrity to solving issues, focusing on areas of significance and value to the business. 
A strong understanding of the risk and control environment, control frameworks, and operational risk, including understanding of second- and third-line functions and impact across people, process, and technology. Strategic Leadership: Provide strategic direction and leadership for data analysis initiatives, ensuring alignment with organizational and program goals. Functional understanding of financial crime and fraud data domains would be preferred. Data Governance: Oversee data governance policies and procedures to ensure data integrity, security, and compliance with regulatory requirements. Stakeholder Collaboration: Collaborate with cross-functional teams to identify data needs and deliver actionable insights. Advanced Analytics: Utilize advanced analytical techniques and tools to extract meaningful insights from complex data sets and drive data-driven decision-making. Deliver best-in-class insights to enable stakeholders to make informed business decisions and support data quality issue remediation. Perform robust review and QA of key deliverables being sent out by the team to stakeholders. Demonstrate a collaborative communication style, promoting trust and respect with a range of stakeholders including Operational Risk/Chief Controls Office/Chief Data Office/Financial Crime Operations subject matter experts (SMEs), Risk Information Services, and Technology. Some other desired skills include: Graduate in any discipline. Effective communication and presentation skills. Experience in Data Management/Data Governance/Data Quality Controls, Governance, Reporting, and Risk Management, preferably in a financial services organisation. Experience in Data Analytics and Insights (using the latest tools and techniques, e.g. Python, Tableau, Tableau Prep, Power Apps, Alteryx), and analytics on structured and unstructured data. Experience with databases and data science/analytics tools and techniques like SQL, AI, and ML (on live projects, not just academic projects). 
Proficient in MS Office – PPT, Excel, Word & Visio. Comprehensive understanding of risk, governance, and control frameworks and processes. Location: Noida. Purpose of the role: To implement data quality processes and procedures, ensuring that data is reliable and trustworthy, then extract actionable insights from it to help the organisation improve its operations and optimise resources. Accountabilities: Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification. Execution of data cleansing and transformation tasks to prepare data for analysis. Designing and building data pipelines to automate data movement and processing. Development and application of advanced analytical techniques, including machine learning and AI, to solve complex business problems. Documentation of data quality findings and recommendations for improvement. Assistant Vice President expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. 
OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
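The "data cleansing and transformation" accountability this role lists can be pictured with a small sketch in plain Python. The field names and date conventions below are hypothetical examples, not tied to any specific bank system:

```python
from datetime import datetime

# Illustrative cleansing pass: trim whitespace, normalise name casing,
# and standardise mixed date formats to ISO 8601 before analysis.
# The accepted input formats are examples only.

DATE_FORMATS = ("%d/%m/%Y", "%Y-%m-%d")

def clean_record(rec):
    out = dict(rec)
    out["customer_name"] = rec["customer_name"].strip().title()
    for fmt in DATE_FORMATS:
        try:
            out["opened"] = datetime.strptime(rec["opened"], fmt).date().isoformat()
            break
        except ValueError:
            continue  # try the next known format
    return out

raw = {"customer_name": "  jane DOE ", "opened": "03/07/2024"}
print(clean_record(raw))
# {'customer_name': 'Jane Doe', 'opened': '2024-07-03'}
```

In practice such rules would be catalogued as data quality controls with documented lineage, in line with the BCBS 239 emphasis above, rather than living only in ad hoc scripts.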

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Chennai

Work from Office

Naukri logo

Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves. The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles and data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures. This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States. Responsibilities: Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage). Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security. Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g. Domo, Looker, Databricks). 
Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards. Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability. Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development. Requirements: Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g. Mathematics, Statistics, Engineering). 3+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g. Professional Cloud Developer, Professional Cloud Database Engineer). Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows. Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion. Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
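The "best practices for data quality" responsibility this posting lists can be sketched independently of BigQuery or Dataflow specifics: an ingestion step might validate records against named rules and route failures aside for review. The rule names and fields below are hypothetical:

```python
# Illustrative validation gate for a data-lake ingestion step: records
# failing any rule go to a rejects list for later review rather than
# being silently dropped. Rules and field names are invented for the sketch.

RULES = {
    "call_id_present": lambda r: bool(r.get("call_id")),
    "duration_non_negative": lambda r: r.get("duration_sec", -1) >= 0,
}

def validate(records):
    accepted, rejected = [], []
    for rec in records:
        failures = [name for name, rule in RULES.items() if not rule(rec)]
        (rejected if failures else accepted).append((rec, failures))
    return accepted, rejected

batch = [
    {"call_id": "c1", "duration_sec": 42},
    {"call_id": "", "duration_sec": 10},    # fails presence rule
    {"call_id": "c3", "duration_sec": -5},  # fails range rule
]
ok, bad = validate(batch)
print(len(ok), len(bad))  # 1 2
```

Keeping the reject reasons alongside each record is what makes the downstream data-quality reporting the role describes possible.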

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Chennai

Work from Office

Naukri logo

Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves. The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles and data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures. This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States. Responsibilities: Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage). Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security. Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g. Domo, Looker, Databricks). 
Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards. Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability. Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development. Requirements: Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g. Mathematics, Statistics, Engineering). 4+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g. Professional Cloud Developer, Professional Cloud Database Engineer). Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows. Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion. Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.

Posted 1 week ago

Apply

8.0 - 12.0 years

12 - 16 Lacs

Mumbai

Work from Office

Naukri logo

Operational Risk Data Management. Job Summary/Objective: Act as a strategic advisor and engagement lead, providing executive oversight and direction for the client's OCC-driven data remediation initiatives. Ensure alignment of data management, governance, and quality improvement strategies with regulatory requirements and business objectives. Key Responsibilities/Duties: Define and communicate the strategic vision for data governance remediation to client executives. Guide the client in modernizing data architecture, risk aggregation, and regulatory reporting processes. Advise on development and enforcement of enterprise-wide data policies, standards, and controls. Support executive and Board-level reporting and engagement with the OCC or other regulators. Lead efforts to foster a culture of data accountability and continuous improvement within the client organization. Required Skill Sets & Requirements: Enterprise Data Analysis and Management: Extensive experience designing and implementing data analysis and management programs in large financial institutions. Strong understanding of data quality metrics, master data management, and metadata management. Regulatory & Risk Management: Experience in operational risk domains including but not limited to data risk, fraud risk, tech risk, cyber risk, operational resiliency risk, third-party risk, processing risk, services and enterprise operations risk, regulatory management reporting, and financial statement reporting risk. Responsibilities include requirements gathering, data acquisition, data quality assessment, and building risk monitoring tools. Deep knowledge of regulatory frameworks (BCBS 239) and experience supporting regulatory remediation. Technical & Analytical: Programming proficiency in Python, SQL, and reporting tools like Tableau, Power BI, and Jira. Experience guiding IT modernization, system integration, and process optimization. Advanced problem-solving, decision-making, and client advisory skills. 
Communication & Board Reporting: Excellent communication, negotiation, and presentation skills, with demonstrated experience in Board-level engagement. Qualifications: Master's or advanced degree preferred. 6+ years' experience in consulting or executive roles in financial services. Professional certifications (CDMP, PMP) highly desirable. ORM Level 1 support experience required. Indian passport with 1 year validity mandatory.

Posted 1 week ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Chennai

Work from Office

Naukri logo

Title: Logistics Analyst - Process Controlled Transportation. GCL: C2. Introduction to role: Are you ready to make a significant impact in the world of pharmaceuticals? As a Logistics Analyst specializing in Process Controlled Transportation, you'll be part of a dynamic team responsible for establishing and maintaining global standards for the transportation and storage of AstraZeneca's pharmaceutical products. Your role will involve coordinating complaints management, compliance monitoring, and risk management, ensuring the smooth operation of our global network. With a focus on GDP-defined temperature requirements, you'll play a crucial role in delivering life-changing medicines to patients worldwide. Accountabilities: Dive into three key specialisms: Source Data Analyst, Integration Data Analyst, and Data Steward. Process and investigate supply and logistics (S&L) and temperature excursion (TE) complaints. Collaborate with supply sites and Logistics Service Providers (LSPs) to identify root causes and implement corrective actions. Manage workload to ensure timely resolution of complaints. Prepare performance reports against key performance indicators, including resolution time and closure rate. Present complaints trending reports to the PCT team for continuous process improvement. Provide compliance monitoring support to regional and global logistics leaders. Essential Skills/Experience: Education, Qualifications, Skills and Experience: Essential: Undergraduate degree in Computer Science, Data Management or related discipline; proven experience in data analysis and information management; domain data understanding; business process knowledge in data generation and consumption. 
Skills and Capabilities: Essential: Blend of data requirement analysis, data quality analysis, and data stewardship skills; experience in translating requirements into data models; knowledge of AZ policies for data privacy and security; excellent communication skills; experience in agile multi-location teams; risk-based methodology application; metadata cataloguing tools experience; data analysis enabling toolkits. A minimum of years' prior experience in the logistics area. Proven analytic skills. Good problem-solving/investigational skills. Excellent interpersonal and communication skills; a team player with the ability to identify and communicate key issues for resolution. Good understanding of the business context, e.g. logistics, shipping, freight management, Good Distribution Practices, Compliance & Sustainability policies and standards, Safety, Health and Environment, and Standard Operating Procedures. Intermediate to proficient Microsoft Office skills. Logistics Service Providers. Fluent in English. Desirable Skills/Experience: Pharmaceutical/biopharmaceutical experience. Good working knowledge of the pharmaceutical industry. Experience in risk and compliance monitoring. Experience in Business Process Management processes. AstraZeneca offers an environment where innovation thrives! With constant new products and launches, you'll have the opportunity to shape the future of supply chain management. Our resilience drives us forward as we continuously seek new ways to deliver medicines to patients. Here, you'll be encouraged to share ideas and problem-solve as part of a diverse team connected across the globe. We focus on staying ahead in rapidly changing markets, applying digital and Lean processes to accelerate progress. If you're driven, adaptable, and ready to make a big impact, AstraZeneca is the place for you. Ready to take on this exciting challenge? Apply now and be part of our journey to deliver life-changing medicines! 19-Jun-2025 29-Jun-2025

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Job Summary: We are seeking a highly motivated and experienced Data Engineer to design, build, and maintain robust data pipelines. The ideal candidate will have deep expertise in cloud-based data processing technologies using GCP, AWS, or Azure. You will play a critical role in ensuring data reliability, quality, and availability to support our business needs. You will get to work on cutting-edge data infrastructure with real-world impact. Responsibilities: Design, develop, and maintain scalable data pipelines using cloud services (GCP, AWS, or Azure). Implement data ingestion, transformation, and storage solutions. Ensure data quality, integrity, and security across all data systems. Monitor and troubleshoot data pipelines to identify and resolve issues. Ensure 99.999% uptime and performance SLAs of the production environment. Collaborate with data scientists and other stakeholders to understand data requirements. Optimize data pipelines for performance, reliability, and cost-effectiveness. Create and maintain clear documentation of data flows, system architecture, and operational procedures. Qualifications: B.Tech./M.Tech. degree or higher educational qualification. 3+ years of experience in Data Engineering or related roles. Must have worked in production deployments, including managing scalable data pipelines that process high-volume, real-time data. Proficiency with cloud services such as GCP, AWS, or Azure is required. Hands-on experience with the distributed stream processing technology ecosystem, including Apache Kafka, Flink, or Beam. Strong programming skills in languages such as Python, Scala, or NodeJS. Experience with SQL, NoSQL, and especially time-series databases. Experience with parallel programming and distributed data processing. Solid understanding of data warehousing concepts and technologies. Ability to design and implement data architectures. Excellent problem-solving and analytical skills. 
Strong communication and collaboration abilities. Preferred Qualifications: Experience developing data pipelines & processing services, with a deeper understanding of the Quality Controls. Knowledge of IoT data and related technologies. A Google Cloud Certified Professional Data Engineer will be given preference.

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad

Work from Office


WHO WE ARE: As a Technical Lead specializing in job scheduling and automation, you bring extensive expertise in managing software support operations and ensuring seamless cycle management. You are adept at leveraging tools like Broadcom Automic to streamline workflows and optimize processes. With a strong background in Python, API integration, and database management, you excel at resolving complex technical challenges and driving efficiency improvements. Your commitment to providing round-the-clock support underscores your dedication to customer satisfaction and operational excellence.

WHAT YOU'LL DO:
- Lead a team of software support engineers in providing technical assistance for job scheduling tools and cycle management.
- Spearhead troubleshooting efforts to swiftly resolve software issues reported by customers, ensuring minimal disruption to operations.
- Collaborate closely with the development team to address intricate technical problems and implement robust solutions.
- Drive the configuration and optimization of job scheduling workflows, applying your expertise in Broadcom Automic scripting and automation.
- Champion the integration of job scheduling tools with external systems and APIs, enhancing interoperability and functionality.
- Conduct comprehensive system performance analyses and devise strategies for continual improvement.
- Document and share solutions with technical and non-technical stakeholders, fostering transparency and knowledge sharing within the organization.

WHAT YOU'LL NEED:
- Experience with Broadcom Automic scripting and other automation and scheduling tools.
- Experience with ETL/ELT processes.
- Knowledge of Informatica or SSIS for data integration and transformation.
- Familiarity with data warehousing concepts and practices.
- Understanding of data quality and data governance principles.
- Experience in cloud-based environments and technologies.

WHAT'S IN IT FOR YOU?
We're looking for the best and brightest innovators in the industry to join our team. At Zinnia, you'll collaborate with smart, creative professionals dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application in the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability. #LI-SN1

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office


WHO WE ARE: As a Technical Lead specializing in job scheduling and automation, you bring extensive expertise in managing software support operations and ensuring seamless cycle management. You are adept at leveraging tools like Broadcom Automic to streamline workflows and optimize processes. With a strong background in Python, API integration, and database management, you excel at resolving complex technical challenges and driving efficiency improvements. Your commitment to providing round-the-clock support underscores your dedication to customer satisfaction and operational excellence.

WHAT YOU'LL DO:
- Lead a team of software support engineers in providing technical assistance for job scheduling tools and cycle management.
- Spearhead troubleshooting efforts to swiftly resolve software issues reported by customers, ensuring minimal disruption to operations.
- Collaborate closely with the development team to address intricate technical problems and implement robust solutions.
- Drive the configuration and optimization of job scheduling workflows, applying your expertise in Broadcom Automic scripting and automation.
- Champion the integration of job scheduling tools with external systems and APIs, enhancing interoperability and functionality.
- Conduct comprehensive system performance analyses and devise strategies for continual improvement.
- Document and share solutions with technical and non-technical stakeholders, fostering transparency and knowledge sharing within the organization.

WHAT YOU'LL NEED:
- Experience with Broadcom Automic scripting and other automation and scheduling tools.
- Experience with ETL/ELT processes.
- Knowledge of Informatica or SSIS for data integration and transformation.
- Familiarity with data warehousing concepts and practices.
- Understanding of data quality and data governance principles.
- Experience in cloud-based environments and technologies.

WHAT'S IN IT FOR YOU?
We're looking for the best and brightest innovators in the industry to join our team. At Zinnia, you'll collaborate with smart, creative professionals dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application in the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability. #LI-SN1

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Gurugram

Work from Office


WHO WE ARE: As a Technical Lead specializing in job scheduling and automation, you bring extensive expertise in managing software support operations and ensuring seamless cycle management. You are adept at leveraging tools like Broadcom Automic to streamline workflows and optimize processes. With a strong background in Python, API integration, and database management, you excel at resolving complex technical challenges and driving efficiency improvements. Your commitment to providing round-the-clock support underscores your dedication to customer satisfaction and operational excellence.

WHAT YOU'LL DO:
- Lead a team of software support engineers in providing technical assistance for job scheduling tools and cycle management.
- Spearhead troubleshooting efforts to swiftly resolve software issues reported by customers, ensuring minimal disruption to operations.
- Collaborate closely with the development team to address intricate technical problems and implement robust solutions.
- Drive the configuration and optimization of job scheduling workflows, applying your expertise in Broadcom Automic scripting and automation.
- Champion the integration of job scheduling tools with external systems and APIs, enhancing interoperability and functionality.
- Conduct comprehensive system performance analyses and devise strategies for continual improvement.
- Document and share solutions with technical and non-technical stakeholders, fostering transparency and knowledge sharing within the organization.

WHAT YOU'LL NEED:
- Experience with Broadcom Automic scripting and other automation and scheduling tools.
- Experience with ETL/ELT processes.
- Knowledge of Informatica or SSIS for data integration and transformation.
- Familiarity with data warehousing concepts and practices.
- Understanding of data quality and data governance principles.
- Experience in cloud-based environments and technologies.

WHAT'S IN IT FOR YOU?
We're looking for the best and brightest innovators in the industry to join our team. At Zinnia, you'll collaborate with smart, creative professionals dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application in the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability. #LI-SN1

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies