
2452 Data Quality Jobs - Page 37

JobPe aggregates listings for easy access; you apply directly on the original job portal.

5.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Analytics Services
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must-have: proficiency in Microsoft Azure Analytics Services
- Good-to-have: experience with cloud platforms such as AWS or Google Cloud
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity (see the sketch below)

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services
- This position is based at our Bengaluru office
- 15 years of full-time education is required
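As a generic illustration of the data munging skills this posting lists (cleaning, transformation, normalization), not this employer's actual stack, a minimal pandas sketch; the file and column names are invented:

```python
import pandas as pd

# Hypothetical raw extract; file and column names are invented for illustration.
df = pd.read_csv("sensor_readings.csv")

# Cleaning: drop exact duplicates and rows missing the key field.
df = df.drop_duplicates().dropna(subset=["reading_id"])

# Transformation: parse timestamps and standardize a categorical column.
df["recorded_at"] = pd.to_datetime(df["recorded_at"], errors="coerce")
df["status"] = df["status"].str.strip().str.lower()

# Normalization: min-max scale a numeric column to [0, 1].
lo, hi = df["value"].min(), df["value"].max()
df["value_norm"] = (df["value"] - lo) / (hi - lo)
```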

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Apache Spark
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must-have: proficiency in Apache Spark
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity (a PySpark sketch follows below)

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark
- This position is based at our Bengaluru office
- 15 years of full-time education is required
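A hedged sketch of the kind of Spark data-quality check the posting describes, using the PySpark DataFrame API; the input path and key column are assumptions:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical input; path and schema are assumptions for illustration.
orders = spark.read.parquet("/data/orders")

# Basic data-quality profile: row count, nulls per column, duplicate keys.
total = orders.count()
null_counts = orders.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in orders.columns]
)
dupes = orders.groupBy("order_id").count().filter(F.col("count") > 1)

null_counts.show()
print(f"rows={total}, duplicate order_ids={dupes.count()}")
```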

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices.

Roles & Responsibilities:
- Perform independently and grow into an SME
- Participate actively in and contribute to team discussions
- Contribute to providing solutions to work-related problems
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals
- Mentor junior team members, providing guidance and support in their professional development

Professional & Technical Skills:
- Must-have: proficiency in Informatica MDM
- Strong understanding of data integration processes and methodologies
- Experience with data quality management and data governance practices (a profiling sketch follows below)
- Familiarity with database management systems and data modeling techniques
- Ability to troubleshoot and resolve application issues efficiently

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica MDM
- This position is based at our Hyderabad office
- 15 years of full-time education is required
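Informatica MDM is a commercial platform with its own tooling; purely as a generic illustration of the data-profiling concepts named above (completeness, cardinality, duplicate candidates), a minimal pandas sketch with invented names:

```python
import pandas as pd

# Hypothetical customer master extract; names are invented for illustration.
df = pd.read_csv("customers.csv")

# Simple profile per column: completeness and cardinality,
# the kind of metrics an MDM/data-quality tool reports.
profile = pd.DataFrame({
    "non_null_pct": df.notna().mean() * 100,
    "distinct_values": df.nunique(),
})
print(profile)

# Candidate duplicate records on a business key.
print(df[df.duplicated(subset=["email"], keep=False)].sort_values("email"))
```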

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Description:

ACCOUNTABILITIES:
- Designs, codes, tests, debugs and documents software according to Dell's systems quality standards, policies and procedures.
- Analyzes business needs and creates software solutions; responsible for preparing design documentation.
- Prepares test data for unit, string and parallel testing.
- Evaluates and recommends software and hardware solutions to meet user needs.
- Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements.
- Works with business and development teams to clarify requirements to ensure testability.
- Drafts, revises, and maintains test plans, test cases, and automated test scripts; executes test procedures according to software requirements specifications.
- Logs defects and makes recommendations to address defects; retests software corrections to ensure problems are resolved.
- Documents the evolution of testing procedures for future replication; may conduct performance and scalability testing.

RESPONSIBILITIES:
- Leads small to moderate budget projects; may perform in a project leadership role and/or supervise the activities of lower-level personnel.
- Provides resolutions to a diverse range of complex problems.
- Executes schedules, costs and documentation to ensure assigned projects come to a successful conclusion.
- May assist in training, assigning and checking the work of less experienced developers.
- Performs estimation efforts on projects and tracks progress.
- Drafts and revises test plans and scripts with consideration of end-to-end system flows; executes test scripts according to application requirements documentation.
- Logs defects, identifies the course of action and performs preliminary root cause analysis.
- Analyzes and communicates test results to the project team.

Skills: Python, PySpark and SQL
- 5 years of experience in Spark, Scala and PySpark for big data processing
- Proficiency in Python programming for data manipulation and analysis; experience with Python libraries such as Pandas and NumPy
- Knowledge of Spark architecture and components (RDDs, DataFrames, Spark SQL)
- Strong knowledge of SQL for querying databases; experience with database systems like Lakehouse, PostgreSQL, Teradata and SQL Server; ability to write complex SQL queries for data extraction and transformation
- Strong analytical skills to interpret data and provide insights; ability to troubleshoot and resolve data-related issues
- Strong problem-solving skills to address data-related challenges; effective communication skills to collaborate with cross-functional teams

Role/Responsibilities:
- Work on development activities along with lead activities
- Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently
- Collaborate with other teams to understand data requirements and deliver solutions
- Design, develop, and maintain scalable data pipelines using Python and PySpark; utilize PySpark and Spark scripting for data processing and analysis
- Implement ETL (Extract, Transform, Load) processes to ensure data is accurately processed and stored (a sketch follows below)
- Develop and maintain Power BI reports and dashboards
- Optimize data pipelines for performance and reliability
- Integrate data from various sources into centralized data repositories; ensure data quality and consistency across different data sets
- Analyze large data sets to identify trends, patterns, and insights
- Optimize PySpark applications for better performance and scalability
- Continuously improve data processing workflows and infrastructure
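A minimal sketch of the extract-transform-load flow this posting describes, in PySpark; the paths, schema and aggregation are assumptions for illustration, not the employer's actual pipeline:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: hypothetical source path, invented for illustration.
raw = spark.read.option("header", True).csv("/landing/sales.csv")

# Transform: type the columns, filter bad rows, aggregate.
sales = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("sale_date", F.to_date("sale_date"))
       .filter(F.col("amount").isNotNull())
)
daily = sales.groupBy("sale_date").agg(F.sum("amount").alias("total_amount"))

# Load: write to a partitioned, query-friendly format.
daily.write.mode("overwrite").partitionBy("sale_date").parquet("/curated/daily_sales")
```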

Posted 2 weeks ago

Apply

1.0 - 8.0 years

3 - 10 Lacs

Chennai

Work from Office


Job Title: Data Catalogue Analyst
Career Level: C3

Introduction to role: Are you ready to make a significant impact in the world of data management? As a Data Catalogue Analyst, you'll play a crucial role in ensuring that data is findable, accessible, and fit for use across various business units. You'll be responsible for capturing metadata and developing our data catalogue, supporting the Commercial and Enabling Units business areas. This is your chance to contribute to meaningful work that drives excellence and breakthroughs.

Accountabilities:
- Support the Data Catalogue Principal to define Information Asset Registers across business areas to help profile information risk/value
- Participate in projects to mitigate and control identified priority risk areas
- Take responsibility for nominated markets/business areas, develop domain knowledge and leverage internal customer relationships to respond to localised use cases
- Act as point of contact for nominated business areas or markets
- Support initiatives to enhance the reusability and transparency of our data by making it available in our global data catalogue
- Support the capture of user requirements for functionality and usability, and document technical requirements
- Work with IT partners to capture metadata for relevant data sets and lineage, and populate the catalogue
- Work with data stewards and business users to enrich catalogue entries with business data dictionaries, business rules and glossaries
- Complete monitoring controls to assure metadata quality remains at a high level
- Support catalogue principals and data governance leads in tool evaluation and UAT

Essential Skills/Experience:
- Demonstrable experience of working in a data management, data governance or data engineering domain
- Strong business and system analysis skills
- Demonstrable experience with data catalogue, search and automation software (Collibra, Informatica, Talend etc.)
- Ability to interpret and communicate technical information in business language and in alignment with AZ business
- Solid grasp of metadata harvesting methodologies and ability to create business and technical metadata sets (see the sketch below)
- Strong engagement, communication and collaborator management skills, including excellent organizational, presentation and influencing skills
- High level of proficiency with common business applications (Excel, Visio, Word, PowerPoint and SAP business user)

Desirable Skills/Experience:
- Demonstrable experience of working with Commercial or Finance data and systems (Veeva, Reltio, SAP) and consumption
- Domain knowledge of life sciences/pharmaceuticals, manufacturing, corporate finance, or sales & marketing
- Experience with data quality and profiling software
- Experience of working in a complex, diverse global organization

AstraZeneca offers an environment where you can apply your skills to genuinely impact patients' lives. With a focus on innovation and growth, you'll be part of a team that challenges norms and embraces intelligent risks. Our collaborative community thrives on sharing knowledge and celebrating successes together. Here, you'll find opportunities to learn from diverse perspectives, drive change, and contribute to our digital transformation journey. Ready to take the next step in your career? Apply now and become a key player in shaping the future at AstraZeneca!
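Catalogue platforms such as Collibra or Informatica harvest metadata through their own connectors; purely as a generic illustration of what "metadata harvesting" means in practice, a sketch that pulls technical metadata from a relational source with SQLAlchemy (connection string and schema are assumptions):

```python
from sqlalchemy import create_engine, inspect

# Hypothetical connection string; replace with a real source system.
engine = create_engine("postgresql://user:pass@host/warehouse")
insp = inspect(engine)

# Harvest technical metadata: tables and columns with types,
# ready to be loaded into a catalogue entry.
catalog_entries = []
for table in insp.get_table_names(schema="public"):
    for col in insp.get_columns(table, schema="public"):
        catalog_entries.append({
            "table": table,
            "column": col["name"],
            "type": str(col["type"]),
            "nullable": col["nullable"],
        })

print(catalog_entries[:5])
```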

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office


Key responsibilities:
- Deliver quality analytics, from data preparation, data analysis, data exploration, data quality assessment and data manipulation through method selection, design and application, insights generation and visualisation
- Develop and implement basic machine learning models and algorithms under the guidance of senior data scientists to extract insights and solve business problems
- Proactively learn and acquire key analytical, technical and commercial skills and business knowledge to become a proficient analyst working under the supervision of the senior/lead data science analysts. KPIs: timeliness, accuracy, manager and client feedback (internal and external as required)
- Collaborate with internal stakeholders and demonstrate the ability to transform client questions and problems into analytical solutions
- Be an active team member in providing the required support to help the business understand and optimise the use of analytical products and/or solutions
- Build industry knowledge on the advancements in the field of analytics, data science and GenAI
- Comply with the IM Cigna and CHSI policies, procedures and processes, and continuously demonstrate Cigna Data and Analytics culture

Key activities:
- Working in a team to support end-to-end analytical projects
- Liaising with stakeholders to determine objectives/scope of upcoming projects
- Data exploration, cleansing and manipulation
- Determining the appropriate type of analysis and undertaking analysis/modelling
- Extracting insights
- Clear presentation of insights via spreadsheets, PowerPoint presentations and self-service analytical visualisation tools
- Participating in client meetings
- Ongoing stakeholder interaction (internal and external as required) on project progress
- Contributing to the feedback process (between stakeholders and the team) to ensure continuous improvement within the team
- Participating and contributing in learning forums such as the Analytics Community, and sharing knowledge with the wider team

Experience and education required:
- 2-4+ years' experience in a technical analytics environment, carrying out data analytics and data science/AI projects and initiatives
- Tertiary qualifications in engineering, mathematics, actuarial studies, statistics, physics, or a related discipline
- Knowledge of the technical analytics discipline, including data preparation and foundational analytics concepts
- Experience successfully managing both internal and external stakeholders, delivering against projects, tasks and activities in a dynamic, deadline-driven environment
- Commercial acumen to understand business needs and the ability to suggest the commercial impacts of different analytics solutions or approaches
- Coding and modelling experience in SQL/R/Python and/or cloud data platforms, e.g. AWS
- Experience in visualization and data management tools is an added advantage
- Experience in GenAI/LLMs is an added advantage
- Experience working with complex datasets
- Attention to detail and self-driven continuous learning
- Participation in external data hackathons and competitions will be an added advantage

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Nagpur

Remote


Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.; see the sketch below).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
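A hedged sketch of the SQL window-function skill called out above, run through the Snowflake Python connector; the account, credentials, table and column names are all assumptions:

```python
import snowflake.connector

# Hypothetical account, credentials and object names; all are assumptions.
conn = snowflake.connector.connect(
    user="ETL_USER",
    password="...",
    account="myorg-myaccount",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# A window-function pattern common in modeling work:
# keep only the latest record per business key.
query = """
SELECT *
FROM (
    SELECT c.*,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id
               ORDER BY updated_at DESC
           ) AS rn
    FROM raw_customers c
) latest
WHERE rn = 1
"""
for row in conn.cursor().execute(query).fetchmany(5):
    print(row)
conn.close()
```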

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Ahmedabad

Remote


Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

4 - 7 Lacs

Hyderabad

Work from Office


About The Role: We are looking for a highly motivated, results-oriented Order Management Analyst to join our QTC organization. As an Order Management Analyst, you will support the Sales, Revenue Operations and Quote-to-Cash teams by processing deals in an Order Management capacity. This role requires meticulous review of executed agreements for all transaction types as well as auditing Salesforce data in preparation for booking. This is a key role that helps drive effective financial reporting, revenue booking, sales commissions, provisioning and customer invoicing. Your responsibilities will include, but are not limited to: reviewing the quote approval process, conducting quality checks on order forms, assisting with deal structuring, ensuring accurate data for each sales opportunity and account in Salesforce, correcting discrepancies based on order form information and existing data, and serving as a cross-functional sounding board and liaison. This critical hire will report to the India-based Accounting Director and the U.S.-based Accounting Senior Manager, and will be expected to work a U.S. Pacific time workday.

Location: Hyderabad, India

Your Daily Adventures:
- Audit all customer order forms submitted for booking to confirm accuracy
- Review and approve CPQ quotes to ensure they align with company policies and objectives
- Perform financial, commission, and sales data quality checks in Salesforce for each opportunity and quote to ensure all records meet booking requirements
- Identify and address incorrect metrics and information according to booking policies and ensure accuracy before opportunities are closed-won
- Serve as the first and last line of defense from quote creation to booking in Salesforce and Zuora
- Collaborate with RevOps, BizSys, Deal Desk and Legal teammates to address booking issues and to identify areas for improvement across the systems
- Partner with Billing, Revenue and Provisioning teammates to address issues post order booking
- Complete assigned Salesforce cases submitted by Salesforce end users
- Review and reply to Zendesk cases submitted by various stakeholders
- Complete month- and quarter-end audit tasks
- Any other responsibilities that may be assigned to help the company meet its goals

Our Vision of You:
- Bachelor's degree required, with preference for a master's degree in accounting, finance, or a related field
- 5-7 years of experience in order management, deal desk, finance, contracts, sales operations, or revenue, with overall experience of 8+ years
- Software/SaaS experience required
- Experience with Salesforce and CPQ required
- Familiarity with Zuora and Zendesk a plus
- Proficient in Microsoft Excel
- Readily available during the end of the month/quarter
- Acute attention to detail and the ability to closely follow policies and instructions
- Strong listening, analytical and organizational skills
- Operational mindset and approach to work
- Willingness and eagerness to learn, with a team-player attitude
- Flexibility and ability to work and adapt to change in a fast-paced and fully remote environment

Posted 2 weeks ago

Apply

1.0 - 3.0 years

5 - 8 Lacs

Bengaluru

Work from Office


The opportunity: At Hitachi Energy, we are building a future-ready data ecosystem. As a Data Governance Specialist, you will be a key enabler in shaping and operationalizing our enterprise-wide data governance framework. You will focus on the implementation and evolution of our Data Catalog, Metadata Management, and Data Compliance initiatives, ensuring our data assets are trusted, discoverable, and aligned with business value. This role is ideal for early-career professionals with a can-do mindset and a passion for making things happen. You will work in a dynamic, cross-functional environment that values curiosity, ownership, and ethical leadership.

How you'll make an impact:

Data Catalog & Compliance Implementation (learning by doing):
- Define and maintain the roadmap for the Enterprise Data Catalog and Data Supermarket
- Configure and execute deployment of cataloging tools (e.g., metadata management, lineage, glossary)
- Ensure alignment with DAMA-DMBOK principles

Governance Framework Execution:
- Collaborate with Data Owners, Stewards, and Custodians to define and enforce data policies, standards, and RACI models
- Support the Data Governance Council and contribute to the development of governance artifacts (e.g., roles, regulations, KPIs)

Data Quality & Stewardship:
- Partner with domain experts to drive data profiling, cleansing, and validation initiatives
- Monitor data quality metrics and support remediation efforts across domains

Stakeholder Engagement & Enablement:
- Provide training and support to business users on catalog usage and governance practices
- Act as a liaison between business and IT to ensure data needs are met and governance is embedded in operations

Innovation & Continuous Improvement:
- Stay current with industry trends and tool capabilities (e.g., Databricks, SAP MDG)
- Propose enhancements to governance processes and tooling based on user feedback and analytics

Your background:
- Bachelor's degree in information systems, data science, business informatics, or a related field
- 1-3 years of experience in data governance, data management, or analytics roles
- Familiarity with the DAMA-DMBOK2 framework and data governance tools (e.g. SAP MDG, Datasphere, Business Warehouse, Data Intelligence, Informatica ETL)
- Strong communication and collaboration skills; ability to work across business and technical teams
- Proactive, solution-oriented, and eager to learn; ready to make it happen
- Autonomy and the ability to manage ambiguity are competitive advantages
- A new-technology focus, continuous learning, and an embrace-the-challenge attitude complete the candidate profile
- CDMP certification is a preferred attribute

More about us. When joining us you may expect:
- A purpose-driven role in a global energy leader committed to sustainability and digital transformation
- Mentorship and development opportunities within a diverse and inclusive team
- Initiatives and cutting-edge technologies
- A culture that values integrity, curiosity, and collaboration, aligned with Hitachi Energy's Leadership Pillars: Lead with Purpose, Create Customer Value, Drive Results, Build Collaboration, Develop Self & Others

Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

17 - 19 Lacs

Bengaluru

Work from Office


As a Data Engineer in our team, you will work with large-scale manufacturing data coming from our globally distributed plants. You will focus on building efficient, scalable, data-driven applications. The data sets produced by these applications - whether data streams or data at rest - need to be highly available, reliable, consistent and quality-assured so that they can serve as input to a wide range of other use cases and downstream applications. We run these applications on Azure Databricks; besides building applications, you will also contribute to scaling the platform, including topics such as automation and observability. Finally, you are expected to interact with customers and other technical teams, e.g. for requirements clarification and definition of data models.

Primary responsibilities:
- Be a key contributor to the Bosch hybrid cloud data platform (on-prem and cloud)
- Design and build data pipelines on a global scale, ranging from small to huge datasets
- Design applications and data models based on deep business understanding and customer requirements
- Work directly with architects and technical leadership to design and implement applications and/or architectural components
- Produce architectural proposals and estimations for the application; provide technical leadership to the team
- Coordinate and collaborate with central teams on tasks and standards
- Develop data integration workflows in Azure
- Develop streaming applications using Scala
- Integrate the end-to-end Azure Databricks pipeline to take data from source systems to target systems, ensuring the quality and consistency of data
- Define and implement data quality and validation checks (see the sketch below)
- Configure data processing and transformation
- Write unit test cases for data pipelines
- Tune pipeline configurations for optimal performance
- Participate in peer reviews and PR reviews of code written by team members
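The posting asks for Scala streaming on Databricks; as a minimal equivalent sketch of an inline data-quality check in a streaming pipeline, here is a PySpark Structured Streaming version (Kafka broker, topic, JSON fields and paths are all assumptions, not Bosch's actual pipeline):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-dq").getOrCreate()

# Hypothetical Kafka source; broker, topic and schema are assumptions.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "plant-telemetry")
    .load()
)

parsed = events.select(
    F.get_json_object(F.col("value").cast("string"), "$.machine_id").alias("machine_id"),
    F.get_json_object(F.col("value").cast("string"), "$.temp").cast("double").alias("temp"),
)

# Inline validation check: keep only rows passing the quality rule.
valid = parsed.filter(F.col("machine_id").isNotNull() & F.col("temp").between(-50, 500))

query = (
    valid.writeStream.format("delta")
    .option("checkpointLocation", "/chk/telemetry")
    .start("/curated/telemetry")
)
```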

Posted 2 weeks ago

Apply

3.0 - 6.0 years

8 - 13 Lacs

Hyderabad

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role you will:
- Design and develop ETL processes: lead the design and implementation of ETL processes using batch and streaming tools to extract, transform, and load data from various sources into GCP; collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs.
- Optimize data pipelines: optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows; monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks.
- Integrate and manage data: integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency; manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting.
- Develop on GCP Dataflow: write Apache Beam based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy (a sketch follows below); collaborate with data analysts and data scientists to prepare data for analysis and reporting.
- Automate and monitor: implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention; set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs.
- Apply data governance and security: apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies; collaborate with security teams to implement data protection measures and address vulnerabilities.
- Document and share knowledge: document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members; conduct training sessions and workshops to share expertise and promote best practices within the team.

Requirements. To be successful in this role, you should meet the following requirements:
- Education: Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience: minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP; proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development.
- Technical skills: strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering; experience with cloud-based solutions, especially in GCP (cloud-certified candidates preferred); experience with big data processing in batch and streaming modes; proficiency in big data ecosystems, e.g. Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark; familiarity with Java/Python for data manipulation on cloud/big data platforms.
- Analytical skills: strong problem-solving skills with keen attention to detail; ability to analyze complex data sets and derive meaningful insights.

Benefits: competitive salary and comprehensive benefits package; opportunity to work in a dynamic and collaborative environment on cutting-edge data projects; professional development opportunities to enhance your skills and advance your career. If you are a passionate data engineer with expertise in ETL processes and a desire to make a significant impact within our organization, we encourage you to apply for this exciting opportunity!
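A minimal sketch of the Apache Beam based Dataflow job the role describes, in the Beam Python SDK; the project, bucket and table names are assumptions for illustration:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical project, bucket and table names; all resource names are assumptions.
opts = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

def parse_line(line):
    # Transform step: one CSV line -> a typed dict for BigQuery.
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "DropBadRows" >> beam.Filter(lambda r: r["amount"] >= 0)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.payments",
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```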

Posted 2 weeks ago

Apply

2.0 - 6.0 years

6 - 11 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will:
- Apply expertise in Java, Java 8, microservices, Spring Boot 3.0.0, Postgres, JPA, UI (React, TypeScript, JS), Apache Flink, Apache Beam and MongoDB.
- Bring strong knowledge of Google Cloud Platform (GCP) services such as Dataflow, BigQuery/ClickHouse, Cloud Storage, Pub/Sub, Google Cloud Storage (GCS), and Composer. The ideal candidate should also have hands-on experience with Apache Airflow, Google Kubernetes Engine (GKE), and Python for scripting, automation, and automation testing.
- Play a critical role in designing, developing, and maintaining scalable, high-performance pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink.
- Collaborate with cross-functional teams to define, design, and deliver new features and enhancements.
- Monitor and optimize the performance of data pipelines and applications.
- Ensure data quality, integrity, and security across all data pipelines and storage solutions.
- Provide technical guidance and mentorship to junior team members.
- Stay up to date with the latest data engineering technologies, best practices, and industry trends.

Requirements. To be successful in this role, you should meet the following requirements:
- 2 to 6 years of experience in development engineering, with a focus on ETL processes and data pipeline development.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong expertise in Java and SQL for data extraction, transformation, and loading.
- Strong problem-solving and analytical skills, and the ability to troubleshoot complex data and application issues.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
- Familiarity with Helm charts for Kubernetes deployments.
- Experience with monitoring tools like Prometheus, Grafana, or Stackdriver.
- Knowledge of security best practices for cloud and Kubernetes environments.
- DevOps skills will be an added advantage.

Posted 2 weeks ago

Apply

2.0 - 11.0 years

16 - 18 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Software Engineer. In this role you will:
- Use expertise in Scala-Spark/Python-Spark development and work with an Agile application development team to implement data strategies.
- Design and implement scalable data architectures to support the bank's data needs.
- Develop and maintain ETL (Extract, Transform, Load) processes.
- Ensure the data infrastructure is reliable, scalable, and secure.
- Oversee the integration of diverse data sources into a cohesive data platform.
- Ensure data quality, data governance, and compliance with regulatory requirements.
- Monitor and optimize data pipeline performance; troubleshoot and resolve data-related issues promptly.
- Implement monitoring and alerting systems for data processes.
- Troubleshoot and resolve technical issues, optimizing system performance and ensuring reliability.
- Create and maintain technical documentation for new and existing systems, ensuring that information is accessible to the team.
- Implement and monitor solutions that identify both system bottlenecks and production issues.

Requirements. To be successful in this role, you should meet the following requirements:
- Experience in data engineering or a related field, and hands-on experience building and maintaining ETL data pipelines
- Good experience designing and developing Spark applications using Scala or Python
- Good experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark)
- Proficiency in programming languages such as Python, Java, or Scala
- Optimization and performance tuning of Spark applications
- Git experience creating, merging and managing repos
- Ability to perform unit testing and performance testing
- Good understanding of ETL processes and data pipeline orchestration tools like Airflow and Control-M (a minimal Airflow sketch follows below)
- Strong problem-solving skills and ability to work under pressure
- Excellent communication and interpersonal skills
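A hedged sketch of the orchestration pattern mentioned above, as an Airflow 2.4+ DAG wrapping a nightly Spark ETL job; the DAG id, schedule, spark-submit command and paths are assumptions for illustration:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG orchestrating a nightly Spark ETL job;
# the spark-submit command and script paths are assumptions.
with DAG(
    dag_id="nightly_spark_etl",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",
    catchup=False,
) as dag:
    extract_load = BashOperator(
        task_id="run_spark_job",
        bash_command=(
            "spark-submit --master yarn --deploy-mode cluster "
            "/opt/jobs/etl_pipeline.py --date {{ ds }}"
        ),
    )
    validate = BashOperator(
        task_id="validate_output",
        bash_command="python /opt/jobs/check_row_counts.py --date {{ ds }}",
    )
    extract_load >> validate
```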

Posted 2 weeks ago

Apply

0.0 - 6.0 years

10 - 14 Lacs

Mumbai

Work from Office


Join our dynamic team to innovate and refine technology operations, impacting the core of our business services. As a Technology Support Lead in the Commercial & Investment Banking Regulatory Compliance Surveillance team, you will play a leadership role in ensuring the operational stability, availability, and performance of our production services. Critical thinking while overseeing day-to-day maintenance of the firm's systems will be key and will set you up for success as you navigate tasks related to identifying, troubleshooting, and resolving issues to ensure a seamless user experience.

Job responsibilities:
- Lead teams of technologists that provide end-to-end application service delivery for the successful business operations of the firm on FINRA CAT regulatory reporting
- Execute policies and procedures that ensure operational stability and availability
- Monitor production environments for anomalies, address data quality issues in regulatory reporting, drive the evolution of standard observability tools, and adhere to FINRA CAT SLA requirements
- Escalate and communicate issues and solutions to business and technology stakeholders, actively participating in FINRA CAT Reporting business governance meetings

Required qualifications, capabilities, and skills:
- 5+ years of experience or equivalent expertise troubleshooting, resolving and maintaining information technology services
- Minimum 10 years of experience in application development and/or production support in the finance domain
- Technically hands-on with Linux scripting (Shell, Python, PySpark) and database systems (Oracle SQL/PL-SQL); basic knowledge of application development
- Quick thinker, problem solver (even under pressure) and a fast learner in a constantly changing environment
- Ability to automate and build tools for improved productivity, or eagerness to learn it

Preferred qualifications, capabilities, and skills:
- Working knowledge of one or more general-purpose programming languages and/or automation scripting
- Exposure to AWS Cloud would be a plus

Posted 2 weeks ago

Apply

0.0 - 4.0 years

8 - 12 Lacs

Mumbai

Work from Office


Working in Regulatory Compliance Surveillance Application Support means you'll use both creative and critical thinking skills to maintain application systems that are crucial to the daily operations of the firm. You'll work collaboratively in teams on a wide range of projects based on your primary area of focus. While learning to fix application and data issues as they arise, you'll also gain exposure to software development, testing, deployment, maintenance and improvement, in addition to production lifecycle methodologies and risk guidelines. Finally, you'll have the opportunity to develop professionally and to grow your career in any direction you choose.

This role requires a wide variety of strengths and capabilities, including:
- A bachelor's or master's degree in engineering, computer science or information systems
- Minimum 5 years of experience in application development and/or production support in the finance domain
- Technically hands-on with Linux, scripting (Shell, Python and/or Perl) and database systems (Oracle SQL/PL-SQL); basic knowledge of application development
- Working knowledge of one or more general-purpose programming languages, plus an interest in learning other coding languages and skills as needed
- Working knowledge of a development toolset to design, develop, test, deploy, maintain, and improve software; a few years of programming experience in any modern language would be a great plus
- Exposure to AWS Cloud and Kafka would be a plus
- Quick thinker, problem solver (even under pressure) and a fast learner in a constantly changing environment
- Ability to automate and build tools for improved productivity, or eagerness to learn it

Job responsibilities:
- Engage in the management of incidents, problems, and changes to support full-stack technology systems, applications, and infrastructure; identify, analyze, and resolve issues to ensure optimal performance and reliability across all technological components
- Utilize APM tools such as Geneos, Dynatrace, Grafana, Datadog, and Splunk to oversee daily batch processes and promptly address any issues that arise
- Provide comprehensive application service delivery to support successful business operations, focusing on regulatory reporting requirements
- Implement and execute policies and procedures designed to maintain operational stability and ensure system availability
- Vigilantly monitor production environments for anomalies, resolve data quality issues in regulatory reporting, and enhance the use of standard observability tools while adhering to SLA requirements
- Escalate and communicate issues and solutions effectively to both business and technology stakeholders, actively participating in stability and governance meetings
- Drive initiatives for automation and reduce manual workload (toil) to optimize processes and improve efficiency

Posted 2 weeks ago

Apply

1.0 - 4.0 years

14 - 15 Lacs

Bengaluru

Work from Office


You are a strategic thinker passionate about driving solutions in external reporting. You have found the right team. As an External Reporting Analyst in our Finance team, you will spend each day defining, refining, and delivering set goals for our firm. Our external reporting function is responsible for overseeing the financial statements and external reporting. We ensure a robust control environment, apply US GAAP/IFRS in compliance with corporate and regulatory requirements, and understand the uses and reporting of financial statements.

Job Responsibilities:
- Apply up-to-date product, industry, and market knowledge in specialty areas of reporting
- Consolidate, review, and analyze financial data for accuracy and completeness, performing period-over-period variance analytics
- Coordinate data collection and business results with various lines of business, Regulatory Controllers, and SEC reporting teams
- Assist in thoroughly assessing issues, outcomes, and resolutions
- Communicate financial information clearly to the lines of business and flag potential issues
- Participate in the production, review, and filing of monthly, quarterly, semi-annual, and annual reports for various regulatory agencies
- Adhere to proof and control procedures to ensure accurate reconciliation between regulatory filings, SEC filings, and other published financial reports
- Follow various control procedures and edit checks to ensure the integrity of reported financial results
- Ensure accurate and complete data submission to the regulators
- Interpret and define regulatory and/or SEC requirements and coordinate internal and external policies
- Establish and manage relationships with the lines of business and external regulatory agency constituents through ongoing partnership and dialogue
- Engage in continuous improvement efforts around data quality review and external reporting improvement projects

Required qualifications, capabilities, and skills:
- 1+ years in a finance organization with exposure to accounting, financial statements, and/or regulatory reporting
- Experience in product control or financial control, or knowledge of SEC reporting/regulatory reporting
- Project management experience/skills
- Strong skills in time management, problem solving, and written and oral communication
- Team player, with the ability to work effectively across diverse functions, locations and businesses
- Strong analytical skills

Preferred qualifications, capabilities, and skills:
- Bachelor's degree in Accounting or Finance preferred
- Proficiency in MS Excel and business intelligence solutions like Alteryx, Tableau or Python
- Prior experience with US regulatory filings (TIC/FFIEC 009/FR 2510)

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 14 Lacs

Noida, Hyderabad, Bengaluru

Work from Office


Work Location: Hyderabad, Bangalore, Noida, Pune

Qualifications and Skills:
- Proven expertise in Oracle BRM (mandatory skill) with a strong understanding of its architecture and modules to effectively manage data migration processes.
- Hands-on experience in data migration activities, particularly with Oracle BRM, ensuring high efficiency and accuracy throughout migration projects.
- Knowledge of SQL for querying and managing databases, crucial for data migration and integration tasks.
- Strong knowledge of ETL tools and processes for efficient data extraction, transformation, and loading from various sources.
- Ability to perform detailed data mapping, ensuring logical transformation and compatibility between source and target system data structures.
- Experience in data cleansing techniques to ensure data integrity and consistency throughout the migration process.
- Understanding of data quality principles and practices, essential to maintain high standards of data accuracy and dependability.
- Proficiency in scripting for the automation of data migration tasks, enhancing efficiency and reducing the potential for errors.
- Excellent analytical and problem-solving skills to identify and address data-related challenges and opportunities.
- Handling the execution of the data migration and validations.
- Developing migration strategy documents and techniques; executing data integrity testing post-migration.
- Understanding BRM: a working knowledge of BRM data migration components, the BRM 12 schema, and the data model.
- Data migration strategy: developing a migration strategy and implementation plan.
- Data loading: being able to load data and integrate it with systems.
- Post-migration analysis: performing post-migration analysis on events, invoices, open items, bills, and dunning.
- Data reconciliation: developing scripts to reconcile migrated data (a sketch follows below).
- Working knowledge of all the BRM data migration components; must be hands-on in BRM to verify the sanity of the data migration.
- Advantage: programming skills in Java technologies; experience in C/C++, Oracle 12c/19c, PL/SQL, PCM Java, BRM Web Services, and scripting languages (Perl/Python).

Roles and Responsibilities:
- Analyze client data and formulate effective data migration plans tailored to Oracle BRM specifications.
- Collaborate with cross-functional teams to gather and interpret data migration requirements accurately.
- Develop and implement efficient data migration scripts and processes, ensuring minimal disruption to business operations.
- Conduct thorough testing and validation of data migration outputs to guarantee data accuracy and conformity.
- Monitor and troubleshoot migration activities to ensure seamless execution and rectify any issues promptly.
- Document data migration processes, maps, and transformations for knowledge sharing and continuous improvement.
- Liaise with stakeholders to present progress updates and discuss ongoing improvements to data migration practices.
- Contribute to the development of data migration best practices and reusable frameworks within the organization.
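A hedged sketch of the reconciliation scripting named above: comparing row counts and control totals between a legacy source and the migrated BRM target using the python-oracledb driver. The DSNs, credentials and table/column names are all invented for illustration (BRM's real schema differs):

```python
import oracledb

# Hypothetical connections; DSNs, credentials and table names are assumptions.
src = oracledb.connect(user="legacy", password="...", dsn="legacy-db/XEPDB1")
tgt = oracledb.connect(user="brm", password="...", dsn="brm-db/XEPDB1")

def count_and_sum(conn, table, amount_col):
    # Row count plus a control total: a cheap first-pass reconciliation.
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), NVL(SUM({amount_col}), 0) FROM {table}")
    return cur.fetchone()

# Table pairs to reconcile; names are placeholders, not real BRM tables.
for src_table, tgt_table in [("old_invoices", "migrated_invoices")]:
    s = count_and_sum(src, src_table, "amount")
    t = count_and_sum(tgt, tgt_table, "amount")
    status = "OK" if s == t else "MISMATCH"
    print(f"{src_table} -> {tgt_table}: source={s} target={t} [{status}]")
```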

Posted 2 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


About KPI Partners: KPI Partners is a leading provider of data analytics solutions, dedicated to helping organizations transform data into actionable insights. Our innovative approach combines advanced technology with expert consulting, allowing businesses to leverage their data for improved performance and decision-making.

Job Description: We are seeking a skilled and motivated Data Engineer with experience in Databricks to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and data processing solutions that support our analytics initiatives. You will collaborate closely with data scientists, analysts, and other engineers to ensure the consistent flow of high-quality data across our platforms.

Key skills: Python, PySpark, Databricks, ETL, Cloud (AWS, Azure, or GCP)

Key Responsibilities:
- Develop, construct, test, and maintain data architectures (e.g., large-scale data processing systems) in Databricks.
- Design and implement ETL (Extract, Transform, Load) processes to move and transform data from various sources to target systems.
- Collaborate with data scientists and analysts to understand data requirements and design appropriate data models and structures.
- Optimize data storage and retrieval for performance and efficiency.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Engage in data quality assessments, validation, and troubleshooting of data issues.
- Stay current with emerging technologies and best practices in data engineering and analytics.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role, with hands-on experience in Databricks.
- Strong proficiency in SQL and programming languages such as Python or Scala.
- Experience with cloud platforms (AWS, Azure, or GCP) and related technologies.
- Familiarity with data warehousing concepts and data modeling techniques.
- Knowledge of data integration tools and ETL frameworks.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Why Join KPI Partners?
- Be part of a forward-thinking team that values innovation and collaboration.
- Opportunity to work on exciting projects across diverse industries.
- Continuous learning and professional development opportunities.
- Competitive salary and benefits package.
- Flexible work environment with hybrid work options.

If you are passionate about data engineering and excited about using Databricks to drive impactful insights, we would love to hear from you! KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 12 Lacs

Pune, Chennai, Bengaluru

Hybrid


Hello candidates, we are hiring!

Job Position: Data Streaming Engineer
Experience: 5+ years
Location: Mumbai, Pune, Chennai, Bangalore
Work mode: Hybrid (3 days WFO)

JOB DESCRIPTION
Data Streaming @ offshore:
- Flink, Python (see the sketch below)
- Data lake systems (OLAP systems)
- SQL (should be able to write complex SQL queries)
- Orchestration (Apache Airflow is preferred)
- Hadoop (Spark and Hive: optimization of Spark and Hive apps)
- Snowflake (good to have)
- Data quality (good to have)
- File storage (S3 is good to have)

NOTE: Candidates can share their resume at shrutia.talentsketchers@gmail.com
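A minimal sketch of Flink with Python as the posting pairs them, using the PyFlink Table API with Flink SQL; the Kafka topic, broker and schema are assumptions, and the Kafka SQL connector jar must be on the classpath:

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# A minimal PyFlink streaming-SQL sketch; connector options and
# topic/column names are assumptions for illustration.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

t_env.execute_sql("""
    CREATE TABLE clicks (
        user_id STRING,
        url STRING,
        ts TIMESTAMP(3),
        WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'clicks',
        'properties.bootstrap.servers' = 'broker:9092',
        'format' = 'json'
    )
""")

# Windowed aggregation: clicks per user per minute.
result = t_env.sql_query("""
    SELECT user_id,
           TUMBLE_START(ts, INTERVAL '1' MINUTE) AS window_start,
           COUNT(*) AS clicks
    FROM clicks
    GROUP BY user_id, TUMBLE(ts, INTERVAL '1' MINUTE)
""")
result.execute().print()
```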

Posted 2 weeks ago

Apply

13.0 - 20.0 years

25 - 40 Lacs

Chennai, Bengaluru

Work from Office


Position Overview We are looking for a highly experienced and versatile Solution Architect Data to lead the solution design and delivery of next-generation data solutions for our BFS clients. The ideal candidate will have a strong background in data architecture and engineering, deep domain expertise in financial services, and hands-on experience with cloud-native data platforms and modern data analytics tools. The role will require architecting solutions across Retail, Corporate, Wealth, and Capital Markets, as well as Payments, Lending, and Onboarding journeys. Possession of Data Analytics and Exposure to Data regulatory domain will be of distinct advantage. Hands on experience of AI & Gen AI enabling data related solution will be a distinct advantage for the position. Key Responsibilities Design and implement end-to-end data solutions for BFS clients, covering data engineering and analytics involving modern data stacks and concepts. Architect cloud-native data platforms using AWS, Azure, and GCP (certifications preferred). Build and maintain data models aligned with Open Banking, Open Finance, SCA, AISP, and PISP requirements. Enrich Solution design by incorporating the construct of industry-standard data architectures using frameworks such as BIAN and lead data standardization programs for banks. Enrich solution architecture by enabling AI and Gen AI paradigm for data engineering, analytics and data regulatory Deliver data solutions in domains like Core Banking, Payments, Lending, Customer Onboarding, Wealth, and Capital Markets. Collaborate with business and technology stakeholders to gather requirements and translate them into scalable data architectures. Solution Design and if needed hands-on in developing lab-class Proof-of-Concepts (POCs) showcasing data-driven capabilities. Lead and contribute to RFX responses for banking and financial services clients and regulatory bodies across UK, EMEA regions. Provide architectural leadership in data initiatives related to regulatory compliance and risk analytics. In this regard, familiarity and working experience of with regulatory software and platform such as SAS, Nice Actimize, and Wolters Kluwer will be preferred. Required Skills & Experience 1218 years of experience in IT with a focus on data solution architecture in BFS domain. Strong delivery and development experience in Retail, Corporate, Wealth, and Capital Market banking domains. Deep understanding of data standards such as BIAN and experience implementing them in banking projects. Expertise in cloud platforms (AWS, Azure, GCP) and leveraging native services for data processing, storage, and analytics. Strong experience in building data models and data solutions for Open Banking, Open Finance, and regulatory needs including SCA, AISP, and PISP. Proficiency in data engineering pipelines and real-time/batch data processing. Experience in designing enterprise data lakes, data warehouses, and implementing data mesh and data lineage frameworks. Hands-on experience in developing rapid POCs and accelerators. 
Primary Technical Skills
- Cloud Platforms: AWS, Azure, GCP (certification preferred)
- Big Data Technologies: Hadoop, Spark, Databricks, Delta Lake
- Programming Languages: Python, Scala, SQL
- Data Engineering & Pipelines: Apache Airflow, Kafka, Glue, Data Factory
- Data Warehousing: Snowflake, Redshift, BigQuery, Synapse
- Visualization: Power BI, Tableau, Looker
- Data Governance: Data Lineage, Data Cataloging, Master Data Management
- Architecture Concepts: Data Mesh, Data Fabric, Event-driven Architecture
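To make the "real-time/batch data processing" requirement concrete, here is a minimal PySpark Structured Streaming sketch that reads payment events from Kafka and totals them in one-minute windows. The broker address, topic name, and event schema are illustrative assumptions rather than anything from the posting, and the spark-sql-kafka connector must be on the classpath for the Kafka source to resolve.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import DoubleType, StringType, StructType, TimestampType

spark = SparkSession.builder.appName("payments-stream-demo").getOrCreate()

# Hypothetical payment-event schema.
schema = (
    StructType()
    .add("payment_id", StringType())
    .add("amount", DoubleType())
    .add("event_time", TimestampType())
)

# Read from Kafka; broker address and topic name are placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "payments")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# One-minute windowed totals, with a short watermark to bound late events.
totals = (
    events.withWatermark("event_time", "2 minutes")
    .groupBy(window(col("event_time"), "1 minute"))
    .agg({"amount": "sum"})
)

query = totals.writeStream.outputMode("update").format("console").start()
query.awaitTermination()

Much of the same aggregation logic can be reused for batch runs by swapping readStream/writeStream for read/write, which is one reason unified engines such as Spark appear in both the streaming and warehousing lines of the skills list.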

Posted 2 weeks ago

Apply

2.0 - 5.0 years

10 - 18 Lacs

Thane, Navi Mumbai, Mumbai (All Areas)

Work from Office


Experience with SAP data-quality, MDM, or MDG projects. Practical exposure to S/4HANA migration or support activities is beneficial. Demonstrable skills in data profiling, cleansing, de-duplication, and validation within SAP environments, along with ETL experience.
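SAP-specific tooling aside, the profiling, cleansing, de-duplication, and validation loop this posting describes can be sketched generically in pandas. The vendor-master file, columns, and format rule below are hypothetical illustrations, not anything taken from an SAP project.

import pandas as pd

# Hypothetical vendor-master extract.
df = pd.read_csv("vendors.csv")

# Profiling: null counts and distinct values per column.
print(df.isna().sum())
print(df.nunique())

# Cleansing: normalize key text fields before matching.
for field in ["vendor_name", "city"]:
    df[field] = df[field].str.strip().str.upper()

# De-duplication: drop exact duplicates on the business key.
df = df.drop_duplicates(subset=["vendor_name", "tax_id"])

# Validation: flag rows whose tax_id fails a simple format rule.
invalid = df[~df["tax_id"].astype(str).str.fullmatch(r"\d{10}")]
print(f"{len(invalid)} rows failed tax_id validation")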

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Overview
The Senior Data Analyst serves as a subject matter expert who leads efforts to analyze data, with the goal of delivering insights that influence our products and customers. This position reports to the Data Analytics Manager and works closely with members of our product and marketing teams, data engineers, and members of our Customer Success organization supporting client outreach efforts. The chief functions of this role are finding and sharing data-driven insights that deliver value to less technical audiences, and instilling best practices for analytics in the rest of the team.

Responsibilities
- Perform various data analysis functions to analyze data from a variety of sources, including external labor market data and research, and internal datasets from our platforms
- Incorporate information from a variety of systems to produce comprehensive and compelling narratives for thought-leadership initiatives and customer engagements
- Demonstrate critical thinking: identify the story in context using multiple datasets and present the results; strong proficiency in data storytelling will be critical to success in this role
- Understand principles of quality data visualization and apply them in Tableau to create and maintain custom dashboards for consumption by other employees
- Find and investigate data quality issues, identify root causes, and recommend remedies to be implemented by the data scientists and engineers
- Liaise with teams around the business to understand their problems, determine how our team can help, then use our database to produce the content they need
- Identify data mapping and enrichment requirements; familiarity with SQL, especially the logic behind different types of data joins and writing efficient queries, will be necessary
- Consistently ensure that business is conducted with integrity and that behavior aligns with iCIMS policies, procedures, and core competencies

Additional Job Responsibilities
- Produce and adapt data visualizations in response to business requests for internal and external use
- Show good judgment in prioritizing their own commitments and those of the larger team, while demonstrating initiative and appropriate urgency when needed
- Mentor junior team members in best practices for analytics, data visualization, and data storytelling; exemplify these standards and guide teammates in following them
- Think creatively to produce unique, actionable insights from complex datasets that deliver value to our business and to our customers

Qualifications
- 5-10 years of professional experience working in an analytics capacity
- Excellent communication skills, especially with regard to data storytelling: finding insights in complex datasets and sharing those findings with key stakeholders
- Strong data analytics and visualization skills
- Expertise in Tableau Desktop (Tableau Server and Prep preferred), producing clear and informative graphs and dashboards
- Proficiency in SQL and either Python or R to extract and prepare data for analysis
- Advanced knowledge of Excel (pivot tables, VLOOKUPs, IF statements)
- Familiarity with data guardrails to ensure compliance with applicable data governance regulations and privacy laws (e.g., GDPR)
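As a small illustration of the join logic the posting calls out, here is a self-contained sketch using Python's built-in sqlite3 module. The tables, columns, and sample rows are hypothetical, chosen only to show how inner and left joins differ.

import sqlite3

# In-memory database with two small hypothetical tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE candidates (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE applications (candidate_id INTEGER, job TEXT);
    INSERT INTO candidates VALUES (1, 'Asha'), (2, 'Ravi'), (3, 'Meera');
    INSERT INTO applications VALUES
        (1, 'Data Analyst'), (1, 'BI Developer'), (2, 'Data Engineer');
""")

# INNER JOIN: only candidates with at least one application.
inner = conn.execute("""
    SELECT c.name, a.job
    FROM candidates c
    JOIN applications a ON a.candidate_id = c.id
""").fetchall()

# LEFT JOIN: every candidate, with NULL where no application exists
# (Meera appears with job = None).
left = conn.execute("""
    SELECT c.name, a.job
    FROM candidates c
    LEFT JOIN applications a ON a.candidate_id = c.id
""").fetchall()

print(inner)  # [('Asha', 'Data Analyst'), ('Asha', 'BI Developer'), ('Ravi', 'Data Engineer')]
print(left)   # same rows plus ('Meera', None)
conn.close()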

Posted 2 weeks ago

Apply

12.0 - 17.0 years

20 - 25 Lacs

Kochi, Bengaluru

Work from Office


12 years of experience: Significant experience in project management, with a focus on MDM and ETL projects in life sciences. PMP certification preferred.
Project Planning and Execution: Define project scope, objectives, timelines, and resources. Develop project plans, schedules, and budgets, and manage their execution. Should have a background in data integration projects and hands-on experience in managing risks and interdependencies with upstream and downstream applications.
MDM and ETL Expertise: Should have led MDM and ETL projects, ensuring data quality, consistency, and accuracy.
Stakeholder Management: Communicate effectively with stakeholders, manage expectations, and ensure their satisfaction.
Project Management Methodologies: Familiarity with project management methodologies (e.g., Agile, Waterfall). Experience with JIRA or other tools.
Strong Leadership and Communication Skills: Ability to lead and motivate teams, communicate effectively with stakeholders, and manage expectations. Communicates progress and escalates key decisions, issues, risks, and opportunities as required to achieve project objectives and deliverables.
Location: Kochi (primary), Bengaluru (secondary)

Posted 2 weeks ago

Apply

8.0 - 11.0 years

15 - 20 Lacs

Bengaluru

Work from Office


The SAP Data Management and Landscape Transformation (DMLT) group serves SAP customers in managing their data and transforming their landscapes. Business, market, and technology changes often result in realignment of business processes and structures. This is reflected in various business- and IT-driven activities, such as:
- Fast-track data migration and the move to SAP S/4HANA via selective data transition (SAP S/4HANA SDT) options into SAP
- Managing the business and IT challenges that come with mergers, acquisitions, and divestitures
- Optimization of processes and data, or organizational restructuring
- Ensuring data harmonization, data restructuring, or the consolidation of IT system landscapes
- Improving data governance, master data management, data quality, and information lifecycle management

We support and advise our customers globally in strategy definition, conceptual planning, and the realization of their business and digital transformation requirements.

THE ROLE AND EXPECTATIONS
The SAP Data Management and Landscape Transformation (DMLT) group, part of SAP's Customer Success & Delivery Board Area, is searching for a senior consultant to join the DMLT consulting team in India (Bangalore office). As a DMLT senior consultant, you will work in a global team delivering national and international projects across global regions, realizing planned changes in productive SAP applications and system landscapes and driving customers' transition to SAP S/4HANA. The DMLT team offers a full solution package encompassing analysis, architecture, and service delivery, putting the customer at the center of all our activities.

Your main tasks are:
- Identifying the customer's holistic business needs and designing and architecting the appropriate solution proposal from the DMLT portfolio and beyond
- Leading and facilitating customer workshops and discussions to validate the best solution design for the customer's requirements, and leading a global solution architect team
- Delivering DMLT services projects remotely and onsite, in distributed regional and global teams
- Defining the solution approach and business blueprint for data migration projects
- Conducting feasibility studies and system analyses, supported by SAP DMLT products, tools, and methodology
- Contributing to the continuous enhancement of the DMLT solution design according to changing market requirements, and identifying potential enhancements
- Collaborating closely with people in different roles and locations, with different areas and levels of expertise (for example, customer engagement leads, developers, product owners, solution architects, sales teams)
- Establishing and maintaining trustful relationships with people at all levels, both externally on the customer side and internally at SAP
- Driving customer opportunities throughout all engagement phases (presales, sales, and delivery)

We expect:
- Continuous learning about new technologies, products, and solutions, and from project delivery experience
- A strong team-player attitude: actively establishing relationships and motivating through inspiration
- Thinking outside the box; awareness of trends in the changing workforce and changing concepts of customer communication and engagement (e.g., Design Thinking methods, lot size of one, etc.)
- Clarifying existing expectations with the customer's technology/operations experts and stakeholders; creating an approach that integrates diverse points of view and drives success from a technology/operations perspective; collaborating with customer and SAP teams for the best possible implementation
- Assisting in the development of the overall project plan (scoping process) as well as individual work plans; acting as liaison with the client for troubleshooting (investigating, analyzing, and solving software problems)
- Collaborating and co-innovating with the respective development teams

QUALIFICATION/SKILLS AND COMPETENCIES
- Minimum of 8 to 10 years of relevant work experience as a Consultant or Solution Architect for the SAP software solution portfolio, of which 3+ years of work experience in handling the full cycle of data migration and architectural design responsibility
- Knowledge and experience in executing two end-to-end data migration and conversion projects, using various data migration technologies, viz. DMO, LTMC, SLT, or SAP Landscape Transformation capabilities
- Strong work experience in ABAP programming and troubleshooting; proficiency in Object-Oriented ABAP, ABAP on HANA, CDS, and AMDP
- Strong experience working with the customer's project team to define the functional and technical specification towards implementation for the Finance and Logistics modules, with integration experience
- Project management, enterprise architecture, and business consulting skills
- Technology and architecture knowledge of SAP Enterprise Cloud Services
- Bachelor's degree in computer applications, MCA, or Certified Management Accountant qualification from a recognized and well-accredited university or institution

Added advantage:
- Knowledge of any one SAP ERP core application module, namely Finance, Controlling, or Logistics, and any other SAP solution integration experience
- Knowledge of SAP's best-practice standards and tools for Application Lifecycle Management of cloud solutions
- Relevant SAP technology experience; SAP Business Technology Platform is a plus

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies