3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As the SAP S4HANA Data Migration BODS Developer, your primary responsibility will be to design, develop, and execute data migration processes from legacy systems to SAP S4HANA using SAP BusinessObjects Data Services (BODS). You will collaborate closely with functional and technical teams to ensure a smooth data extraction, transformation, and loading process while upholding data integrity and quality standards.

The role requires proficiency in SAP BusinessObjects Data Services (BODS), encompassing job design, dataflows, and transformations. You should have hands-on experience with SAP S4HANA data models, both master and transactional data, along with strong SQL skills for data extraction, transformation, and validation. Expertise in data profiling, cleansing, and validation techniques is also essential for this role.

Collaboration with business and functional teams is crucial to understand migration requirements and define data mapping rules. You will also contribute to the development of the overall data migration strategy, which involves identifying data sources, cleansing needs, and cutover planning.

On the technical side, you will design and develop ETL jobs, workflows, and dataflows in SAP BODS for migrating master and transactional data, and ensure optimal ETL performance by applying best practices and tuning dataflows. The role covers data extraction, cleansing, transformation, and loading into SAP S4HANA, including implementing data validation rules, reconciliation processes, and error handling mechanisms. Data profiling and cleansing activities will be performed to maintain data consistency and accuracy.

Furthermore, you will be responsible for preparing and maintaining detailed technical documentation, including data mapping, transformation logic, and migration steps. Providing regular status updates and reports on data migration progress and issues is essential, along with supporting cutover activities and post-migration data validation.

Key Skills: SAP DS, ETL

If you possess the mandatory skills and expertise to thrive in this role, we look forward to your contribution as the SAP S4HANA Data Migration BODS Developer.
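For illustration, here is a minimal sketch of the kind of post-load reconciliation check this role describes, comparing a legacy staging table against the S4HANA target. All table and column names are hypothetical placeholders, and sqlite3 stands in for whatever database client the project actually uses:

```python
# Hedged sketch: find source keys that were extracted but never loaded.
# Table/column names (legacy_material_stage, s4hana_material_target, matnr)
# are illustrative placeholders, not taken from the posting.
import sqlite3

RECON_SQL = """
SELECT s.matnr
FROM   legacy_material_stage AS s
LEFT JOIN s4hana_material_target AS t ON t.matnr = s.matnr
WHERE  t.matnr IS NULL
"""

def reconcile(conn: sqlite3.Connection) -> list:
    missing = conn.execute(RECON_SQL).fetchall()
    if missing:
        # In a real migration these rows would feed the error-handling queue.
        print(f"{len(missing)} source keys missing from target")
    return missing
```

In practice a check like this would run as part of post-migration validation, with the results reconciled against the load statistics reported by BODS.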
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
You will have the opportunity to work as an individual contributor and maintain positive relationships with stakeholders. It is essential to be proactive in learning new skills as per business requirements. Your responsibilities will include extracting relevant data, cleansing, and transforming data to generate insights that drive business value. This will involve utilizing data analytics, data visualization, and data modeling techniques effectively.

Your Impact
- Strong proficiency in MS PowerBI.
- Hands-on experience with MS Azure SQL Database.
- Proficient in developing ETL packages using Visual Studio or Informatica.
- Skilled in data analysis and business analysis.
- Expertise in database management and reporting, particularly in SQL & MS Azure SQL.
- Strong critical-thinking and problem-solving abilities.
- Excellent verbal and written communication skills.
- Review and validate customer data during collection.
- Supervise the deployment of data to the data warehouse.
- Collaborate with the IT department for software and hardware upgrades to support big data use cases.
- Monitor analytics and metrics results.
- Implement new data analysis methodologies.
- Conduct data profiling to identify and understand anomalies (see the sketch at the end of this posting).
- Good to have knowledge of Python/R.

About You
- Possess 2 to 5 years of experience in PowerBI.
- Hold a Technical Bachelor's Degree.
- Non-Technical Degree holders should have 3+ years of relevant experience.

We value inclusion and diversity at Gallagher. It is an integral part of our business, reflecting our commitment to sustainability and supporting the communities where we operate. Embracing the diverse identities, experiences, and talents of our employees enables us to better serve our clients and communities. Inclusion is a conscious commitment, and diversity is recognized as a vital strength. Through embracing diversity in all its forms, we embody The Gallagher Way to its fullest.

Equal employment opportunities are extended across all aspects of the employer-employee relationship, including recruitment, hiring, training, promotion, transfer, demotion, compensation, benefits, layoff, and termination. Gallagher is committed to making reasonable accommodations for known physical or mental limitations of qualified individuals with disabilities, unless such accommodations would impose undue hardship on our business operations.
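As a small illustration of the data-profiling responsibility above, here is a hedged pandas sketch; the sample frame, its columns, and the checks are invented for the example, not part of the role:

```python
# Minimal profiling sketch: per-column null rate, distinct count, and range.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize each column for quick anomaly review."""
    return pd.DataFrame({
        "null_pct": df.isna().mean().round(3),
        "distinct": df.nunique(),
        "min": df.min(numeric_only=True),
        "max": df.max(numeric_only=True),
    })

customers = pd.DataFrame({"id": [1, 2, 2, None], "spend": [10.0, -5.0, 30.0, 12.0]})
print(profile(customers))  # a negative spend or duplicate id surfaces immediately
```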
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
The WRB Data Technology team at Standard Chartered Bank supports Data and Business Intelligence, Finance and Risk projects globally by delivering data through data warehouse solutions. The team is composed of data specialists, technology experts, and project managers who work closely with business stakeholders to implement end-to-end solutions. Standard Chartered Bank is looking to hire skilled data professionals with relevant experience to contribute to the team's objectives. The successful candidates will be expected to work in a global environment, drawing from both internal and external talent pools.

Your responsibilities as a member of the WRB Data Technology team will include participating in data warehousing migration programs involving cross-geography and multi-functional delivery. You will need to align project timelines to ensure successful project delivery, provide support for data analysis, mapping, and profiling, and perform data requirement gathering, analysis, and documentation. Additionally, you will be responsible for mapping data attributes from different source systems to target data models, interpreting use case requirements, designing target data models/data marts, and profiling data attributes to assess data quality and provide remediation recommendations. It is crucial to ensure that data use complies with data architecture principles, including golden sources and standard reference data.

Furthermore, you will be involved in data modeling for better data integration within the data warehouse platform and project delivery, engaging consultants and business analysts, and escalating issues in a timely manner. You will work closely with Chapter Leads and Squad Leads to lead projects and manage various stakeholders, including business, technology teams, and internal development teams. Your role will involve transforming business requirements into data requirements, designing data models for use cases and data warehousing, creating data mapping templates, and profiling data to assess quality, suitability, and cardinality. You will also support inbound and/or outbound development for data stores, perform data acceptance testing, provide direction on solutions from a standard product/architecture perspective, and participate in key decision-making discussions with business stakeholders. Additionally, you will be responsible for supporting System Integration Testing (SIT) and User Acceptance Testing (UAT), managing change requests effectively, ensuring alignment with bank processes and standards, and delivering functional specifications to the development team.

To excel in this role, you should possess domain knowledge and technical skills, along with 6-8 years of IT working experience and banking domain/product knowledge. A graduate degree in computer science or a relevant field is required, and familiarity with tools such as Clarity, ADO, Axess, and SQL is beneficial. Strong communication and stakeholder management skills are essential, as is the ability to write complex SQL scripts. Knowledge of Base SAS is an advantage, and familiarity with Retail Banking and Wealth Lending data is ideal. You should be able to work effectively in a multi-cultural, cross-border, and matrix reporting environment, demonstrating knowledge management for MIS applications, business rules, mapping documents, data definitions, system functions, and processes.
With a background in business or data analysis roles, you should have a good understanding of data analytics, deep-dive capabilities, and excellent attention to detail and time management. This role offers the opportunity to become a go-to person for data across the bank globally, providing extensive exposure to all parts of the bank's business model. It serves as a solid foundation for a future career in the broader data space, preparing individuals for roles in analytics, business intelligence, and big data. Your work will contribute to driving commerce and prosperity through unique diversity, aligning with Standard Chartered Bank's purpose and brand promise to be here for good. If you are passionate about making a positive difference and are eager to work in a collaborative and inclusive environment, we encourage you to join our team at Standard Chartered Bank.
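As a concrete, hedged illustration of the profiling work described above (assessing quality, suitability, and cardinality), the query below shows the general shape of such a check; the table and column names are placeholders only:

```python
# Hypothetical cardinality/quality profile for one candidate key column.
# In practice this would run through whatever SQL client the bank uses.
PROFILE_SQL = """
SELECT COUNT(*)                                              AS row_cnt,
       COUNT(DISTINCT customer_id)                           AS distinct_keys,
       SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END)  AS null_keys
FROM   src_retail_accounts
"""
# distinct_keys equal to row_cnt suggests a 1:1 key suitable for mapping;
# null_keys > 0 flags records that cannot be mapped to the target model.
```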
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
The primary responsibility of the Lead Data Governance role is to oversee data accuracy and identify data gaps within critical data repositories such as the Data Lake, BIU Data Warehouse, and Regulatory data marts. This includes developing control structures to ensure data quality and lineage, conceptualizing and reviewing Data Quality functional and technical rules, and collaborating with various teams to address data gaps and ensure metadata management.

Furthermore, the Lead Data Governance is tasked with performing detailed data profiling, developing guidelines for data handling, implementing data strategies, and designing solution blueprints to meet current and future business needs. The role also involves utilizing contemporary techniques and dynamic visual displays to present analytical insights effectively and fostering innovation in problem-solving.

In addition to these primary responsibilities, the role involves stakeholder management, creating data remediation projects, understanding business requirements, leading by example, developing a Tableau reporting team, upskilling the team, and ensuring team productivity and quality deliverables. Key success metrics for this role include maintaining accurate and consistent data, conducting timely data quality checks, and ensuring no data quality issues in BIU data marts.

The ideal candidate for this position should hold a Bachelor's degree in a relevant field such as Bachelor of Science (B.Sc), Bachelor of Technology (B.Tech), or Bachelor of Computer Applications (BCA), along with a Master's degree such as Master of Science (M.Sc), Master of Technology (M.Tech), or Master of Computer Applications (MCA). Additionally, a minimum of 10 years of experience in data governance is required to excel in this role.
Posted 1 week ago
7.0 - 12.0 years
35 - 50 Lacs
Hyderabad
Work from Office
Job Description:
- Spark, Java
- Strong SQL writing skills; data discovery, data profiling, data exploration, and data wrangling skills
- Kafka, AWS S3, Lake Formation, Athena, Glue, Autosys or similar tools, FastAPI (secondary)
- Strong SQL skills to support data analysis and embedded business logic in SQL, data profiling, and gap assessment
- Collaborate with development and business SMEs within technology to understand data requirements, and perform data analysis to support and validate business logic, data integrity, and data quality rules within a centralized data platform
- Experience working within the banking/financial services industry, with a solid understanding of financial products and business processes
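As a hedged sketch of the profiling and gap-assessment work above, using PySpark (Spark being one of the listed technologies); the S3 path and column names are invented placeholders:

```python
# Minimal PySpark profiling job: row count, key cardinality, null check.
# Path and columns are hypothetical, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profile").getOrCreate()
trades = spark.read.parquet("s3://example-bucket/trades/")  # placeholder path

trades.select(
    F.count(F.lit(1)).alias("row_cnt"),
    F.countDistinct("trade_id").alias("distinct_trade_ids"),
    F.sum(F.col("notional").isNull().cast("int")).alias("null_notional"),
).show()
```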
Posted 1 week ago
4.0 - 6.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Role Description: As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Technical Skills:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining (a sketch follows this posting)
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- An understanding of, and alignment with, the company's long-term vision
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
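To make the filtering/aggregation/joining techniques above concrete, here is a minimal, self-contained Python ETL sketch; the file names and columns are hypothetical, chosen only for illustration:

```python
# Hedged extract-transform-load sketch using pandas.
import pandas as pd

def extract() -> tuple[pd.DataFrame, pd.DataFrame]:
    orders = pd.read_csv("orders.csv")        # placeholder source
    customers = pd.read_csv("customers.csv")  # placeholder source
    return orders, customers

def transform(orders: pd.DataFrame, customers: pd.DataFrame) -> pd.DataFrame:
    valid = orders[orders["amount"] > 0]                                   # filtering
    totals = valid.groupby("customer_id", as_index=False)["amount"].sum()  # aggregation
    return totals.merge(customers, on="customer_id", how="left")           # joining

def load(result: pd.DataFrame) -> None:
    result.to_parquet("customer_totals.parquet")  # placeholder target

if __name__ == "__main__":
    load(transform(*extract()))
```

A production job built in Informatica, Talend, or NiFi would express the same three stages as mappings or processors rather than code.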
Posted 1 week ago
4.0 - 6.0 years
6 - 10 Lacs
Chennai
Work from Office
As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- An understanding of, and alignment with, the company's long-term vision
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 1 week ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description: As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- An understanding of, and alignment with, the company's long-term vision
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
You will play a crucial role as a Data Modeler at Sun Life, where you will be responsible for designing and implementing new data structures to support project teams working on ETL, data warehouse design, managing the enterprise data model, data maintenance, and enterprise data integration approaches.

Your technical responsibilities will include building and maintaining data models to report disparate data sets reliably, consistently, and in an interpretable manner. You will gather, distil, and harmonize data requirements to design conceptual, logical, and physical data models, as well as develop source-to-target mappings with complex ETL transformations.

In this role, you will contribute to requirement analysis and database design, both in transactional and dimensional data modeling. You will work independently on data warehouse projects, collaborate with data consumers and suppliers to understand detailed requirements, and propose standardized data models. Additionally, you will help improve Data Management data models and facilitate discussions to understand business requirements and develop dimensional data models based on industry best practices.

To be successful in this position, you should have extensive practical experience in Information Technology and software development projects, with a minimum of 8 years of experience in designing operational data stores and data warehouses. Proficiency in data modeling tools such as Erwin or SAP PowerDesigner, a strong understanding of ETL and data warehouse concepts, and the ability to write complex SQL for data transformations and profiling are essential. Furthermore, you should possess a combination of solid business knowledge, technical expertise, excellent analytical and logical thinking, and strong communication skills.

It would be advantageous if you have an understanding of the Insurance domain, basic knowledge of AWS cloud services, experience with Master Data Management, Data Quality, Data Governance, and data visualization tools like SAS VA and Tableau. Familiarity with implementing and architecting data solutions using tools like Informatica, SQL Server, or Oracle is also beneficial.

Join Sun Life's Advanced Analytics team and embark on a rewarding journey where you can contribute to making a positive impact on individuals, families, and communities worldwide.
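As a hedged illustration of the dimensional modelling this role centres on, the DDL below sketches a minimal star schema for a hypothetical insurance premium fact; every name and data type is an assumption made for the example:

```python
# Illustrative star-schema DDL held as a string; dialect details vary by platform.
STAR_SCHEMA_DDL = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,    -- surrogate key
    customer_id   VARCHAR(20) NOT NULL,   -- natural key from the source system
    customer_name VARCHAR(100)
);

CREATE TABLE fact_policy_premium (
    policy_key    INTEGER,
    customer_key  INTEGER REFERENCES dim_customer (customer_key),
    date_key      INTEGER,                -- grain: one row per policy per day
    premium_amt   DECIMAL(15, 2)
);
"""
```

A source-to-target mapping document would then specify, column by column, which source attributes populate each of these fields and under what transformation rules.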
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
You have an exciting opportunity to join YASH Technologies as a Business Analysis Professional. With 7-10 years of experience in Data & Analytics projects, you will bring expertise in MDM data mappings, analysis, and configuration. Working closely with subject matter experts, you will understand functional requirements, lead the requirements process, and prepare data mapping sheets. The role requires strong analytical and troubleshooting skills, proficiency in data profiling, and the ability to understand data patterns.

In this position, you will need a solid grasp of data models, entity relationships, SQL, ETL, and data warehousing. Experience in Snowflake is a plus. Functional testing, publishing metrics, system testing, and UAT for data validation are key aspects of the role. Domain knowledge in Manufacturing, particularly in the BOM subject area, is preferred. Excellent communication skills, both written and verbal, are essential.

Your technical expertise should include technical writing, data modeling, data sampling, and experience in Agile Scrum development environments. Creating user stories and product backlogs, attending scrum events, and scheduling calls with business users to understand requirements are also part of the responsibilities. You will provide technical assistance to the development team, work closely with business stakeholders to gather requirements, and build strong relationships. The role calls for proven analytics skills, including data mining, evaluation, and visualization. Strong SQL or Excel skills are required, with an aptitude for learning other analytics tools. Defining and implementing data acquisition and integration logic, as well as analyzing data to answer key questions for stakeholders, are crucial components of the position.

At YASH Technologies, you will have the opportunity to create a fulfilling career in an inclusive team environment. The company offers career-oriented skilling models and continuous learning opportunities. Embracing a Hyperlearning workplace culture, YASH empowers employees through flexible work arrangements, emotional positivity, agile self-determination, transparency, and open collaboration. You will receive all the support needed to achieve business goals, along with stable employment and an ethical corporate culture.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
You have a fantastic opportunity to join YASH Technologies as a Business Analysis Professional. With 7-10 years of experience in Data & Analytics projects, you will bring expertise in MDM data mappings, analysis, and configuration. Your role will involve collaborating with subject matter experts to understand functional requirements and leading the preparation of data mapping sheets.

Your strong analytical and troubleshooting skills will be crucial as you delve into data profiling, understanding data patterns, and comprehending data models and entity relationships. Proficiency in SQL, ETL, and data warehousing is essential, and experience in Snowflake would be advantageous. Additionally, you will be involved in functional testing, system testing, and UAT for data validation. Domain knowledge in the Manufacturing area, particularly in the BOM subject area, would be beneficial.

Excellent communication and interpersonal skills are necessary as you engage in technical writing, data modeling, and data sampling. Experience in Agile Scrum development environments, creating user stories and product backlogs, and attending scrum events will be part of your responsibilities. You will play a key role in scheduling calls with business users, providing technical assistance to the development team, and collaborating with stakeholders to gather requirements. Proven analytics skills, including data mining, evaluation, and visualization, will be essential. Strong SQL or Excel skills are required, with a willingness to learn other analytics tools. Your responsibilities will also include defining and implementing data acquisition and integration logic, analyzing data to answer key questions, and developing and maintaining databases. Project roadmap management, scheduling, PMO updates, and conflict resolution are part of the role.

At YASH, you will have the opportunity to shape your career in an inclusive team environment that values continuous learning and growth. Our Hyperlearning workplace is built on flexibility, agility, trust, and support for achieving business goals, providing stable employment with a positive atmosphere and an ethical corporate culture. Join us at YASH Technologies and be part of a team that drives positive changes in a virtual world.
Posted 1 week ago
4.0 - 6.0 years
7 - 12 Lacs
Hyderabad
Work from Office
As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- An understanding of, and alignment with, the company's long-term vision
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 2 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Chennai
Work from Office
As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- An understanding of, and alignment with, the company's long-term vision
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 2 weeks ago
8.0 - 15.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Key Roles and Responsibilities

1. Stakeholder Collaboration & Business Engagement
- Participate in data quality implementation discussions with business stakeholders.
- Engage in architecture discussions around the Data Quality framework.
- Collaborate with O2C, Tax, Trading Partner Finance, and Logistics teams to gather business rules.
- Provide business requirements to BODS/LSMW teams to support data loads.
- Review and track exceptions related to data quality rules; incorporate logic into tools.
- Capture sign-offs from key stakeholders at critical project milestones.
- Identify and drive continuous improvements in data standards, data quality rules, and maintenance processes in collaboration with business and MDM teams.

2. Data Quality Rules & Standards Definition
- Define and document data quality rules based on business and compliance needs.
- Translate business rules into technical definitions to support tool development.
- Support the development of rules in data quality (DQ) tools.
- Validate rule outputs and communicate findings to business stakeholders.

3. Data Profiling, Cleansing & Monitoring
- Create and maintain data quality dashboards and reports.
- Review customer master data profiling results and suggest cleansing opportunities.
- Support the design and execution of a Data Quality Framework.
- Conduct root cause analysis and suggest process/system improvements.
- Execute pre- and post-validation checks as part of cleansing efforts (see the sketch after this posting).

4. Customer Master Data Management (SAP)
- Manage and maintain customer master data directly in SAP where BODS cannot be used.
- Ensure accuracy, consistency, and completeness across customer records.
- Lead initiatives for identifying obsolete records and define deactivation criteria.

5. Data Migration, Integration & Tool Support
- Collaborate with IT teams on data migration, cleansing, and validation activities during SAP projects or enhancements.
- Analyze business processes and translate them into functional specifications to enhance master data processes.
- Recommend system/process enhancements to improve data quality and governance.

Skills Required

Core Experience
- 10+ years of experience with the SAP Customer Master module.
- Strong knowledge of SAP O2C (Order to Cash) T-codes and fields.
- Extensive hands-on experience in creating, maintaining, and validating customer master data.

Domain Knowledge
- Familiarity with industry-specific customer processes, including Credit Management, Intercompany Trading, Trading Partner Finance, and Tax Configuration.
- In-depth understanding of the customer master lifecycle: creation, change, extension, obsolescence, deletion.

Data Quality & Governance
- Strong skills in defining, documenting, and enforcing data quality rules.
- Experience in data profiling, cleansing, and standardization.
- Ability to perform root cause analysis and define remediation strategies.
- Proficiency in documenting data standards and definitions.

Tools & Technologies
- SAP ECC / S/4HANA
- SAP SD / O2C
- SAP BODS (BusinessObjects Data Services)
- SAP LSMW: recording, field mapping, batch input
- SAP SE16N, SQVI, and custom reporting
- Excel, Power BI, SAP IS, Tableau
- Experience with data quality tools (DQ dashboards, exception reporting)

Key Skills: SAP SD Consultant with 8 years' experience, master data, configuration, migration, ETL
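As a hedged sketch of the pre-/post-validation checks listed above, applied to a hypothetical customer-master extract (the field names follow common SAP KNA1 conventions, but the rules themselves are invented for illustration):

```python
# Minimal completeness/duplicate check for a customer-master extract.
import pandas as pd

REQUIRED = ["KUNNR", "NAME1", "LAND1"]  # customer number, name, country key

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Return rows failing the rules, for the exception tracker."""
    missing_required = df[df[REQUIRED].isna().any(axis=1)]
    dup_keys = df[df.duplicated("KUNNR", keep=False)]
    return pd.concat([missing_required, dup_keys]).drop_duplicates()
```

Run before a BODS/LSMW load, this flags records to remediate; run again afterwards, it confirms the load introduced no new exceptions.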
Posted 2 weeks ago
8.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Role Purpose

The purpose of the role is to define and develop the Enterprise Data Structure along with Data Warehouse, Master Data, Integration and transaction processing, while maintaining and strengthening modelling standards and business information.

Do

1. Define and develop Data Architecture that aids organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view to quickly visualize what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by the clients
d. Engage all the stakeholders to implement data governance models and ensure that the implementation is done based on every change request
e. Ensure that the data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights for database storage and platform for ease of use and least manual work
i. Collaborate with vendors to ensure integrity, objectives and system configuration
j. Collaborate with functional & technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present data repository, objects and source systems along with data scenarios for front end and back end usage
l. Define high-level data migration plans to transition the data from source to target system/application, addressing the gaps between the current and future state, typically in sync with the IT budgeting or other capital planning processes
m. Maintain knowledge of all the data service provider platforms and ensure an end to end view
n. Oversee all the data standards/references/papers for proper governance
o. Promote, guard and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all the databases, including systems and shared infrastructure services, in order to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership to the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root cause problem in integrating business and product units
vii. Validate the solution/prototype from a technology, cost structure and customer differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs

2. Build the enterprise technology environment for data architecture management
a. Develop, maintain and implement standard patterns for data layers, data stores, data hub & lake and data management processes
b. Evaluate all the implemented systems to determine their viability in terms of cost effectiveness
c. Collect all the structural and non-structural data from different places and integrate all the data in one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions and reports
e. Build the enterprise conceptual and logical data models for analytics, operational and data mart structures in accordance with industry best practices
f. Implement the best security practices across all the databases based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical and physical database architectures, design patterns, best practices and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up and recovery specifications
c. Develop and establish relevant technical, business process and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurred errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process and structural risks and prepare a risk mitigation plan for all the projects
h. Ensure quality assurance of all the architecture and design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration team achieve better efficiency and client experience for ease of use by using AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all the projects
m. Ensure optimal client engagement:
i. Support the pre-sales team while presenting the entire solution design and its principles to the client
ii. Negotiate, manage and coordinate with the client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: Data Governance
Experience: 8-10 Years
Posted 2 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Role Purpose

The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters.

Must-have technical skills:
- 4+ years on Snowflake: advanced SQL expertise
- 4+ years of data warehouse experience: hands-on knowledge of the methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schema, normalization/denormalization, dimensions, aggregations, etc.
- 4+ years working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, etc.
- 3+ years on Python: advanced Python expertise
- 3+ years on any cloud platform (AWS preferred): hands-on experience on AWS with Lambda, S3, SNS/SQS and EC2 is the bare minimum
- 3+ years on any ETL/ELT tool (Informatica, Pentaho, Fivetran, DBT, etc.)
- 3+ years developing functional metrics in any specific business vertical (finance, retail, telecom, etc.)

Must-have soft skills:
- Clear communication: written and verbal communication, especially around time off, delays in delivery, etc.
- Team player: works in the team and works with the team
- Enterprise experience: understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation, etc.

Nice to have:
- Technical certifications from AWS, Microsoft, Azure, GCP or any other recognized software vendor
- 4+ years on any ETL/ELT tool (Informatica, Pentaho, Fivetran, DBT, etc.)
- 4+ years developing functional metrics in any specific business vertical (finance, retail, telecom, etc.)
- 4+ years of team lead experience
- 3+ years in a large-scale support organization supporting thousands of users
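For illustration, a minimal sketch of the Snowflake-plus-Python combination above; the connection parameters, table, and column are placeholders, and credentials are assumed to come from the environment:

```python
# Hedged sketch using the official Snowflake connector
# (pip install snowflake-connector-python).
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SF_ACCOUNT"],
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    warehouse="ANALYTICS_WH",   # placeholder warehouse
    database="DEMO_DB",         # placeholder database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # One simple profiling metric of the kind a reporting environment tracks.
    cur.execute("SELECT COUNT(*), COUNT(DISTINCT order_id) FROM orders")
    row_cnt, distinct_ids = cur.fetchone()
    print(f"rows={row_cnt}, distinct order ids={distinct_ids}")
finally:
    conn.close()
```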
Posted 2 weeks ago
5.0 - 10.0 years
10 - 16 Lacs
Hyderabad
Remote
Job description

As an ETL Developer for the Data and Analytics team at Guidewire, you will participate and collaborate with our customers and SI partners who are adopting the Guidewire Data Platform as the centerpiece of their data foundation. You will facilitate, and be an active developer when necessary, to operationalize the realization of the agreed-upon ETL architecture goals of our customers, adhering to Guidewire best practices and standards. You will work with our customers, partners, and other Guidewire team members to deliver successful data transformation initiatives, utilizing best practices for the design, development, and delivery of customer projects. You will share knowledge with the wider Guidewire Data and Analytics team to enable predictable project outcomes and emerge as a leader in our thriving data practice. One of our principles is to have fun while we deliver, so this role will need to keep the delivery process fun and engaging for the team in collaboration with the broader organization.

Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly skilled technical problem solvers who are self-motivated and take proactive actions for the benefit of our customers, ensuring that they succeed in their journey to the Guidewire Cloud Platform. You will collaborate closely with teams located around the world and adhere to our core values: Integrity, Collegiality, and Rationality.

Key Responsibilities:
- Build out technical processes from specifications provided in High Level Design and data specification documents.
- Integrate test and validation processes and methods into every step of the development process.
- Work with Lead Architects and provide input into defining user stories, scope, acceptance criteria and estimates.
- Bring a systematic problem-solving approach, coupled with a sense of ownership and drive.
- Work independently in a fast-paced Agile environment.
- Actively contribute to the knowledge base from every project you are assigned to.

Qualifications:
- Bachelor's or Master's degree in Computer Science, or an equivalent level of demonstrable professional competency, and 3-5+ years in a technical capacity building out complex ETL data integration frameworks.
- 3+ years of experience with data processing and ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) concepts.
- Experience with ADF or AWS Glue, Spark/Scala, GDP, CDC, and ETL data integration.
- Experience working with relational and/or NoSQL databases.
- Experience working with different cloud platforms (such as AWS, Azure, Snowflake, Google Cloud, etc.).
- Ability to work independently and within a team.

Nice to have:
- Insurance industry experience
- Experience with ADF or AWS Glue
- Experience with Azure Data Factory, Spark/Scala
- Experience with the Guidewire Data Platform
Posted 2 weeks ago
5.0 - 10.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Responsibilities:
- Development of workflows and connectors for the Collibra Platform
- Administration and configuration of the Collibra Platform

Duties:
- Collibra DGC administration and configuration
- Collibra Connect administration and configuration
- Development of Collibra workflows and MuleSoft connectors
- Ingesting metadata from external sources into Collibra
- Installation, upgrading and administration of Collibra components
- Setup, support, deployment & migration of Collibra components
- Implement application changes: review and deploy code packages, perform post-implementation verifications
- Participate in group meetings (including business partners) for problem solving, decision making and implementation planning

Senior Collibra Developer - Mandatory Skills

MUST HAVE SKILLS:
- Collibra Connect
- Collibra DGC
- Java
- Advanced hands-on working knowledge of Unix/Linux
- Advanced hands-on experience with UNIX scripting
- SQL Server
- Groovy

Nice to have:
- Knowledge of and interest in data governance and/or metadata management
- Working knowledge of Jira would be an asset
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
The Customer Excellence Advisory Lead (CEAL) is focused on empowering customers to maximize the potential of their data through top-tier architectural guidance and design. Within the Oracle Analytics Service Excellence organization, the CEAL team consists of Solution Architects specializing in Oracle Analytics Cloud, Oracle Analytics Server, and Fusion Data Intelligence. The primary objective is to facilitate the successful adoption of Oracle Analytics by engaging with customers and partners globally, fostering trust in Oracle Analytics, collaborating with Product Management to enhance product offerings, and sharing insights through various channels such as blogs, webinars, and demonstrations. The ideal candidate will work closely with strategic FDI customers and partners to guide them towards optimized implementations and develop go-live plans that prioritize achieving high usage. This role is at Career Level - IC4.

Responsibilities:
- Proactively identify customer requirements and unaddressed needs across different customer segments, proposing potential solutions.
- Assist in developing complex product and program strategies based on customer interactions, and effectively execute scalable solutions and projects for customers in diverse enterprise environments.
- Collaborate with customers and internal stakeholders to communicate strategies, synchronize solution implementation timelines, provide updates, and adjust plans as objectives evolve.
- Prepare for and address complex product or solution-related inquiries or challenges from customers.
- Gather and communicate detailed product insights driven by customer needs and requirements.
- Promote understanding of customer complexities and the value propositions of various programs to key internal stakeholders through various channels such as events, team meetings, and product reviews.

Primary Skills:
- Minimum 4 years of experience with OBIA and Oracle Analytics.
- Strong expertise in Analytics RPD design, development, and deployment.
- Solid understanding of BI/data warehouse analysis, design, development, and testing.
- Extensive experience in data analysis, data profiling, data quality, data modeling, and data integration.
- Proficient in crafting complex queries and stored procedures using Oracle SQL and Oracle PL/SQL.
- Skilled in developing visualizations and user-friendly workbooks.
- Previous experience in developing solutions incorporating AI and ML using Analytics.
- Experience in enhancing report performance.

Desirable Skills:
- Experience with Fusion Applications (ERP/HCM/SCM/CX).
- Ability to design and develop ETL interfaces, packages, load plans, user functions, variables, and sequences in ODI to support both batch and real-time data integrations.
- Experience working with multiple cloud platforms.
- Certification in FDI, OAC, and ADW.

Qualifications: Career Level - IC4

About Us: Oracle, a world leader in cloud solutions, leverages cutting-edge technology to address current challenges. With over 40 years of experience, Oracle partners with industry leaders across various sectors, operating with integrity and embracing change. Oracle is committed to fostering an inclusive workforce that provides opportunities for all individuals to contribute to true innovation. Employees enjoy global opportunities with a focus on work-life balance, competitive benefits, flexible medical, life insurance, and retirement options. Volunteering in communities is encouraged through volunteer programs.
We are dedicated to facilitating the inclusion of individuals with disabilities in all stages of the employment process. If you require accessibility assistance or accommodation for a disability, please contact us at accommodation-request_mb@oracle.com or call +1 888 404 2494 in the United States.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Quality Integration Engineer, your primary responsibility will be to incorporate data quality capabilities into enterprise data landscapes. You will play a crucial role in integrating advanced data quality tools like Ataccama and Collibra with cloud data platforms such as Snowflake and SQL databases. Your role is essential in ensuring that data governance standards are met through robust, scalable, and automated data quality processes.

In this role, you will need to develop scalable applications using appropriate technical options and optimize application development, maintenance, and performance. You will be required to implement integration of data quality tools with Snowflake and SQL-based platforms, develop automated pipelines and connectors for data profiling, cleansing, monitoring, and validation, and configure data quality rules aligned with governance policies and KPIs. Troubleshooting integration issues, monitoring performance, and collaborating with various teams to align solutions with business needs will also be part of your responsibilities.

You will need to adhere to coding standards, perform peer reviews, write optimized code, create and review design documents, templates, test cases, and checklists, and develop and review unit and integration test cases. Additionally, you will estimate efforts for project deliverables, track timelines, perform defect RCA and trend analysis, propose quality improvements, and mentor team members while managing aspirations and keeping the team engaged.

To excel in this role, you must have strong experience with data quality tools like Ataccama and Collibra, hands-on experience with Snowflake and SQL databases, proficiency in SQL scripting and data pipeline development (preferably Python or Scala), and a sound understanding of data profiling, cleansing, enrichment, and monitoring. Knowledge of REST APIs, metadata integration techniques, and cloud platforms like AWS and Azure would be advantageous.

Furthermore, soft skills such as strong analytical and problem-solving abilities, effective communication and presentation skills, and the ability to manage high-pressure environments and multiple priorities are essential. Certification in Ataccama, Collibra, Snowflake, AWS, or Azure, along with domain knowledge in enterprise data architecture and the financial services, insurance, or asset management domains, would be beneficial for this role.
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
As a Senior Data Modeller, you will be responsible for leading the design and development of conceptual, logical, and physical data models for enterprise and application-level databases. Your expertise in data modeling, data warehousing, and data governance, particularly in cloud environments, Databricks, and Unity Catalog, will be crucial for the role. You should have a deep understanding of business processes related to master data management in a B2B environment and experience with data governance and data quality concepts.

Your key responsibilities will include designing and developing data models, translating business requirements into structured data models, defining and maintaining data standards, collaborating with cross-functional teams to implement models, analyzing existing data systems for optimization, creating entity relationship diagrams and data flow diagrams, supporting data governance initiatives, and ensuring compliance with organizational data policies and security requirements.

To be successful in this role, you should have at least 12 years of experience in data modeling, data warehousing, and data governance. Strong familiarity with Databricks, Unity Catalog, and cloud environments (preferably Azure) is essential. Additionally, you should possess a background in data normalization, denormalization, dimensional modeling, and schema design, along with hands-on experience with data modeling tools like ERwin. Experience in Agile or Scrum environments, proficiency in integration, databases, data warehouses, and data processing, as well as a track record of successfully selling data and analytics software to enterprise customers, are key requirements.

Your technical expertise should cover Big Data, streaming platforms, Databricks, Snowflake, Redshift, Spark, Kafka, SQL Server, PostgreSQL, and modern BI tools. Your ability to design and scale data pipelines and architectures in complex environments, along with excellent soft skills including leadership, client communication, and stakeholder management, will be valuable assets in this role.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
punjab
On-site
The Technical Business Analyst role in Sydney/Melbourne/Brisbane requires a minimum of 7 years of experience in a business/system analyst role within the Banking domain; candidates with Risk, Compliance & Legal experience are also strongly encouraged to apply. Project lifecycle experience is mandatory, as is experience in finance projects with an understanding of the related functional aspects. The ideal candidate has a track record of working in a fast-paced, collaborative, cross-functional environment and is proficient at working with distributed teams.

Key skills required for this role include the ability to analyze customer requirements and provide innovative solutions, strong internal consultation and stakeholder management skills, excellent presentation and communication skills, advanced problem-solving and troubleshooting skills, and strong SQL, data analysis, and data profiling skills. Additionally, familiarity with Agile development practices, business process modelling, data modelling, and data warehouse concepts, along with knowledge of planning/forecasting and data reporting visualization platforms, is desired.

The objectives of the role include working collaboratively with teams and stakeholders to obtain, analyze, communicate, and validate requirements for projects. The role involves supporting the process of business context and requirements gathering, defining MVPs, facilitating elaboration sessions, undertaking technical analysis, documenting business logic, mapping, governance, and success criteria, and assisting in estimating for the delivery phase.

Key accountabilities include supporting the development and testing team in understanding requirements, contributing to knowledge transfer during and after project delivery, identifying improvement opportunities, maintaining an ongoing dialogue with stakeholders, promoting continuous improvement, and demonstrating effective customer service. Overall, the Technical Business Analyst will play a crucial role in ensuring that projects align with Suncorp's business objectives and deliver consistently high-quality results.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Job Description:
- At least 5+ years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies such as ADB, ADF, SQL (capable of writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog
- Hands-on in Python, PySpark or Spark SQL
- Hands-on in Azure Analytics and DevOps
- Taking part in Proofs of Concept (POCs) and pilot solution preparation
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows (a sketch follows this posting)
- Experience in business process mapping of data and analytics solutions
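As a hedged sketch of the ADB/PySpark/Delta work listed above (the storage path, key column, and table names are invented, and a Databricks runtime with Delta support is assumed):

```python
# Minimal bronze-layer ingestion: deduplicate and append to a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/events/")
clean = (raw.dropDuplicates(["event_id"])            # placeholder key column
            .withColumn("ingest_date", F.current_date()))

(clean.write.format("delta")
      .mode("append")
      .partitionBy("ingest_date")
      .saveAsTable("bronze.events"))                 # placeholder table
```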
Posted 2 weeks ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Job Description: Senior/Azure Data Engineer
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai
- Overall 8+ years of experience
- At least 5+ years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies such as ADB, ADF, SQL (capable of writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog
- Hands-on in Python, PySpark or Spark SQL
- Hands-on in Azure Analytics and DevOps
- Taking part in Proofs of Concept (POCs) and pilot solution preparation
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows
- Experience in business process mapping of data and analytics solutions
Posted 2 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

This demand is for Ataccama. JD as under:
- Data engineering skills: knowledge of data integration, data warehousing, and data lake technologies.
- Data quality and governance skills: experience with data quality tools, data governance frameworks, and data profiling techniques.
- Programming skills: proficiency in languages like Java, Python, or SQL, depending on the specific role.
- Cloud computing skills: experience with cloud platforms like AWS, Azure, or Google Cloud Platform.
- Problem-solving skills: ability to troubleshoot data issues and identify solutions.

As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. You will collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead data governance strategy implementation
- Develop and maintain data governance frameworks
- Conduct data quality assessments

Professional & Technical Skills:
- Strong understanding of data governance principles
- Experience in implementing data governance solutions
- Knowledge of data privacy regulations
- Familiarity with data quality management practices

Additional Information:
- The candidate should have a minimum of 5+ years of experience in Ataccama data governance.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 2 weeks ago