38 Transform Jobs - Page 2

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

10.0 - 16.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a SAP Data Migration Consultant at Syniti, you will play a crucial role in SAP implementation projects by managing data migration activities: data analysis, reporting, conversion, harmonization, and business-process analysis using SAP and other enterprise data migration tools. To excel in this role, you must have a strong SAP background and expertise in specific business-process areas. You will be actively involved in data migration activities for a specific process thread, engaging with client Subject Matter Experts (SMEs) and business-process experts. Familiarity with the onsite-offshore delivery model is essential. The physical demands of this role are limited to office routines, with occasional travel to various locations across regions.

Qualifications:
- 11-16 years of SAP techno-functional or functional experience, including 3+ full SAP implementation lifecycles
- Expertise in business processes related to SAP functional modules such as FI, CO, MM, SD, PM, PP, PS
- Over 10 years of experience in IT projects
- Proficiency in BackOffice CranSoft/DSP/SAP Data Services or other data migration tools
- Extensive experience in data quality, data migration, data warehousing, data analysis, and conversion planning
- 5 to 7 years of business-process experience
- Bachelor's degree in Business, Engineering, Computer Science, or a related discipline, or equivalent experience
- Proficiency in Microsoft SQL, including SQL query skills and an understanding of relational databases

Job Responsibilities:
- Conduct expert-level business analysis on SAP modules such as FI, CO, MM, SD, PM, PP, PS
- Lead and guide the team based on project requirements, ensuring client needs are met
- Communicate effectively with onsite teams and client personnel
- Facilitate blueprint sessions with onsite/client teams
- Develop and maintain the SAP data migration plan, integration plan, and cutover plan
- Perform SAP data extraction, transformation, and loading
- Implement change management and defect management processes
- Document all relevant activities
- Train new team members on SAP migration toolsets

If you are looking to leverage your SAP expertise on impactful data migration projects, this role at Syniti offers a dynamic opportunity to excel in a collaborative and innovative environment.
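For context on the data-quality and conversion-planning side of this role: before loading legacy data into SAP, extracts are typically profiled for gaps and duplicates. A minimal sketch in Python/pandas, with an invented file name (MATNR, MAKTX, and MEINS are standard SAP material-master fields; Syniti's own DSP tooling is not shown):

```python
import pandas as pd

# Hypothetical legacy material-master extract staged as CSV.
df = pd.read_csv("legacy_materials.csv", dtype=str)

# Required fields must be populated before conversion to SAP.
required = ["MATNR", "MAKTX", "MEINS"]
missing = df[df[required].isna().any(axis=1)]

# Business keys must be unique; duplicates need harmonization.
dupes = df[df.duplicated(subset="MATNR", keep=False)]

print(f"{len(missing)} rows with missing required fields")
print(f"{len(dupes)} rows sharing a duplicate MATNR")
```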

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

West Bengal

On-site

About CTM
CTM is a global award-winning provider of innovative and cost-effective travel management solutions to the corporate, events, leisure, and loyalty travel markets. With offices in New Zealand, Australia, Asia, North America, and Europe, CTM has over 3,000 employees dedicated to providing personalized service excellence with client-facing technology solutions. Our team embodies collaboration, innovation, and a future-focused mindset, always working in alignment with our core values: Connect, Deliver, and Evolve.

About the Role
As Manager - Data Warehouse at CTM, you will oversee the strategic planning, implementation, and management of data warehouse initiatives. Your key focus will be developing a comprehensive global data warehouse framework for CTM: catering to stakeholder needs, integrating data from multiple sources, applying advanced analytics techniques, and ensuring compliance with data privacy regulations. You will play a crucial role in empowering internal customers by providing support, training, and resources, and by fostering a culture of continuous feedback and improvement. This remote role can be located anywhere in Australia and reports to the Director - Global Business Intelligence.

Knowledge, Skills, and Experiences
- Technical Expertise: Strong understanding of data warehousing, ETL processes, data modeling, data visualization, and advanced analytics techniques.
- Customer Service: Adapt to evolving customer needs, collect actionable feedback, and deliver high-quality, consistent customer service throughout the customer lifecycle.
- Leadership Skills: Lead and inspire a team, facilitate effective communication, promote team building, and resolve conflicts.
- Business Acumen: Understand CTM's goals, objectives, and KPIs, and translate business requirements into data solutions.
- Strategic Thinking: Develop a long-term vision for the data warehouse function aligned with CTM's overall strategy, identify opportunities for innovation, and stay current on emerging trends.
- Project Management: Manage DW projects from inception to delivery, including scope definition, resource allocation, and stakeholder engagement.
- Continuous Learning: Stay abreast of the latest trends in data warehousing and analytics through personal research and professional development.
- Collaboration Skills: Work effectively with cross-functional teams to align DW initiatives with CTM's goals.
- Problem-Solving Abilities: Identify business challenges, analyze complex data sets, and derive actionable insights to drive strategic decision-making.
- Communication Skills: Communicate technical concepts and insights clearly to non-technical stakeholders.

Why CTM
Join CTM to be part of a supportive and sustainable work environment that prioritizes your career development and wellbeing. Employee benefits include travel discounts, lifestyle perks, training opportunities, volunteer days, wellness initiatives, and flexible work arrangements.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be part of the People, Culture & Communications team at bp, focusing on modernizing and simplifying HR processes globally. As a People Data Specialist, you will provide guidance and information to employees, managers, and HR on sophisticated employee processes and data changes within the Workday system.

Your key responsibilities will include coordinating and managing employee data changes, such as work schedules, locations, and compensation. You will handle transactional corrections, understand workflows, ensure data integrity, and adhere to global data management processes and procedures. The role also involves reviewing and approving steps to align with policies, delivering record and document image management services, and supporting ad hoc projects as required. Collaborating with other regions to identify continuous service improvements, participating in acceptance testing for technology changes, and consistently enhancing self-awareness are also essential aspects of this role.

To excel in this position, you must hold a bachelor's or master's degree and have 3-5+ years of experience in HR Shared Services, preferably with Workday system experience. Proficiency in CRM systems and MS Office is crucial, along with organizational skills, judgment, communication, customer service, and teamwork. Agility, analytical thinking, creativity, decision-making, and information security awareness will be key to success.

This full-time role has shift timings from 12:30 to 9:30 PM IST, with a hybrid office/remote setup in Pune. The position does not entail significant travel but may require occasional relocation within the country. Please note that adherence to local policies, including background checks and medical reviews, may be necessary for employment. If you require accommodations during the application process or while performing crucial job functions, please contact us. Your commitment to data privacy, integrity, risk management, and high ethical standards will be highly valued in this role.
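For context on the data-integrity aspect of work like this: effective-dated HR change records are commonly validated with systematic checks before approval. A purely illustrative Python/pandas sketch with invented column names (this is not Workday's API, only the general pattern):

```python
import pandas as pd

# Hypothetical extract of effective-dated work-schedule changes.
changes = pd.DataFrame({
    "employee_id": ["E01", "E01", "E02"],
    "effective_date": pd.to_datetime(["2024-01-01", "2023-06-01", "2024-03-15"]),
    "weekly_hours": [40, 32, None],
})

# Check 1: every change must actually carry the field being changed.
incomplete = changes[changes["weekly_hours"].isna()]

# Check 2: review changes per employee in effective-date order,
# so corrections are applied against the right prior record.
ordered = changes.sort_values(["employee_id", "effective_date"])

print(incomplete)
print(ordered)
```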

Posted 1 month ago

Apply

7.0 - 10.0 years

11 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & Responsibilities
- Experience designing and implementing solutions within ServiceNow is required.
- Good knowledge of ServiceNow core concepts, including its out-of-the-box (OOTB) functions and customisation capabilities.
- Working knowledge of ServiceNow ITSM preferred.
- Experience developing within a customised ServiceNow instance.
- 7-8 years of ServiceNow developer experience, including APIs, Access Control, client- and server-side scripting, import sets and transform maps/scripts, Service Portal (including widget development), notifications, Employee Center, Agent Workspace, system upgrades, and system clones.
- Good working knowledge of JavaScript.
- Experience writing technical documents is essential.
- Understanding of Agile methodologies and working practices.
- Able to design, discuss, question, and document system solutions.
- Experience with system testing, focusing on end-user testing and traceability between requirements and test cases.
- Able to work under pressure and demonstrate initiative, enthusiasm, and rapid learning.
- Proven results-driven approach with the ability to take initiative, handle multiple tasks and shifting priorities, and meet deadlines.
- Experience forming and maintaining network relationships, with solid partner/stakeholder interaction skills.
- Strong spoken and written communication skills, with experience adapting style and approach to the audience and message.
- ServiceNow certification preferred.

Preferred Candidate Profile
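For context on the API skills mentioned above: ServiceNow development is normally done in JavaScript inside the platform, but its standard REST Table API is a common integration touchpoint. A hedged Python sketch against that API (the instance URL and credentials are placeholders):

```python
import requests

# Placeholder instance and credentials; real deployments use OAuth.
INSTANCE = "https://dev00000.service-now.com"
AUTH = ("admin", "password")

# Query active incidents via the standard Table API.
resp = requests.get(
    f"{INSTANCE}/api/now/table/incident",
    params={"sysparm_query": "active=true", "sysparm_limit": 5},
    auth=AUTH,
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
for record in resp.json()["result"]:
    print(record["number"], record["short_description"])
```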

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a full-time Business Analyst at FIS, you will define requirements for product integrations built with the Adeptia ETL tool. These integrations may connect external vendors with FIS Capital Markets' solutions, or link multiple FIS products. You will collaborate with engineering groups on design changes, the product management team on requirements gathering, the professional services team on customization and implementation, and the customer experience team on product support.

You will be part of the Digital Integration Hub team within FIS's Capital Markets division, which focuses on adding business value for FIS clients through seamless integration of solutions. The team's core focus is automation, efficiency, a standardized technology stack, and integration.

Your responsibilities will include understanding financial products, refining connector requirements based on stakeholder input, defining requirements and modifications with product managers and users, participating in software design meetings, and working closely with internal Client Training, Client Relationship, and Sales teams.

To excel in this role, you should have experience in an agile/scrum environment, strong knowledge of financial markets, and basic technical skills such as SQL, JavaScript, Excel macros, and basic programming. The ability to analyze, design, and modify connectors, and to work effectively with product managers and users to define system requirements, is essential. Familiarity with solution design, requirements definition disciplines, writing user stories, completing documentation, and training internal teams is also crucial. Knowledge of the financial services industry, including capital markets, private equity, and fund accounting, is a requirement. Knowledge of the Adeptia ETL tool, experience with a SQL database engine, and proficiency in Excel would be added advantages.

At FIS, we offer more than just a job: an opportunity to shape the future of fintech. You will have a voice in the industry, continuous learning and development opportunities, a collaborative work environment, chances to give back, and a competitive salary and benefits package. FIS is dedicated to safeguarding the privacy and security of all personal information processed to provide services to clients. Our recruitment model primarily involves direct sourcing, and we do not accept resumes from recruitment agencies not on our preferred supplier list. We are committed to providing a fair and transparent recruitment process without any related fees for applicants or employees.
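For context on the "basic SQL" expectation: a business analyst verifying connector output typically joins and filters data by hand. A self-contained illustration using Python's built-in sqlite3 module with invented tables (the actual FIS schemas are not public):

```python
import sqlite3

# In-memory database with two invented tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades(id INTEGER, fund TEXT, amount REAL);
    CREATE TABLE funds(name TEXT, region TEXT);
    INSERT INTO trades VALUES (1, 'Alpha', 1000.0), (2, 'Beta', 250.0);
    INSERT INTO funds VALUES ('Alpha', 'EMEA'), ('Beta', 'APAC');
""")

# The kind of join-and-filter query used to reconcile a
# connector's output against its source system.
rows = conn.execute("""
    SELECT t.id, t.fund, f.region, t.amount
    FROM trades t JOIN funds f ON f.name = t.fund
    WHERE t.amount > 500
""").fetchall()
print(rows)
```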

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a valued member of our team at FIS, you will play a crucial role as a Business Analyst defining requirements for product integrations built with the Adeptia ETL tool. These integrations may connect external vendors with FIS Capital Markets' solutions, or link multiple FIS products. Using the low-code/no-code Adeptia tool, you will collaborate with engineering groups, product management, professional services, and customer experience teams to ensure successful design changes, customization, and implementation.

You will be an integral part of the Digital Integration Hub team within FIS's Capital Markets division, which focuses on enhancing business value for FIS clients through seamless integration of solutions. This team prioritizes automation, efficiency, a standardized technology stack, and smooth integration processes.

Your responsibilities will include understanding financial products to refine connector requirements, engaging with product managers and users to define requirements and modifications, participating in software design meetings, and collaborating with internal client training, client relationship, and sales teams to support the product.

To excel in this role, you should have experience in an agile/scrum environment, strong financial markets exposure, and basic technical skills such as SQL, JavaScript, Excel macros, and basic programming. You must be able to analyze, design, and modify connectors, write user stories based on business requirements, complete documentation for installation and maintenance, and communicate effectively with internal teams. Familiarity with the Adeptia ETL tool, experience with a SQL database engine, and proficiency in Excel would be added advantages. Knowledge of the financial services industry, including capital markets, private equity, and fund accounting, is highly desirable.

At FIS, we provide a platform to shape the future of fintech, offering continuous learning and development opportunities, a collaborative work environment, avenues for giving back, and competitive salary and benefits. Join us at FIS and be part of the exciting journey to transform fintech.

Privacy Statement: FIS is dedicated to safeguarding the privacy and security of all personal information processed to deliver services to our clients. For detailed information on how FIS protects personal information online, please refer to the Online Privacy Notice.

Sourcing Model: Recruitment at FIS primarily operates on a direct sourcing model, with a small portion of hiring through recruitment agencies. FIS does not accept resumes from agencies not on the preferred supplier list and disclaims responsibility for any fees related to resumes submitted through job postings or any part of the company.

Posted 2 months ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for supporting the global community of practice within the Finance entity. This includes establishing and managing the backlog of service improvements, collaborating with the Global Experience Owner to propose measurement standards, and developing capacity reporting for service throughput. You will coordinate plans to measure and enhance service levels, track global standard operating procedures, and serve as the point of contact for improving or aligning local operating procedures.

On projects, you will work with the Service Delivery Manager to determine the best approach and management structures for service-enhancing projects. This involves tracking and monitoring projects, managing the change plan within the service area, and overseeing internal and external resources as needed. You will need a collaborative, performance-driven mindset, the ability to identify project risks and issues, and the skill to present project status reports.

To be successful in this role, you should have a degree or professional qualification in a relevant field, or equivalent experience. You should have at least 4 years of experience in P&C services and systems, business analysis, process development and documentation, and supporting P&C business, plus 2+ years of project management experience across both agile and waterfall methodologies. Proficiency in Microsoft Office, particularly Excel, is required, and you should actively develop capabilities aligned with the P&C Capability Framework.

This role does not require significant travel, is eligible for relocation within the country, and is a hybrid position combining office and remote work.

Key skills for this role include agility core practices, analytical thinking, collaboration, communication, creativity, customer service excellence, data management, decision-making, strategic implementation, and project management. You should also be adept at managing change, engaging stakeholders, and using measurement and metrics effectively.

Posted 2 months ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As an experienced SAP TM Consultant with 7 to 10 years of relevant experience, you will provide expertise in the SAP TM module, its functionality, and the business processes it supports. Your role will involve integrating business processes with SAP master data and ensuring seamless integration between SAP modules and subscribing systems. You will guide the business on appropriate use of the SAP system and review, analyze, and evaluate existing business processes within SAP TM functionality.

Your responsibilities will include configuring and maintaining the relevant SAP TM components and supporting business users through end-user training. You will also handle the extraction, transformation, and cleansing of SAP TM data objects. Creating and managing SAP TM project tracks for implementing or deploying capabilities, such as blueprints, gap analysis, end-to-end process design, testing strategy, cutover plans, and training, will be a crucial part of your role.

Experience with S/4HANA is mandatory for this position. You should have strong attention to detail, be self-directed, and have excellent independent problem-solving skills. Effective communication and presentation skills are essential, along with the ability to interact with senior leadership. Mentoring team members in technology, architecture, and application delivery will also be part of your responsibilities. Certification in SAP configuration and/or specific SAP modules is a must-have qualification for this role.

If you are looking to apply your SAP TM expertise to successful SAP implementations, this role offers a challenging and rewarding environment in which to grow your career.

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Work from Office

About NexInfo
NexInfo is a premier consulting firm founded in 1999. We have been in business for 25+ years and work with clients of all sizes to achieve 'Operational Excellence' using a blend of business-process and software consulting expertise. We offer implementation and managed services for businesses across many industries, including Manufacturing, Pharmaceuticals, Biotech, Medical Devices, Industrial Automation, and Automotive. We have a global footprint across North America, Europe, and Asia, with most clients in North America, a team of 300+ employees, and headquarters in California, USA.

Role Summary
We are looking for a Senior Python Developer with strong communication skills and proven experience in business application integrations. The ideal candidate will develop, test, and maintain integration solutions between enterprise systems such as CRMs, ERPs, and third-party SaaS platforms, using Python.

Key Responsibilities:
- Design, develop, and implement system integrations using Python.
- Work on end-to-end integration flows between business applications (e.g., Zoho, Zenoti).
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Develop and maintain REST/SOAP API integrations.
- Handle data extraction, transformation, and loading (ETL).
- Troubleshoot and resolve integration issues promptly.
- Document integration architecture, logic, and workflows.
- Communicate effectively with technical and non-technical stakeholders.

Key Requirements:
- 5+ years of experience as a Python Developer with a focus on integration projects.
- Strong understanding of API development and integration frameworks.
- Hands-on experience with business applications such as CRMs, ERPs, or other SaaS platforms.
- Solid knowledge of RESTful services, webhooks, and JSON/XML data formats.
- Familiarity with integration tools and platforms is a plus (e.g., MuleSoft, Zapier, Apache Camel).
- Excellent problem-solving and communication skills.
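For context on the integration flows described: syncing records between two systems over REST usually reduces to extract, map, and push. A minimal hypothetical sketch in Python with placeholder endpoints and fields (neither URL is a real vendor API):

```python
import requests

SOURCE = "https://source.example.com/api/v1/contacts"
TARGET = "https://target.example.com/api/v1/customers"
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

# Extract: pull contacts from the source system.
contacts = requests.get(SOURCE, headers=HEADERS, timeout=30).json()

for contact in contacts:
    # Transform: map source fields onto the target's schema.
    payload = {
        "full_name": f"{contact['first_name']} {contact['last_name']}",
        "email": contact["email"],
    }
    # Load: create the record in the target system; fail loudly
    # so errors surface for troubleshooting rather than being lost.
    resp = requests.post(TARGET, json=payload, headers=HEADERS, timeout=30)
    resp.raise_for_status()
```

In practice a flow like this also needs pagination, retries, and idempotent upserts, which is where the webhook and integration-platform experience listed above comes in.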

Posted 2 months ago

Apply

2.0 - 4.0 years

2 - 4 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Role & Responsibilities
- Data Analysis & Reporting: Develop comprehensive reports from scratch using MS Office tools, including advanced functions such as What-If Analysis. Perform accurate, detailed data analysis to provide actionable insights for the business.
- SQL Expertise: Work with MS SQL or PostgreSQL databases for data extraction and querying. Handle large datasets effectively and apply ETL (Extract, Transform, Load) principles for data management.
- Process Understanding & Alignment: Understand, or demonstrate eagerness to learn, business processes in order to align data insights with organizational goals.
- Data-Driven Approach: Maintain a systematic, data-oriented mindset to ensure all outputs are precise, relevant, and timely.
- ETL & Data Management: Manage large datasets, transform raw data into meaningful insights, and maintain data accuracy and integrity.
- Problem-Solving & Business Support: Identify trends, issues, and opportunities through in-depth analysis to support business decisions.

Requirements:
- Experience: Proven experience with MS Office tools for data analysis and reporting. Hands-on experience with SQL databases (MS SQL or PostgreSQL) is essential.
- Technical Skills: Proficiency in handling large datasets and ETL processes. Strong knowledge of building reports and using advanced MS Excel functions.
- Mindset & Approach: Data-oriented, with a methodical and structured approach to analysis. Eager to learn business processes and adapt to the organization's needs.
- Soft Skills: Attention to detail, problem-solving capability, and effective communication skills.
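For context on the SQL-plus-reporting combination: a typical workflow is to aggregate in the database, reshape in code, and hand the result to Excel. A hedged Python sketch using psycopg2 and pandas with invented connection details and table names:

```python
import pandas as pd
import psycopg2

# Invented connection details and table; adjust to the real schema.
conn = psycopg2.connect("dbname=sales user=analyst password=secret host=db")

# Extract: aggregate in SQL so only the summary crosses the wire.
query = """
    SELECT region, date_trunc('month', sold_at) AS month,
           SUM(amount) AS revenue
    FROM orders
    GROUP BY region, month
    ORDER BY region, month
"""
df = pd.read_sql(query, conn)

# Transform: pivot into the region-by-month layout a report expects.
report = df.pivot(index="region", columns="month", values="revenue")

# Load: write to Excel for the business audience (requires openpyxl).
report.to_excel("monthly_revenue.xlsx")
```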

Posted 3 months ago

Apply

4.0 - 6.0 years

5 - 10 Lacs

Pune, Mumbai (All Areas)

Work from Office

We are seeking a driven and experienced Nova Context Developer to strengthen the OSS practice of a leading digital solutions company specializing in Cloud, AI/AIOps, product engineering services, and system integration.

Key Responsibilities:
- Actively contribute as a team member on project implementations.
- Develop, build, and test components of a Nova Context (Ontology) solution under the supervision of technical leads.
- Build and maintain strong technical relationships with customers.
- Support pre-sales efforts when required.
- Collaborate effectively with internal and client teams to ensure successful project delivery.
- Continuously develop consulting capabilities and professional competencies.
- Follow guidance from lead or senior consultants on assigned projects.

Key Qualifications & Requirements:
- Minimum 1 year of hands-on experience deploying and building Nova Context (Ontology) solutions.
- 3+ years of experience managing large, data-oriented projects in a customer-facing role.
- Strong analytical skills to interpret complex datasets, identify patterns, and establish data relationships.
- Proficient in extracting data from Excel and XML and using ETL processes.
- Experience with graph database (NoSQL) solutions, RDF, and graph data modeling.
- Strong command of graph query handling, especially SPARQL and PRONTO (must-have).
- Advanced scripting and development skills in Python, Bash, Perl, and Linux shell/CLI.
- Good understanding of the telco domain (wireless: 2G/3G/4G/5G; wireline: GPON, fibre; transport: microwave, DWDM, SDH, PDH).
- IT infrastructure knowledge, including virtualization (VMware/MS Hyper-V) and container technologies (Docker, K3s, Kubernetes).
- Familiarity with data lakes and data modeling techniques.

Additional Skills:
- Strong grasp of the SDLC and implementation best practices.
- Quality-focused with a "completer-finisher" mindset.
- Business-aware, understanding broader departmental and organizational goals.
- Self-driven with strong problem-solving skills.
- Excellent communication and relationship-building skills, including cross-cultural collaboration.
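Since SPARQL over RDF is called out as a must-have, here is a small self-contained example of graph querying in Python with rdflib; the namespace and topology are invented, and Nova Context's own ontology is not shown:

```python
from rdflib import Graph, Namespace, Literal

EX = Namespace("http://example.org/telco#")
g = Graph()

# Invented topology: two cell sites connected to a transport ring.
g.add((EX.Site1, EX.connectedTo, EX.Ring1))
g.add((EX.Site2, EX.connectedTo, EX.Ring1))
g.add((EX.Ring1, EX.technology, Literal("DWDM")))

# SPARQL: find every site reachable over DWDM transport.
results = g.query("""
    PREFIX ex: <http://example.org/telco#>
    SELECT ?site WHERE {
        ?site ex:connectedTo ?ring .
        ?ring ex:technology "DWDM" .
    }
""")
for row in results:
    print(row.site)
```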

Posted Date not available

Apply

4.0 - 9.0 years

3 - 6 Lacs

Gurugram, Bengaluru

Work from Office

Job Summary:
Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with business and IT teams to understand requirements and best leverage technologies to enable agile data delivery at scale.

Key Responsibilities:
- Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention for internal and external users.
- Develops reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms, combining a variety of sources using ETL/ELT tools or scripting languages.
- Develops physical data models and implements data storage architectures per design guidelines.
- Analyzes complex data elements and systems, data flow, dependencies, and relationships to contribute to conceptual, physical, and logical data models.
- Participates in testing and troubleshooting of data pipelines.
- Develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g., data lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
- Uses agile development practices, such as DevOps, Scrum, Kanban, and continuous improvement cycles, for data-driven applications.

Competencies:
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements against which designs are developed; establishes acceptance criteria through analysis, allocation, and negotiation; tracks requirement status throughout the system lifecycle; assesses the impact of requirement changes on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Builds partnerships and works collaboratively with others to meet shared objectives.
- Communicates effectively: Develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Builds strong customer relationships and delivers customer-centric solutions.
- Decision quality: Makes good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs extract-transform-load (ETL) activities from a variety of sources and transforms the data for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained during product development; communicates to stakeholders to enable improved productivity and effective knowledge transfer to those who were not part of the initial learning.
- Solution Validation Testing: Validates configuration item changes or solutions using the function's defined best practices, including Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure they work as designed and meet customer requirements.
- Data Quality: Identifies, understands, and corrects flaws in data to support effective information governance across operational business processes and decision-making.
- Problem Solving: Solves problems, and may mentor others in effective problem solving, using systematic analysis and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause, implements robust data-based solutions, identifies systemic root causes, and ensures actions to prevent recurrence are implemented.
- Values differences: Recognizes the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications:
College, university, or equivalent degree in a relevant technical discipline, or equivalent relevant experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience:
4-5 years of experience. Relevant experience preferred, such as temporary student employment, internships, co-ops, or other extracurricular team activities. Knowledge of the latest data engineering technologies is highly preferred, including:
- Exposure to open-source big data tools: Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered-compute, cloud-based implementation experience
- Familiarity developing applications requiring large file movement in a cloud-based environment
- Exposure to agile software development
- Exposure to building analytical solutions
- Exposure to IoT technology

Additional Responsibilities Unique to This Position:
1) Work closely with the business Product Owner to understand the product vision.
2) Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
4) Under limited supervision, design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
5) Create DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs) with guidance from senior data engineers.
6) Take part in evaluating new data tools and POCs with guidance from senior data engineers.
7) Take ownership of developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
8) Assist in resolving issues that compromise data accuracy and usability.

Technical Skills:
1. Programming Languages: Proficiency in Python, Java, and/or Scala.
2. Database Management: Intermediate expertise in SQL and NoSQL databases.
3. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
4. Cloud Services: Experience with Azure, Databricks, and AWS cloud platforms.
5. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
6. API: Working knowledge of APIs to consume data from ERP and CRM systems.
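For context on the pipeline responsibilities above (ingest from varied sources, transform, land in a lake), a minimal PySpark sketch with invented paths and columns; the actual Cummins stack (Azure Data Lake, Snowflake) is only referenced in the posting, not reproduced here:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest: hypothetical relational extract landed as CSV.
orders = spark.read.option("header", True).csv("/landing/orders.csv")

# Transform: type-cast, drop bad rows, derive a partition column.
clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .withColumn("order_month",
                F.date_format(F.to_date("order_date"), "yyyy-MM"))
)

# Load: write partitioned Parquet into the (hypothetical) data lake.
clean.write.mode("overwrite").partitionBy("order_month").parquet("/lake/orders")
```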

Posted Date not available

Apply

4.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

Job Summary:
Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with business and IT teams to understand requirements and best leverage technologies to enable agile data delivery at scale.

Key Responsibilities:
- Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention for internal and external users.
- Develops reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms, combining a variety of sources using ETL/ELT tools or scripting languages.
- Develops physical data models and implements data storage architectures per design guidelines.
- Analyzes complex data elements and systems, data flow, dependencies, and relationships to contribute to conceptual, physical, and logical data models.
- Participates in testing and troubleshooting of data pipelines.
- Develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g., data lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
- Uses agile development practices, such as DevOps, Scrum, Kanban, and continuous improvement cycles, for data-driven applications.

Competencies:
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements against which designs are developed; establishes acceptance criteria through analysis, allocation, and negotiation; tracks requirement status throughout the system lifecycle; assesses the impact of requirement changes on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Builds partnerships and works collaboratively with others to meet shared objectives.
- Communicates effectively: Develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Builds strong customer relationships and delivers customer-centric solutions.
- Decision quality: Makes good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs extract-transform-load (ETL) activities from a variety of sources and transforms the data for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained during product development; communicates to stakeholders to enable improved productivity and effective knowledge transfer to those who were not part of the initial learning.
- Solution Validation Testing: Validates configuration item changes or solutions using the function's defined best practices, including Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure they work as designed and meet customer requirements.
- Data Quality: Identifies, understands, and corrects flaws in data to support effective information governance across operational business processes and decision-making.
- Problem Solving: Solves problems, and may mentor others in effective problem solving, using systematic analysis and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause, implements robust data-based solutions, identifies systemic root causes, and ensures actions to prevent recurrence are implemented.
- Values differences: Recognizes the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications:
College, university, or equivalent degree in a relevant technical discipline, or equivalent relevant experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience:
4-5 years of experience. Relevant experience preferred, such as temporary student employment, internships, co-ops, or other extracurricular team activities. Knowledge of the latest data engineering technologies is highly preferred, including:
- Exposure to open-source big data tools: Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered-compute, cloud-based implementation experience
- Familiarity developing applications requiring large file movement in a cloud-based environment
- Exposure to agile software development
- Exposure to building analytical solutions
- Exposure to IoT technology

Additional Responsibilities Unique to This Position:
1) Work closely with the business Product Owner to understand the product vision.
2) Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
4) Under limited supervision, design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
5) Create DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs) with guidance from senior data engineers.
6) Take part in evaluating new data tools and POCs with guidance from senior data engineers.
7) Take ownership of developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
8) Assist in resolving issues that compromise data accuracy and usability.

Technical Skills:
1. Programming Languages: Proficiency in Python, Java, and/or Scala.
2. Database Management: Intermediate expertise in SQL and NoSQL databases.
3. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
4. Cloud Services: Experience with Azure, Databricks, and AWS cloud platforms.
5. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
6. API: Working knowledge of APIs to consume data from ERP and CRM systems.

Posted Date not available

Apply