0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, dbt (Data Build Tool), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools.
- SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
- Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
- Database Management: Manage and maintain SQL Server and PostgreSQL databases.
- ETL Processes: Develop and manage ETL processes to support data warehousing and analytics.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
- Troubleshooting: Identify and resolve data-related issues and discrepancies.
- Python Scripting: Use Python for data manipulation, automation, and integration tasks.

Qualifications:
- Experience: Minimum of 9 years of experience in data engineering.
- Technical Skills: Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory; strong SQL skills with the ability to write and optimize complex queries; knowledge of Python for data manipulation and automation; knowledge of data governance frameworks and best practices.
- Soft Skills: Excellent problem-solving and analytical skills; strong communication and collaboration skills; positive attitude and ability to work well in a team environment.
- Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.
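As an illustration of the SQL optimization work this posting describes, here is a minimal Python sketch that inspects a Snowflake query plan before tuning. It is a sketch only: the account, credentials, and table name are hypothetical, and it assumes the snowflake-connector-python package.

    import snowflake.connector

    # Hypothetical credentials; in practice these come from a vault or env vars.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
    )
    cur = conn.cursor()

    # EXPLAIN returns the query plan, which helps spot full table scans
    # that a clustering key or a narrower projection could avoid.
    cur.execute("EXPLAIN SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id")
    for row in cur.fetchall():
        print(row)
    conn.close()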
Posted 1 month ago
2 - 7 years
4 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: MuleSoft Anypoint Platform
Good-to-have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: BE

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications using MuleSoft Anypoint Platform. Your typical day will involve collaborating with cross-functional teams, analyzing business process and application requirements, and delivering high-quality solutions.

Roles & Responsibilities:
- Design, build, and configure applications using MuleSoft Anypoint Platform to meet business process and application requirements.
- Collaborate with cross-functional teams to analyze business process and application requirements and deliver high-quality solutions.
- Develop and maintain integration solutions using MuleSoft Anypoint Platform, including API development, data integration, and ETL processes.
- Ensure the performance, quality, and responsiveness of applications; identify and correct bottlenecks and fix bugs.

Professional & Technical Skills:
- Must-have: Experience in designing and developing integration solutions using MuleSoft Anypoint Platform.
- Good-to-have: Experience with other integration platforms such as Dell Boomi, Informatica, or SnapLogic.
- Strong understanding of integration patterns, RESTful APIs, and web services.
- Experience with Java, XML, JSON, and other related technologies.
- Experience with Agile development methodologies and DevOps practices.
- Solid grasp of software development best practices, including coding standards, code reviews, source control management, and testing methodologies.

Additional Information:
- The candidate should have a minimum of 2 years of experience in MuleSoft Anypoint Platform.
- The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field, along with a proven track record of delivering high-quality integration solutions.
- This position is based at our Hyderabad office; alternate locations can be Pune, Bangalore, Mumbai, Chennai, NCR, or Kolkata.

Qualifications: BE
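Integration work of this kind frequently means mapping between JSON and XML payloads (in MuleSoft itself this is done with DataWeave). As a language-neutral illustration, here is a minimal Python sketch of the same kind of transformation; the field names are hypothetical.

    import json
    import xml.etree.ElementTree as ET

    # Hypothetical JSON payload received from an upstream REST service.
    payload = json.loads('{"orderId": "1001", "customer": "Acme", "amount": 250.0}')

    # Map the JSON fields onto an XML document for a SOAP-era backend.
    order = ET.Element("Order")
    ET.SubElement(order, "Id").text = payload["orderId"]
    ET.SubElement(order, "Customer").text = payload["customer"]
    ET.SubElement(order, "Amount").text = str(payload["amount"])

    print(ET.tostring(order, encoding="unicode"))
    # <Order><Id>1001</Id><Customer>Acme</Customer><Amount>250.0</Amount></Order>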
Posted 1 month ago
3 - 5 years
5 - 10 Lacs
Chennai, Bengaluru
Work from Office
Integration Design and Development:
- Develop integration solutions using SnapLogic to automate data workflows between Snowflake, APIs, Oracle, and other data sources.
- Design, implement, and maintain data pipelines to ensure reliable and timely data flow across systems.
- Develop API integrations to facilitate seamless data exchange with internal master data management systems.
- Monitor and optimize data integration processes to ensure high performance and reliability.
- Provide support for existing integrations, troubleshoot issues, and suggest improvements to streamline operations.
- Work closely with cross-functional teams, including data analysts, data scientists, and IT, to understand integration needs and develop solutions.
- Maintain detailed documentation of integration processes and workflows.

Experience:
- 3-4 years of proven experience as a SnapLogic Integration Engineer.
- Experience with the Snowflake cloud data platform is preferred.
- Experience in API integration and development.
- Familiarity with RESTful API design and integration.
- Strong understanding of ETL/ELT processes.
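A minimal Python sketch of the kind of API-to-Snowflake flow a SnapLogic pipeline would automate here: pull records from a REST endpoint and bulk-load them. The endpoint URL, table name, and credentials are hypothetical; it assumes the requests, pandas, and snowflake-connector-python packages.

    import pandas as pd
    import requests
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    # Hypothetical REST source; a real pipeline would page through results.
    records = requests.get("https://api.example.com/v1/customers", timeout=30).json()
    df = pd.DataFrame(records)

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
    )
    # write_pandas bulk-loads the frame via an internal stage and COPY INTO.
    success, n_chunks, n_rows, _ = write_pandas(conn, df, "CUSTOMERS_RAW")
    print(f"loaded={success} rows={n_rows}")
    conn.close()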
Posted 1 month ago
2 - 7 years
6 - 10 Lacs
Bengaluru
Work from Office
Hello Talented Techie!

We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations.

We are looking for a Data Engineer (AWS, Confluent & SnapLogic):
- Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
- Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
- Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
- Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing.
- Real-time Data Streaming: Use Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing.
- ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
- Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance.

You'd describe yourself as having:
- Experience: 3+ years of relevant experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
- Technical Skills: Proficiency in AWS services, particularly AWS Glue; experience with Iceberg tables and Snowflake; knowledge of Confluent Kafka for real-time data streaming; familiarity with SnapLogic for ETL processes; experience with Apache Airflow for workflow management; understanding of Splunk for monitoring and logging.
- Programming Skills: Proficiency in Python, SQL, and other relevant programming languages.
- Data Modeling: Experience with data modeling and database design.
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

Preferred Qualities:
- Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality.
- Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
- Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment.
- Team Player: Strong team player with a collaborative mindset.
- Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide.
We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Find out more about Siemens careers at: www.siemens.com/careers
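Since the Siemens posting above lists Apache Airflow for workflow orchestration, here is a minimal DAG sketch of a daily extract-then-load workflow. The task names and the two placeholder functions are hypothetical; it assumes a recent Airflow 2.x release.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull source data (e.g., from Kafka or a REST API).
        print("extracting")

    def load():
        # Placeholder: load transformed data into the Snowflake data lake.
        print("loading")

    with DAG(
        dag_id="daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        # Run the load only after the extract succeeds.
        extract_task >> load_task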
Posted 1 month ago
3 - 7 years
9 - 13 Lacs
Bengaluru
Work from Office
Mission/Position Headline: OIC (Oracle Integration Cloud) with OTM (Oracle Transportation Management) or other Oracle SaaS applications.

Areas of Responsibility:
- Design, develop, and deliver product extensions, integrations, reports, etc.
- Stage/migrate customer data as needed.
- Lead/support knowledge transition to the client.
- Support all formal documentation of solutions, including requirements for product extensions, etc.
- Lead/support all solution activities, including testing and verification.
- Contribute to the training and development of client technical staff.
- System troubleshooting and debugging.
- Be willing to provide after-hours support in the USA and EU (levels 1-3). After-hours for the US will coincide with daytime/evening working hours in India; availability is needed between 8am and noon EST (meaning work would be between 3am and 12pm EST, or 12:30pm and 9:30pm IST).

Desired Experience:
- Must have development knowledge/experience on OIC (Oracle Integration Cloud).
- Must have experience in the OIC service role, Adapter & Connection, Connection and Security Properties, and Agent & Architecture.
- Must have experience in the Data Transformation module, integration actions, file processing, error handling, and OIC administrator tasks.
- Detailed technical knowledge of Oracle Integration Cloud (OIC) and Oracle Cloud Infrastructure (OCI), including integrations, connections, securing connections, mapping, and APIs.
- Experience building integrations in OIC using REST/SOAP services.
- Ability to configure lookups and manage certificates and keys within OIC.
- Data models: XML Schema and JSON.
- Integration design and development; data migration and loading; integration with other applications.
- Performance tuning, scalability configuration, and troubleshooting.
- Experience with SnapLogic or other integration software is a plus.

Qualification and Experience:
- Overall, 8-12 years of experience.
- Bachelor's or Master's degree in Computer Science/Electronics Engineering, or equivalent, is required.

Capabilities:
- Good communication skills; self-motivated, quality- and result-oriented.
- Strong analytical and problem-solving skills.
Posted 1 month ago
8 - 13 years
18 - 22 Lacs
Hyderabad, Bengaluru
Work from Office
Location: Hyderabad, Bangalore
Function: HD HR
Requisition ID: 1032970

Our Company
We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us.

The team
Our Global HR Technology team is responsible for the HR technology stack strategy and execution across the Hitachi Digital operating companies, which include GlobalLogic, Hitachi Digital Services and Hitachi Vantara and comprise more than 50,000 employees in 52 countries across the globe. We are an innovative, driven and dynamic team that is passionate about people and technology and is currently leading several critical transformation initiatives, which include a global re-implementation of Workday to incorporate new functionalities, global tech stack optimization, and the introduction of AI capability across HR.

What you’ll be doing
The Workday Integration Lead will be responsible for leading the design, delivery, and management of integrations across all operating companies (OpCos) within the organization. This role focuses on overseeing the lifecycle of Workday and HR technology integrations, ensuring seamless connectivity between systems, data integrity, and process efficiency. As the Integration Lead, you will collaborate with internal stakeholders and external partners to deliver high-quality, scalable integration solutions. This role will drive strategic initiatives, including ensuring compliance with global data governance and security standards, while managing ongoing optimization of the integration landscape post-implementation.
You will:
- Serve as a primary representative of the HR engineering function, collaborating with implementation teams and internal stakeholders to translate business requirements into technical designs, and develop, test, and deploy HR integrations.
- Define and execute the product vision and roadmap for Workday and related HR technology integrations, with consideration for integration architecture best practice, scalability, governance and business continuity.
- Lead the design and delivery of integration solutions, collaborating with IT, HR, and external vendors to develop scalable solutions for critical business processes, such as payroll, benefits, and finance.
- Oversee data mapping, conversion, and validation processes to ensure data accuracy and consistency across systems.
- Ensure compliance with data governance, privacy regulations, and security protocols during integration design and development.
- Manage the lifecycle of Workday and other HR system integrations, monitoring integration performance through operational dashboards and ensuring stability and continuous improvement.
- Build strong relationships with OpCo stakeholders, ensuring integration solutions meet local and global business requirements.
- Facilitate communication and training to ensure stakeholders understand and can effectively use integration solutions.
- Ensure all integration solutions adhere to global compliance and regulatory requirements, including GDPR, CCPA, and other data privacy standards.
- Collaborate with IT and data governance teams to address security vulnerabilities and ensure alignment with enterprise policies.

What you bring to the team:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Minimum of 8 years of experience in HR technology integrations, preferably with Workday or similar ERP systems.
- Proven experience managing complex integration projects in global organizations.
- Strong expertise in Workday integration tools, including Workday Studio, EIBs, Core Connectors, and Workday Web Services (REST/SOAP APIs).
- Familiarity with middleware solutions (e.g., Dell Boomi, SnapLogic) and languages like XML, XSLT, etc.
- Comprehensive understanding of HR functional areas (HCM, Payroll, Benefits, etc.) and associated data models.
- Experience with global data governance, security frameworks, and compliance standards.
- Strong project management skills with the ability to lead multiple initiatives and balance competing priorities.
- Exceptional communication and stakeholder management abilities, with a collaborative approach.
- Analytical and problem-solving skills, with a proactive mindset toward innovation and improvement.
- Workday Integration Certification and/or Workday HCM Certification are preferred but not essential.

About us
We’re a global, 1,000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our legacy of 110 years of innovation with shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future.

Our Values
We strive to create an inclusive environment for all and are open to considering home working, compressed/flexible hours and flexible arrangements.
Get in touch with us to explore how we might be able to accommodate your specific needs. We are proud to say we are an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. Championing diversity, equity, and inclusion.
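For a flavor of the Workday Web Services work this posting describes, here is a minimal Python sketch pulling a custom report through Workday's report-as-a-service (RaaS) REST pattern. The host, tenant, report path, and credentials are hypothetical placeholders; production integrations would typically authenticate with OAuth or a dedicated integration system user rather than basic auth.

    import requests

    # Hypothetical tenant and report path; RaaS exposes custom reports over REST.
    url = ("https://wd2-impl-services1.workday.com/ccx/service/customreport2/"
           "acme_tenant/integration_user/Headcount_Report")

    resp = requests.get(url, params={"format": "json"},
                        auth=("integration_user@acme_tenant", "***"), timeout=60)
    resp.raise_for_status()

    # RaaS JSON wraps rows in a "Report_Entry" array.
    for entry in resp.json().get("Report_Entry", []):
        print(entry)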
Posted 1 month ago
3 - 6 years
7 - 15 Lacs
Pune, Chennai, Bengaluru
Work from Office
Key Responsibilities:
Integration Design and Development:
- Develop integration solutions using SnapLogic to automate data workflows between Snowflake, APIs, Oracle, and other data sources.
- Design, implement, and maintain data pipelines to ensure reliable and timely data flow across systems.
- Develop API integrations to facilitate seamless data exchange with internal master data management systems.
- Monitor and optimize data integration processes to ensure high performance and reliability.
- Provide support for existing integrations, troubleshoot issues, and suggest improvements to streamline operations.
- Work closely with cross-functional teams, including data analysts, data scientists, and IT, to understand integration needs and develop solutions.
- Maintain detailed documentation of integration processes and workflows.

Experience:
- 3-4 years of proven experience as a SnapLogic Integration Engineer.
- Experience with the Snowflake cloud data platform is preferred.
- Experience in API integration and development.
- Familiarity with RESTful API design and integration.
- Strong understanding of ETL/ELT processes.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are looking for a skilled and experienced SnapLogic Developer with expertise in analysing, developing, and deploying integration solutions. This role involves end-to-end project delivery and close collaboration with stakeholders to ensure the successful execution of integration solutions.

Responsibilities:
- Design, develop and maintain scalable integration solutions using SnapLogic.
- Manage and oversee integration projects from initiation to completion with timely delivery and high-quality outcomes.
- Collaborate with stakeholders to translate business requirements into efficient technical solutions.
- Provide technical guidance and support throughout all phases of the project lifecycle.
- Develop and maintain technical documentation, adhering to established processes and best practices.
- Troubleshoot and resolve integration challenges, ensuring system reliability and effectiveness.
- Continuously assess current processes to identify opportunities for improvement and optimization.
- Monitor system performance and ensure compliance with organizational and project-specific standards.

Requirements:
- 4-5 years of working experience in SnapLogic development and end-to-end integration delivery.
- Knowledge of SnapLogic Designer, SnapLogic Manager, and pipelines for handling integration tasks.
- Expertise in integrating systems such as databases, SaaS applications, and REST APIs using SnapLogic.
- Background in ETL processes, data flows, and data transformation capabilities.
- Familiarity with cloud platforms like AWS, Azure, or GCP and their integration capabilities.
- Understanding of error handling, debugging, and best practices to ensure seamless integrations.
Posted 1 month ago
2 - 5 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are looking for a skilled and experienced SnapLogic Developer with expertise in analysing, developing, and deploying integration solutions. This role involves end-to-end project delivery and close collaboration with stakeholders to ensure the successful execution of integration solutions.

Responsibilities:
- Design, develop and maintain scalable integration solutions using SnapLogic.
- Manage and oversee integration projects from initiation to completion with timely delivery and high-quality outcomes.
- Collaborate with stakeholders to translate business requirements into efficient technical solutions.
- Provide technical guidance and support throughout all phases of the project lifecycle.
- Develop and maintain technical documentation, adhering to established processes and best practices.
- Troubleshoot and resolve integration challenges, ensuring system reliability and effectiveness.
- Continuously assess current processes to identify opportunities for improvement and optimization.
- Monitor system performance and ensure compliance with organizational and project-specific standards.

Requirements:
- 4-5 years of working experience in SnapLogic development and end-to-end integration delivery.
- Knowledge of SnapLogic Designer, SnapLogic Manager, and pipelines for handling integration tasks.
- Expertise in integrating systems such as databases, SaaS applications, and REST APIs using SnapLogic.
- Background in ETL processes, data flows, and data transformation capabilities.
- Familiarity with cloud platforms like AWS, Azure, or GCP and their integration capabilities.
- Understanding of error handling, debugging, and best practices to ensure seamless integrations.
Posted 1 month ago
2 - 5 years
0 Lacs
Mumbai Metropolitan Region
On-site
Data Quality & Governance:
- Strong understanding of data validation, testing (e.g., dbt tests), and lineage tracking.
- Emphasis on maintaining data trust across pipelines and models.

Stakeholder Management:
- Partner with business and technical stakeholders to define data needs and deliver insights.
- Ability to explain complex data concepts in clear, non-technical terms.

Documentation & Communication:
- Maintain clear documentation for models, metrics, and data transformations (using dbt docs or similar).
- Strong verbal and written communication skills; able to work cross-functionally across teams.

Problem-Solving & Ownership:
- Proactive in identifying and resolving data gaps or issues.
- Self-starter with a continuous improvement mindset and a focus on delivering business value through data.

IaC:
- Deploy scalable, secure, and high-performing Snowflake environments in line with the data governance and security controls in place, using Terraform and other automation scripts.
- Automate infrastructure provisioning, testing, and deployment for seamless operations.

Requirements:
Strong SQL & dbt Expertise:
- Experience building and maintaining scalable data models in dbt.
- Proficient in modular SQL, Jinja templating, testing strategies, and dbt best practices.

Data Warehouse Proficiency:
- Hands-on experience with Snowflake, including dimensional and data vault modeling (star/snowflake schemas), performance optimization and query tuning, and role-based access and security management.

Data Pipeline & Integration Tools:
- Experience with Kafka (or similar event streaming tools) for ingesting real-time data.
- Familiarity with SnapLogic for ETL/ELT workflow design, orchestration, and monitoring.

Version Control & Automation:
- Proficient in Git and GitHub for code versioning and collaboration.
- Experience with GitHub Actions or other CI/CD tools to automate dbt model testing, deployment, and documentation updates.
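As a sketch of the dbt test automation this posting asks about, here is a minimal Python CI step that builds only the models changed since the last production state, plus their downstream dependents, and then tests them. The manifest directory is a hypothetical artifact location; it assumes the dbt CLI is installed and a production manifest has been saved for state comparison.

    import subprocess

    # Hypothetical path to the production manifest used for state comparison.
    STATE_DIR = "prod-artifacts"

    # Build only modified models plus everything downstream of them.
    subprocess.run(
        ["dbt", "run", "--select", "state:modified+", "--state", STATE_DIR],
        check=True,
    )
    # Run dbt tests (unique, not_null, relationships, custom) on the same set.
    subprocess.run(
        ["dbt", "test", "--select", "state:modified+", "--state", STATE_DIR],
        check=True,
    )

In a GitHub Actions workflow, a script like this keeps pull-request checks fast while still guarding data trust, since untouched models are skipped.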
Posted 1 month ago
3 years
0 Lacs
Bengaluru, Karnataka
Work from Office
ServiceNow QA

As a Senior ServiceNow Consultant, you will be responsible for leading and executing ServiceNow projects and initiatives for clients. You will work closely with clients to understand their requirements, identify opportunities for improvement, and develop innovative solutions leveraging the ServiceNow platform.

Key Responsibilities:
- Serve as the primary ServiceNow subject matter expert for clients, advising on best practices, platform capabilities, and potential solutions.
- Lead all aspects of ServiceNow project delivery, including requirements gathering, design, development, testing, and deployment.
- Conduct product demonstrations and educate stakeholders on ServiceNow functionality.
- Collaborate with clients to understand their business requirements and how ServiceNow can support them.
- Develop and configure custom ServiceNow applications, workflows, forms, reports, and integrations.
- Customize and extend ServiceNow ITSM, SPM, and ITOM modules to meet specific client needs.
- Develop and maintain project plans, budgets, and schedules.
- Communicate project status, risks, and issues to clients and project stakeholders.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of hands-on experience with the ServiceNow platform, including architecture, design, development, and deployment across multiple modules.
- 3+ years of experience developing and managing integrations between ServiceNow and other systems and technologies.
- In-depth technical knowledge of integration protocols and technologies, including REST APIs, SOAP APIs, JDBC, LDAP, and others.
- Experience with other integration tool sets such as Workato, Apptus or SnapLogic is a big plus.
- Proven experience leading successful ServiceNow implementations from start to finish.
- Strong problem-solving and analytical skills, with the ability to develop creative solutions to complex problems.
- Excellent verbal and written communication skills, with the ability to effectively communicate technical information to both technical and non-technical stakeholders.
- Experience with multiple ServiceNow modules, such as ITSM, SPM, ITOM, HRSD, or CSM.
- Experience with Performance Analytics is a plus.
- Experience with newer ServiceNow modules such as WSD, LSD, and SLM is a big plus.

Preferred Certifications:
- ServiceNow Certified Implementation Specialist (ITSM); consulting experience preferred.
- Hardware Asset Management (HAM) Fundamentals.
- Software Asset Management (SAM) Fundamentals.
- ServiceNow Certified Implementation Specialist (Discovery).

Mandatory Skills:
- ServiceNow ITSM, SPM & ITOM.
- Workato, Apptus or SnapLogic.
- Integration protocols and technologies, including REST APIs, SOAP APIs, JDBC, LDAP, and others.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
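To illustrate the REST integration skills the posting above lists, here is a minimal Python sketch querying incidents through ServiceNow's Table API. The instance name and credentials are hypothetical; real integrations would generally prefer OAuth over basic auth.

    import requests

    # Hypothetical instance; the Table API exposes any table, here 'incident'.
    url = "https://acme-dev.service-now.com/api/now/table/incident"

    resp = requests.get(
        url,
        params={"sysparm_query": "active=true", "sysparm_limit": 5},
        auth=("integration.user", "***"),
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()

    # The Table API wraps records in a "result" array.
    for inc in resp.json()["result"]:
        print(inc["number"], inc["short_description"])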
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Pune. You will play a crucial role in developing innovative solutions to enhance business operations and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the development and implementation of SnapLogic solutions.
- Conduct code reviews and ensure adherence to coding standards.
- Troubleshoot and resolve technical issues in SnapLogic integrations.

Professional & Technical Skills:
- Must-have: Proficiency in SnapLogic.
- Strong understanding of ETL processes.
- Experience with API integrations.
- Knowledge of cloud platforms such as AWS or Azure.
- Hands-on experience in developing and maintaining SnapLogic pipelines.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
4 - 8 years
10 - 18 Lacs
Hyderabad
Work from Office
ProcessMAP is seeking an experienced Data Engineer who will build automated pipelines and solutions for data migration, data import, and other operations requiring data ETL.

Role & responsibilities:
- Perform analysis on core products to support migration planning and development.
- Work closely with the Team Lead and collaborate with other stakeholders to gather requirements and build well-architected data solutions.
- Produce supporting documentation, such as specifications, data models, and relations between data, required for the effective development, usage and communication of the data operations solutions with different stakeholders.
- Experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, AzureSQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
- Experience building ETL/ELT pipelines.
- Strong problem-solving and analytical skills, with high attention to detail.
- Passion for analytics, real-time data, and monitoring.
- Critical thinking and collaboration skills.
- Self-starter and a quick learner, ready to learn the new technologies and tools that the job demands.

Preferred candidate profile:
- Strong experience (4-8 years of relevant experience) in SnapLogic or working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, AzureSQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
- Strong knowledge of databases, data modeling, and the data life cycle.
- Proficient in understanding data and writing complex SQL.
- Experience with any ETL/ELT tool or platform: SSIS, Talend, Informatica, Azure Data Factory, AWS Glue, IBM InfoSphere, etc.
- Experience working with REST APIs in data pipelines.
- Good communication and collaboration skills; focus on high performance and quality delivery.
- Highly self-motivated and a continuous learner.

Desirable:
- Experience working with NoSQL databases like MongoDB.
- Experience working with Microsoft Power Platform (Power Automate and Power Apps) or any similar automation/RPA tool.
- Experience with cloud data platforms like Snowflake, Databricks, AWS, Azure, etc.
- Awareness of emerging ETL and cloud concepts on Amazon AWS or Microsoft Azure.
- Experience working with scripting languages such as Python, R, JavaScript, etc.
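A minimal sketch of the kind of data-import pipeline described above: read a source extract, apply a light transformation, and load it into PostgreSQL. The file name, table name, and connection string are hypothetical; it assumes pandas and SQLAlchemy with the psycopg2 driver.

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical source extract delivered by a customer for migration.
    df = pd.read_csv("incidents_export.csv")

    # Light cleanup before load: normalize column names, drop exact duplicates.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()

    # Hypothetical connection string; credentials would come from a secret store.
    engine = create_engine("postgresql+psycopg2://etl_user:***@db-host:5432/ehs")
    df.to_sql("incidents_staging", engine, if_exists="append", index=False)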
Posted 2 months ago
5 - 10 years
10 - 15 Lacs
Gurgaon
Work from Office
This role is primarily responsible for leading a global Data Integration and Data Analytics project. The position focuses on the continuous improvement of the tool based on user feedback. You will collaborate with cross-functional teams to define and document requirements, drive technical excellence, and ensure that our software systems meet business objectives and industry standards.

Responsibilities:
- Lead a global Data Integration and Data Analytics project.
- Collaborate with various departments worldwide to understand integration and analytics requirements.
- Connect various departmental tools to a common data lake (Snowflake) based on business needs.
- Develop and provide dashboards, web applications, workflows, etc., using data from the data lake.
- Be part of the global platform team driving harmonization and improvements on the data and analytics platform.
- Collaborate with other departments such as Project Management, Engineering, Procurement, Bid Management, Finance, Professional Services, Customer Support, and Sales to ensure modernized product initiatives are implemented while adhering to KPIs and best practices.
- Interface with Siemens Energy partners and customers to evangelize Siemens Energy's cloud service requirements for 3rd-party integrations, API services, Big Data analytics, and ease of configuration.
- Evaluate and work with 3rd-party vendors, infrastructure service providers, and vendor teams to deliver modern cloud architectural initiatives.
- Experience hiring top talent, retaining it, and positively influencing the team with cutting-edge technology and architectural improvements.
- Certifications in cloud computing and Snowflake or related fields are a plus.
- Willingness to continuously learn and adapt to new methodologies, practices, and technologies.

We don't need superheroes, just super minds:
- Bachelor's degree in mechanical or electrical engineering or a comparable field.
- 5 to 10 years of relevant work experience.
- Expertise in Snowflake, SQL, Python, cloud computing, etc.
- Excellent team player with a strong commitment to supporting the global team.
- High flexibility in taking on new tasks and responsibilities.
- Strong dedication to the product with a willingness to take ownership.
- Extensive experience in designing API-led services, API platform services, out-of-the-box third-party integrations, marketplaces, and Big Data analytics.
- Extensive experience with application deployment, troubleshooting, and production support for highly secure, large-scale, complex, distributed applications/platforms.
- Fluency with Internet technologies, web services, databases (Oracle, SQL Server), and data platforms, with hands-on coding experience.
- Proven experience in managing a geographically distributed development team and handling customer escalations and technical presentations.
- Experience with core engineering applications and/or the power-sector domain is a plus.
- Relevant certifications (e.g., AWS Certified Solutions Architect, Snowflake, etc.) are a plus.
- Experience with agile methodologies (Scrum, Kanban) and software development lifecycle (SDLC) processes.
- Experience in administration and development of COMOS is a plus.
- Experience with PowerBI, Tableau, SnapLogic, Alteryx, Mendix, Power Apps, and Power Automate is a plus.
- Good understanding of the power generation business.

Business Skills:
- Excellent technical leadership and management skills.
- Excellent communication, technical presentation, customer-facing, interpersonal, written, and verbal skills.
- Excellent organizational, analytical, and creative problem-solving skills.
- Strong project management skills with KPI, process definition, and improvements.
- Ability to effectively manage multiple tasks simultaneously, drive results, and rapidly adapt to changing business dynamics.
- Strong understanding of security best practices and compliance requirements.
- Ability to travel globally.
Posted 2 months ago
3 - 6 years
5 - 8 Lacs
Bengaluru
Work from Office
The Role
Are you passionate about solving complex problems? Do you thrive in a fast-paced environment? Then there's a good chance you will love being a part of our Software Engineering Development team at Kyndryl, where you will be able to see the immediate value of your work.

As a Software Engineering - Developer at Kyndryl, you will be at the forefront of designing, developing, and implementing cutting-edge software solutions. Your work will play a critical role in our business offering, your code will deliver value to our customers faster than ever before, and your attention to detail and commitment to quality will be critical in ensuring the success of our products. Using design documentation and functional programming specifications, you will be responsible for implementing identified components. You will ensure that implemented components are appropriately documented, unit-tested, and ready for integration into the final product. You will have the opportunity to architect the solution, test the code, and deploy and build a CI/CD pipeline for it. As a valued member of our team, you will provide work estimates for assigned development work, and guide features, functional objectives, or technologies being built for interested parties. Your contributions will have a significant impact on our products' success, and you will be part of a team that is passionate about innovation, creativity, and excellence. Above all else, you will have the freedom to drive innovation and take ownership of your work while honing your problem-solving, collaboration, and automation skills. Together, we can make a difference in the world of cloud-based managed services.

Key Responsibilities:

Technical Responsibilities:
- Develop, configure, and implement Coupa enhancements (e.g., custom validations, approval chains, UI modifications, workflows, and APIs).
- Create and optimize Coupa reports and analytics, leveraging Coupa Insights, Advanced Reporting, and BI tools.
- Develop custom scripts, integrations, and connectors between Coupa and various ERPs (e.g., SAP, Oracle, NetSuite, D365 F&O, Maconomy) using APIs, Coupa Link, and middleware tools.
- Debug and resolve system issues, API failures, and integration bottlenecks through log analysis and system diagnostics.
- Implement best practices in Coupa security and user access management to ensure compliance with organizational policies.
- Perform data extraction, transformation, and loading (ETL) processes for large-scale reporting and analytics.

Functional Responsibilities:
- Work closely with business users to gather requirements for custom reports and system enhancements.
- Support end-to-end Coupa procurement and spend management processes, ensuring smooth operations and adherence to policies.
- Conduct root cause analysis for procurement-related issues, working in coordination with IT teams and business users.
- Configure Coupa workflows, approval hierarchies, and budget controls to align with business needs.
- Perform testing, validation, and troubleshooting of new releases and upgrades.
- Act as a liaison between business users and Coupa Support, logging, tracking, and escalating tickets as required.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.
Required Technical and Professional Experience:
- 3+ years of experience in Coupa procurement applications as a techno-functional consultant.
- Strong hands-on experience with Coupa API integrations, middleware tools (MuleSoft, Boomi, SnapLogic, etc.), and ERP connectivity.
- Expertise in Coupa Advanced Reporting, Insights, and BI tools.
- Knowledge of the Coupa data model, Smart Forms, approvals, and configurable fields.
- Strong SQL and data querying skills to extract, transform, and analyse procurement and financial data.
- Ability to debug Coupa API logs, troubleshoot issues, and coordinate with technical teams.
- Prior experience in P2P, S2P, and contract lifecycle management processes.
- Familiarity with the Coupa Supplier Portal and supplier enablement workflows.
- Strong communication and stakeholder management skills to bridge the gap between technical teams and business users.

Preferred Technical and Professional Experience:
- Experience working with D365 F&O, SAP Ariba, or Oracle Cloud Procurement alongside Coupa.
- Knowledge of RPA (Robotic Process Automation) tools for process automation within Coupa.
- Experience in Coupa Risk Assess, Spend Guard, and risk assessments.
- Certification in Coupa Platform Administration or Technical Integration.
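As an illustration of the Coupa API work above, here is a minimal Python sketch fetching invoices over REST. The instance URL is hypothetical, and it assumes the legacy X-COUPA-API-KEY header; newer Coupa instances authenticate with OAuth 2.0 client credentials instead.

    import requests

    # Hypothetical instance; Coupa exposes REST resources such as /api/invoices.
    url = "https://acme.coupahost.com/api/invoices"

    resp = requests.get(
        url,
        headers={
            "X-COUPA-API-KEY": "***",          # assumption: legacy API-key auth
            "Accept": "application/json",
        },
        params={"limit": 5},
        timeout=30,
    )
    resp.raise_for_status()

    for invoice in resp.json():
        print(invoice.get("id"), invoice.get("total"))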
Posted 2 months ago
5 - 10 years
18 - 33 Lacs
Pune
Hybrid
Job requirements

What do you need to succeed?
- 5+ years of experience within an IT organization; experience in PI/PO or CPI is a MUST.
- Java preferable; passionate about technology and/or programming.
- Professional English and ideally one other European language.
- A strong customer-service-focused personality.
- A continuous improvement mindset.
- Ability to work according to procedures and best practices.
- Capable of working in a diverse, global team that runs 24/7.
- Able to work with a high degree of autonomy.
- Familiarity with basic integration concepts (e.g. APIs, Service-Oriented Architecture, ESB).
- Basic programming experience.
- Working experience with JIRA Service Desk or similar solutions.
- Comfortable with a hybrid work environment based out of our Pune, India office.

Additional desired skills:
- A bachelor's degree in computer science, software engineering, or equivalent.
- Knowledge of any integration platform (like SAP PI, SAP PO, SnapLogic, Dell Boomi, Apache Camel, etc.).
- Analytical skills to identify patterns and improvement opportunities.

What do we offer?
- The chance to gain work experience in a dynamic and inspiring environment and launch your career.
- Plenty of growth opportunities while working in a high-energy and fun environment.
- The opportunity to work on innovative projects with colleagues who are genuinely proud of their contribution.
- Training and mentoring to support your professional development, with a yearly education budget.
- An international atmosphere with a multicultural environment (around 20 nationalities).
- A global, inclusive and diverse working climate within a world-conscious organization.
Posted 2 months ago
5 - 10 years
0 - 3 Lacs
Bengaluru
Work from Office
Experience: 5+ years. Notice period: 30 days max. Bangalore - Work from Office.

- About 5 to 8 years of relevant experience in SnapLogic integration.
- Demonstrated experience with one of the API gateways like Layer7 or Apigee.
- Experience dealing with Web Services standards (WSDL, SOAP, REST).
- Experience with data formats: JSON, XML, HTTP, etc.
- iPaaS/ETL knowledge is a plus.
- Strong API technical development: protocols, auth, implementations, and design patterns.
- Experience working with message brokers like Kafka, ActiveMQ, RabbitMQ, etc.
- Experience in writing SQL queries and DB knowledge (SQL and NoSQL).
- Unit testing (connectivity tests and integration tests) respecting dev guidelines and best practices.
- Experience in operational analytics, building reports for monitoring and troubleshooting using Splunk/ELK.
- Experience in scripting using Python, JavaScript, Java, or PHP.
- Experience with security standards: OAuth, Basic Authentication, JWT, SAML, and OpenID Connect.
- Experience in implementing systems using the Agile Scrum framework.
- Experience working with global teams and strong communication skills.

Desirable skills (good to have):
- Intermediate programming capabilities outside integration (Java/.NET).
- Good understanding of configuration settings.
- Understands requirements quickly and adheres to agile practices.
- Good business communication skills; organized and able to work independently.
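Since this posting leans on OAuth-secured APIs behind a gateway, here is a minimal Python sketch of the OAuth 2.0 client-credentials flow: exchange a client ID and secret for a bearer token, then call a protected endpoint. The token URL, API URL, and credentials are hypothetical.

    import requests

    # Hypothetical gateway endpoints and client credentials.
    TOKEN_URL = "https://gateway.example.com/oauth2/token"
    API_URL = "https://gateway.example.com/v1/orders"

    # Step 1: exchange client credentials for an access token.
    token_resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=("my_client_id", "my_client_secret"),
        timeout=30,
    )
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    # Step 2: call the protected API with the bearer token.
    api_resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    api_resp.raise_for_status()
    print(api_resp.json())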
Posted 2 months ago
4 - 9 years
20 - 25 Lacs
Bengaluru
Hybrid
Job description

Role & responsibilities: What skills you'll need to succeed (must have):
- About 5 to 8 years of relevant experience in SnapLogic integration.
- Demonstrated experience with one of the API gateways like Layer7 or Apigee.
- Experience dealing with Web Services standards (WSDL, SOAP, REST).
- Experience with data formats: JSON, XML, HTTP, etc.
- iPaaS/ETL knowledge is a plus.
- Strong API technical development: protocols, auth, implementations, and design patterns.
- Experience working with message brokers like Kafka, ActiveMQ, RabbitMQ, etc.
- Experience in writing SQL queries and DB knowledge (SQL and NoSQL).
- Unit testing (connectivity tests and integration tests) respecting dev guidelines and best practices.
- Experience in operational analytics, building reports for monitoring and troubleshooting using Splunk/ELK.
- Experience in scripting using Python, JavaScript, Java, or PHP.
- Experience with security standards: OAuth, Basic Authentication, JWT, SAML, and OpenID Connect.
- Experience in implementing systems using the Agile Scrum framework.
- Experience working with global teams and strong communication skills.

Desirable skills (good to have):
- Intermediate programming capabilities outside integration (Java/.NET).
- Good understanding of configuration settings.
- Understands requirements quickly and adheres to agile practices.
- Good business communication skills; organized and able to work independently.

Location: Bangalore
Notice period: Immediate to 45 days
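To complement the message-broker requirement above, here is a minimal Python sketch of a Kafka producer using the confluent-kafka client. The broker address and topic name are hypothetical placeholders; a managed cluster would also require SASL credentials in the config.

    import json
    from confluent_kafka import Producer

    # Hypothetical broker; secured clusters add SASL/TLS settings here.
    producer = Producer({"bootstrap.servers": "broker.example.com:9092"})

    def on_delivery(err, msg):
        # Called once per message when the broker acks (or rejects) it.
        if err is not None:
            print(f"delivery failed: {err}")
        else:
            print(f"delivered to {msg.topic()}[{msg.partition()}]")

    event = {"order_id": 1001, "status": "CREATED"}
    producer.produce("orders", value=json.dumps(event).encode("utf-8"),
                     callback=on_delivery)

    # Block until all queued messages are delivered or expire.
    producer.flush()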
Posted 2 months ago
3 - 5 years
5 - 8 Lacs
Bengaluru
Work from Office
About your team
The GPS Data Warehouse & Reporting team comprises around 100 people whose role is to develop and maintain the data warehouse and reporting platforms that we use to administer the pensions and investments of our workplace and retail customers across the world. In doing this we are critical to the delivery of our core product and value proposition to these clients, today and in the future.

About your role
The role will focus on functional testing, data test automation, and collaboration with both developers and business analysts to identify functional and technical testing areas, as well as address key capability testing gaps. The role will involve using and enhancing existing automation frameworks, or developing a new one, to improve the speed of change. The key outcomes shall include (but are not limited to) creating and maintaining testing-related services/frameworks and a functional automation suite, while keeping the DevOps principles of speed and quality in mind.

About you
- Experience in working with Snowflake and an ETL tool like SnapLogic/Informatica.
- Experience in Python.
- Experience in designing DB validation test cases for cloud platforms.
- Expertise in writing complex queries for database testing and data mining.
- Good understanding of database concepts, such as data models, schemas, tables, queries, views, stored procedures, triggers, indexes, constraints, and transactions.
- Expertise in data warehouse ETL testing and concepts.
- Well versed in overall cloud terminology, tools, and usage.
- Experience in at least one cloud-based tool for a database platform.
- Experience of test automation framework design for the service layer/API.
- Relevant experience in application development tools and frameworks like Rest Assured, Cucumber, etc.
- Experience of using source code management tools, e.g. GitHub.
- Knowledge of development methodologies such as Scrum, Agile, and Kanban.
- Rich experience in engineering skills, CI/CD, and build/deployment automation tools.
- Appreciation of the business principles involved in a project.
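A minimal pytest sketch of the DB validation testing this role centers on: reconcile a row count between a source table and its warehouse copy. The connection strings and table names are hypothetical; it assumes SQLAlchemy plus the relevant database drivers (psycopg2 and snowflake-sqlalchemy).

    from sqlalchemy import create_engine, text

    # Hypothetical connection strings for the source system and the warehouse.
    SOURCE_URL = "postgresql+psycopg2://qa:***@source-db:5432/pensions"
    TARGET_URL = "snowflake://qa:***@my_account/ANALYTICS/PUBLIC?warehouse=QA_WH"

    def scalar(url: str, sql: str) -> int:
        # Open a connection, run one scalar query, and return the value.
        with create_engine(url).connect() as conn:
            return conn.execute(text(sql)).scalar_one()

    def test_member_row_counts_match():
        # ETL completeness check: every source row should land in the target.
        src = scalar(SOURCE_URL, "SELECT COUNT(*) FROM members")
        tgt = scalar(TARGET_URL, "SELECT COUNT(*) FROM members")
        assert src == tgt, f"row count mismatch: source={src} target={tgt}"

The same pattern extends to checksum and null-rate comparisons, which makes it a natural seed for the automation framework the posting mentions.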
Posted 2 months ago
5 - 10 years
15 - 30 Lacs
Bengaluru
Remote
About us:
Cloud Raptor is one of Australia's fastest growing and most dynamic Cloud Service advisory partners. With global Big 4 consulting DNA, our key focus is to bring cool, world-class Cloud solutions and a customer experience that is second to none. Experience and people are at the heart of everything we do! We are a specialised technology services consultancy working across Australia, APAC, EMEA and NA. We are headquartered in Australia; however, our offshore development centre is in Bangalore. Cloud Raptor is a Salesforce, MuleSoft, Freshworks, and ServiceNow partner. A multi-technology consultancy, we specialize in Cloud, Data, Cyber security, programming, SaaS, Infra, testing, and project services. We’re currently working with several major brands both locally and internationally to help them improve the way they work by leveraging the power and pushing the boundaries of what Cloud technology can achieve. Not only do we provide amazing Cloud services capability, but we are also extremely focused on providing a work environment that is supportive, nurturing, safe and above all equal. Everyone deserves to be recognised for their hard work and an opportunity to progress in their career. If you’re up to the challenge and don’t mind hanging out with some cool people while providing awesome solutions, Cloud Raptor is the place for you!

Job Description:
We are seeking a dynamic and experienced SnapLogic Solution Designer to join our team. The ideal candidate will have a strong background in ETL development, data integration, and cloud services. As a SnapLogic Solution Designer, you will be responsible for designing, developing, and implementing integration solutions using the SnapLogic platform.

Key Responsibilities:
- Design and develop integration solutions using SnapLogic to meet business requirements.
- Collaborate with stakeholders to understand business needs and translate them into technical solutions.
- Develop, document, and test ETL interfaces and data pipelines.
- Optimize and troubleshoot existing SnapLogic integrations.
- Provide technical expertise and support for SnapLogic-related projects.
- Ensure best practices in ETL development and data integration.
- Collaborate with cross-functional teams to deliver high-quality solutions.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of experience in ETL development and data integration.
- Proficiency in SnapLogic and other ETL tools.
- Experience with cloud services such as AWS, Azure, or Google Cloud.
- Strong SQL and PL/SQL skills.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Experience with other integration tools such as Informatica or Talend.
- Knowledge of data warehousing concepts and best practices.
- Certification in SnapLogic or related technologies.
Posted 2 months ago
2 - 4 years
4 - 6 Lacs
Bengaluru
Work from Office
About your team
The Data Platform team manages the products and technical infrastructure that underpin the use of data at Fidelity: databases (Oracle, SQL Server, PostgreSQL), data streaming (Kafka, SnapLogic), data security, data lake (Snowflake), data analytics (PowerBI, Oracle Analytics Cloud), data management, and more. We provide both cloud-based and on-premises solutions as well as automations and self-service tools. The company predominantly uses AWS, but we also deploy on Azure and Oracle Cloud in certain situations.

About your role
Your role will be to use your skills (and the many skills you will acquire on the job) to develop our cloud solutions for the products and services we look after. We offer a complete service, so this includes support work as well as technical development work. Our goal is to provide the best possible service to our customers, and so you will also be involved in technical work to streamline our processes and provide self-service options where possible. We work in a highly regulated environment, and so all solutions must be secure, highly available and compliant with policies.

About you
You will be a motivated, curious and technically savvy person who is always collaborative and keeps the customer in mind with the work you perform. Required skills are:
- Practical experience of implementing simple and effective cloud and/or database solutions
- Practical working knowledge of fundamental AWS concepts, such as IAM, networking, security, compute (Lambda, EC2), S3, SQS/SNS, scheduling tools
- Python, Boto and SQL programming (JavaScript a bonus)
- Experience of delivering change through CI/CD (and ideally Terraform)
- Ability to work on tasks as a team player, share knowledge and deal effectively with people from other company departments
- Excellent verbal and written communication in English
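For the AWS skills listed above (Lambda, S3, SQS), here is a minimal boto3 sketch of a Lambda handler that reads metadata for an object uploaded to S3 and forwards a notification to SQS. The event shape follows the standard S3-trigger pattern; the bucket and queue names are hypothetical.

    import json
    import boto3

    s3 = boto3.client("s3")
    sqs = boto3.client("sqs")

    # Hypothetical queue consumed by a downstream processing service.
    QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/data-events"

    def handler(event, context):
        # Standard S3 event: one record per created object.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]

            sqs.send_message(
                QueueUrl=QUEUE_URL,
                MessageBody=json.dumps({"bucket": bucket, "key": key, "size": size}),
            )
        return {"processed": len(event["Records"])}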
Posted 2 months ago
6 - 11 years
15 - 20 Lacs
Mumbai
Work from Office
1. Architecture Review: Conduct comprehensive assessments of existing Data, AI, and Automation architecture landscapes across various financial institutions and regulators. Identify gaps compared to leading architecture patterns and best practices.
2. Gap Analysis and Pitch Development: Construct detailed pitches for transformation programs aimed at bridging identified gaps. Articulate the value proposition of proposed solutions to stakeholders.
3. Technical Solution Architecture: Lead the formulation of technical solution architectures for proposals and pitches. Ensure that solutions are innovative, scalable, and aligned with industry standards, while also establishing differentiators against competitors.
4. Technology Stack Evaluation: Evaluate current technology stacks used by banks and assist in selecting appropriate products and partner ecosystems that enhance the overall architecture.
5. Execution Oversight: Oversee the implementation of solution architectures during the execution phase. Review progress against architectural designs and ensure adherence to established guidelines and standards.
6. Stakeholder Collaboration: Collaborate with cross-functional teams, including business analysts, developers, and project managers, to ensure alignment between business needs and technical solutions.
7. Documentation and Reporting: Maintain clear documentation of architectural designs, decisions made, and execution progress. Provide regular updates to stakeholders on project status and any challenges encountered.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
1. Educational Background: A bachelor's or master's degree in computer science, information technology, engineering, or a related field is required.
2. Experience: At least 15 years of experience in IT consulting or a related field, with a strong focus on conceptualizing solution architecture in the banking sector.
3. Banking Knowledge: Expertise in leading or crafting analytical solutions or products within the banking sector.
4. Skills:
   a. Proficiency in designing scalable architectures that leverage Data, AI, and automation technologies.
   b. Strong understanding of cloud computing platforms (e.g., AWS, Azure, GCP) and their application in banking solutions.
   c. Experience with various architectural scenarios in banking:
      i. Very large-scale data management and high-performing solution architectures.
      ii. Low latency: near-real-time and real-time processing.
      iii. High reliability.
   d. Familiarity with programming languages, API libraries and communication protocols in banking.
5. Professional Skills:
   a. Excellent skills with a strong ability to identify gaps in existing architectures.
   b. Strong communication skills to effectively convey complex technical concepts to non-technical stakeholders.
   c. Capability to assess, guide and course-correct engagement execution decisions pertaining to solution architecture.
   d. Understanding of regulatory guidelines on banking system interoperability and security.

Preferred technical and professional experience:
1. Familiarity with emerging trends in banking such as digital banking, embedded banking, and regulatory compliance requirements.
2. Certifications: Relevant certifications such as TOGAF (The Open Group Architecture Framework), AWS Certified Solutions Architect, or similar credentials that demonstrate expertise in architecture design.
3. Experience with Analytical Solutions: Prior experience in leading analytical solutions or products within the banking industry is highly desirable.
4. Understanding of Security Principles: Knowledge of security frameworks relevant to banking applications, to ensure compliance with regulatory standards.
Posted 2 months ago
3 - 7 years
14 - 22 Lacs
Pune
Hybrid
In this role you will be expected to deliver software-related solutions and will be responsible for taking client specifications and translating them into technical designs. You will be required to work with integration technologies and platforms (like SAP, SnapLogic, etc.); training will be provided. You will also take full ownership of the end-to-end development activity (designing, coding, testing, debugging, deployment), create documentation for the solution, and perform maintenance. You will have the opportunity to take on consulting responsibilities, such as ensuring development deadlines are met, assisting clients in implementation tasks, and mentoring junior developers.

Requirements:
- 3-5 years of experience within an IT software development environment in Java programming; JavaScript, Python and/or C# will be an advantage.
- Good knowledge of object-oriented programming concepts.
- Experience working with an IDE, such as Eclipse or IntelliJ.
- Experience in working with Maven, JUnit, Log4j.
- Knowledge of the Apache Camel framework is a plus.
- Experience with JIRA, GitHub, Jenkins.
- Comfortable with code quality and vulnerability assessment tools like SonarQube and Grype.
- Experience with Web Services and API technologies (REST, SOAP, OAuth, SSL).
- Good understanding of CI/CD processes.

What soft skills are we looking for?
- Quick learner, able to adapt to new tools and technologies.
- Team player with good technical, analytical, and communication skills; friendly.
- A bright mind and the ability to understand a complex platform.
- Ability to understand technical/engineering concepts and to learn integration product functionality and applications.
- Demonstrated user-focused technical writing ability.
- Must be able to communicate complex technical concepts clearly and effectively.
- Strong analytical and problem-solving skills.
- Ability to work independently in a dynamic environment.
- Ability to work on multiple complex projects simultaneously.
- Strong interpersonal communication skills; communicates effectively in one-to-one and group situations.

Additional desired skills:
- At least a bachelor's degree in computer engineering or a related field.
- Proficient in English.
- Affinity with an integration platform/software like Boomi, SAP Cloud Integration, or SnapLogic is good to have.
- Above all, an excellent attitude towards fellow colleagues and a willingness to learn new things, in terms of both technology and life challenges.
Posted 2 months ago
2 - 5 years
4 - 7 Lacs
Mohali
Work from Office
We are seeking an experienced and talented Data Engineer to join our growing team. As a Data Engineer, you will be responsible for designing, implementing, and optimizing data solutions leveraging Snowflake. The ideal candidate has a deep understanding of data engineering principles, expertise in Snowflake architecture, and a passion for building scalable and efficient data pipelines.
Responsibilities:
Design and implement scalable data architectures using Snowflake, ensuring optimal performance, reliability, and scalability.
Define best practices for data modeling, storage, and processing within the Snowflake environment.
Develop, implement, and maintain robust ETL processes using Snowflake's native features and tools, in collaboration with integration developers using an iPaaS such as SnapLogic.
Collaborate with data consumers to understand data requirements, ensure the availability of high-quality, well-organized data, and deliver effective solutions.
Optimize and tune Snowflake data warehouses to achieve optimal query performance and resource utilization.
Design and implement effective data models that align with business requirements and industry best practices.
Ensure data models are optimized for Snowflake's unique capabilities, such as automatic clustering and metadata indexing.
Implement and enforce data security measures in accordance with industry standards and organizational policies.
Develop and maintain automated processes for data pipeline orchestration, monitoring, and alerting.
Proactively identify and address performance bottlenecks and data inconsistencies.
Create and maintain comprehensive documentation for data engineering processes, workflows, and Snowflake configurations.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Proven experience as a Data Engineer with a focus on Snowflake.
Strong SQL proficiency and expertise in designing and optimizing complex queries (a minimal connection sketch follows after this posting).
In-depth knowledge of Snowflake architecture, features, and best practices.
Experience with scripting languages (Python or equivalent) for automation and data manipulation.
Familiarity with data warehousing concepts, cloud computing, and distributed systems.
Preferred Skills:
Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced).
Experience with other cloud platforms (AWS, Azure, GCP).
Familiarity with data versioning and lineage tracking.
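To make the Snowflake requirement concrete, here is a minimal connection-and-query sketch over JDBC. The account identifier, credentials, and warehouse/database names are placeholders, not details from the posting; running it requires the snowflake-jdbc driver on the classpath.

```java
// Minimal sketch: connect to Snowflake over JDBC and run a simple query.
// All connection values below are hypothetical placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class SnowflakeQueryDemo {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "DEMO_USER");          // placeholder
        props.put("password", "DEMO_PASSWORD");  // placeholder
        props.put("warehouse", "DEMO_WH");       // placeholder
        props.put("db", "DEMO_DB");              // placeholder
        props.put("schema", "PUBLIC");

        // Standard Snowflake JDBC URL form; substitute a real account identifier.
        String url = "jdbc:snowflake://<account_identifier>.snowflakecomputing.com/";
        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT CURRENT_WAREHOUSE(), CURRENT_DATABASE()")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " / " + rs.getString(2));
            }
        }
    }
}
```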
Posted 2 months ago
5 - 10 years
14 - 22 Lacs
Pune
Hybrid
SAP PI / PO / CPI - Customer Support
We are looking for experienced Enterprise Integration Consultants in Customer Support to join our team. This is a shift-based role.
The ideal candidate has:
Excellent English communication skills.
Strong knowledge of integration principles and the consultancy skills to translate business requirements into IT requirements.
Hands-on experience integrating SAP and non-SAP systems in A2A and B2B scenarios.
Extensive experience integrating SAP and non-SAP systems in A2A and B2B scenarios using SAP Integration Suite or Cloud Integration (CPI).
In-depth technical, functional, and architectural expertise in integrating SAP PI/PO products with non-SAP applications using technologies including, but not limited to, ALE-IDoc, EDI, RFC, XI, HTTP, IDoc, JDBC, File/FTP, SOAP, Mail, REST, and JMS, using SAP XI/PI/PO or CPI.
Expert-level skills in programming user-defined functions (UDFs) in Message Mapping, including lookup implementations in UDFs or via the standard functions RFC Lookup, SOAP Lookup, and JDBC Lookup; proficient with ccBPM and RWB (a minimal UDF sketch follows after this posting).
Extensive knowledge of Java, WS-RM, service-oriented architecture (SOA), NWA, iFlows, NW BRM, and BPM. Strong experience using NWDS for development.
Connectivity using ABAP proxies: inbound and outbound.
Solid middleware knowledge and web service skills.
Good understanding of REST APIs and web services.
Experience integrating with the main SAP backend systems (SAP ERP, SAP S/4HANA, SAP S/4HANA Cloud).
Requirements to succeed in this role:
Experience using SAP PI, SAP PO, or SAP Cloud Integration (CPI).
Quick learner, able to adapt to new tools and technologies and evaluate their applicability.
Team player with good technical, analytical, and communication skills and a client-driven mindset.
A bright mind and the ability to understand a complex platform.
Ability to understand technical/engineering concepts and to learn integration product functionality and applications; demonstrated user-focused technical writing ability.
Must be able to communicate complex technical concepts clearly and effectively.
Strong analytical and problem-solving skills.
Ability to work independently in a dynamic environment.
Ability to work on multiple complex projects simultaneously.
Strong interpersonal communication skills; communicates effectively in one-to-one and group situations.
At least three years of previous experience in a similar role.
Additional desired skills:
At least a bachelor's degree in computer engineering or a related field.
Experience with any API Management Platform.
Experience with any other non-SAP iPaaS (e.g., Boomi, SnapLogic, Apache Camel).
Good understanding of CI/CD concepts.
Fluency in spoken and written English.
Affinity with an integration platform such as Boomi, SAP Cloud Integration, or SnapLogic is desirable.
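For orientation, here is a minimal sketch of the kind of message-mapping UDF mentioned above. It is a hedged illustration: the method name and padding logic are invented for the example, and the Container class ships with SAP PI/PO, whose exact package varies by release, so treat the import as indicative rather than exact.

```java
// Minimal message-mapping UDF sketch: normalizes a material number by
// trimming whitespace and left-padding it to 10 digits.
// NOTE: the Container class is provided by SAP PI/PO; the package name
// (e.g., tf3 vs. tf7) depends on the release, so this import is indicative.
import com.sap.aii.mappingtool.tf7.rt.Container;

public class MappingUdfSketch {

    // In the graphical mapping editor this would be defined as a simple UDF
    // with one String argument; Container exposes the mapping trace.
    public String normalizeMaterialNumber(String input, Container container) {
        String trimmed = input == null ? "" : input.trim();
        StringBuilder padded = new StringBuilder(trimmed);
        while (padded.length() < 10) {
            padded.insert(0, '0'); // left-pad with zeros to 10 characters
        }
        container.getTrace().addInfo("Normalized value: " + padded);
        return padded.toString();
    }
}
```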
Posted 2 months ago