Key Responsibilities:
* Integration with other SAP solutions: Ability to integrate SAP BRIM with other SAP and non-SAP systems. Preferably a middleware consultant (PI, CPI, Dell Boomi, etc.) who understands the basics of integration.
* Programming and Configuration: Convergent Mediation (CM) uses APL (Analysis Programming Language), which is similar to Java/JavaScript. APL scripts define processing rules, transformations, and business logic within the mediation process.
* Data and Workflow Management: CM's primary unit of functionality is the workflow; it runs multiple workflows built to handle huge volumes of data.
* Troubleshooting & Optimization: Skilled in identifying and resolving issues in end-to-end scenarios/processes while optimizing CM runtime and memory utilization.
* Communication & Documentation: Strong ability to collaborate with cross-functional teams and to document solutions and processes clearly for everyone involved.
Job Summary:
We are looking for a talented K2 Developer with experience in both K2 Five and K2 Cloud to design and implement business process applications. The ideal candidate will have a strong background in workflow automation, system integration, and enterprise-grade form development, with the ability to deliver scalable and maintainable solutions.

Key Responsibilities:
* Design and develop K2 SmartForms, Views, SmartObjects, and Workflows
* Provide technical solutioning and design for K2 applications
* Integrate K2 solutions with SQL Server, Oracle, SharePoint, and SAP
* Create and manage REST APIs, including Swagger definition files
* Deploy K2 packages across multiple environments
* Implement OData and other web service integrations
* Work with PowerApps and Power Automate to support business process automation
* Collaborate with cross-functional teams in Agile development environments
* Troubleshoot and resolve issues related to K2 workflows and system integrations

Required Skills:
* Strong experience with K2 Five and K2 Cloud
* Hands-on skills in K2 SmartForms, Views, Workflows, and SmartObjects
* Proficient in SQL (MS SQL Server) and working knowledge of Oracle
* Experience in REST API implementation and Swagger file creation
* Integration experience with SharePoint, SAP, and other third-party systems using OData, REST APIs, and web services
* Familiarity with PowerApps and Power Automate
* Strong understanding of HTML, CSS, and JavaScript for form customization
* Knowledge of Agile methodologies

Preferred Qualities:
* Self-driven, collaborative, and solution-oriented mindset
* Strong problem-solving skills and ability to handle production-level issues
* Excellent communication and interpersonal skills

Why Join Us:
* Be part of a global automotive technology initiative
* Work with cutting-edge process automation platforms
* Opportunity for professional growth and cross-functional collaboration
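To give a flavour of the REST/OData integration work this role involves, here is a minimal Java sketch that queries a SmartObject-style OData endpoint over HTTP. The endpoint URL, the `Orders` entity name, and the Basic-auth credentials are illustrative assumptions, not taken from K2 documentation; a real K2 deployment dictates the actual paths and authentication scheme.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class K2ODataProbe {
    public static void main(String[] args) throws Exception {
        // Hypothetical OData endpoint exposing a SmartObject; the real URL
        // depends on how the K2 site and SmartObject are configured.
        String endpoint = "https://k2.example.com/api/odata/v4/Orders?$top=5";
        String credentials = Base64.getEncoder()
                .encodeToString("user:password".getBytes());

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Authorization", "Basic " + credentials)
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("Status: " + response.statusCode());
        System.out.println(response.body());
    }
}
```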
Role: DB2 Database Administrator (z/OS)
Location: BLR, HYD, Chennai, Pune, Noida (work from home)
Budget: 20-25 LPA
Experience: 5+ years (4+ years relevant experience)
Notice Period: Immediate to 15 days
Payroll: Quess Corp; C2H period with Kyndryl, 18 months

Job Description:
The client's Tech Management SAP team provides support for the client's global SAP systems running on the Db2 z/OS platform. As a member of the TM-SAP/MF1 team with a special focus on Db2, the candidate will:
* Participate in projects deploying mySAP/NetWeaver applications within the client's z/OS Db2 environment
* Install, use, and maintain Db2 software and supporting Db2 utilities
* Build and maintain Db2 subsystems
* Provide Db2 and SAP Basis support as needed to meet service level agreements for our development and production instances of Db2
* Diagnose and tune performance issues
* Participate in design, setup, and testing of backup, recovery, and DR solutions
* Participate in root cause analysis of recurring problems
* Create documentation for problem resolutions to be used by other support group members
* Participate in weekly status reporting for issue identification and escalation
* Participate in on-call rotation to provide 24x7 support
* Continually look for improvement opportunities in the Db2 space and recommend/implement as appropriate

Qualifications and Skills:
* Experience in Db2 technology, including: DB2 Version 13.01.0002 [DB Release]; Distributed Data Facility (DDF), DB2 Connect, Data Sharing/Sysplex; Db2 utilities (IBM or 3rd party) and monitoring tools
* Basic z/OS mainframe skills
* Understanding of how infrastructure relates across different technologies
* Ability to participate in root cause analysis of technical issues
* Logical, creative, problem-solving, and human relations skills

If interested, please share your resume to priya@lionandelephants.com & abinaya@lionandelephants.com along with the following details:
* Notice Period
* Current CTC
* Expected CTC
* Total & relevant experience
* Any offers in hand

Thanks & Regards,
LION & ELEPHANTS CONSULTANCY PVT LTD
SINGAPORE | INDIA
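As an illustration of the remote access that Distributed Data Facility (DDF) and DB2 Connect enable, here is a minimal JDBC connectivity check against a Db2 for z/OS subsystem. It assumes the IBM Data Server JDBC driver (db2jcc4.jar) is on the classpath; the host, port, location name, and credentials are placeholders you would take from the subsystem's DDF configuration (e.g., the output of -DISPLAY DDF).

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Db2ConnectivityCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host, port, and location name; a real subsystem's DDF
        // settings (TCP/IP port, LOCATION) come from -DISPLAY DDF output.
        String url = "jdbc:db2://mainframe.example.com:5021/DB2PLOC";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT CURRENT TIMESTAMP FROM SYSIBM.SYSDUMMY1")) {
            while (rs.next()) {
                System.out.println("Db2 says: " + rs.getTimestamp(1));
            }
        }
    }
}
```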
Urgent Hiring: Payments Business Analyst (Techno-Functional)
Location: India
Employment Type: Full-time | Onsite

About the Role:
We are urgently seeking an experienced Techno-Functional Payments Business Analyst to join a dynamic team working on cutting-edge payment platforms. This role is ideal for professionals with deep expertise in the payments domain who enjoy bridging the gap between technology and business.

Key Responsibilities:
* Configure payment processors and support end-to-end payment processing.
* Conduct detailed requirements analysis and translate it into solution designs.
* Serve as SME for production issues, root cause analysis, and issue resolution.
* Collaborate with cross-functional stakeholders across technology and business units.
* Work within Agile/Scrum delivery frameworks.

Must-Have Skills:
* 5+ years of experience in the payments domain.
* Strong expertise in GPP / FIS OPF.
* Solid understanding of SWIFT (MT/MX) and ISO 20022 messaging standards.
* Knowledge of RTGS, ACH, and domestic/international clearing systems.
* Hands-on experience in Java development and API integration (REST/SOAP).
* Familiarity with microservices architecture.
* Practical exposure to Agile/Scrum methodologies.

Bonus Skills (Nice to Have):
* Experience with cloud platforms (AWS, Azure, GCP).
* Exposure to Kafka, Docker, and Kubernetes.
* Understanding of payment fraud, AML, and compliance systems.

Ideal Candidate Profile:
* Payments domain SME with both business and technical acumen.
* Strong problem-solving skills and ability to work independently.
* Comfortable in a fast-paced and evolving environment.
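For a sense of the techno-functional side, here is a minimal Java sketch that reads the group header of an ISO 20022 pacs.008 (FI-to-FI customer credit transfer) message. The embedded XML is a heavily truncated skeleton for illustration only; real messages carry full settlement, amount, and agent detail blocks.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class Pacs008Peek {
    // Heavily truncated pacs.008 skeleton, for illustration only.
    private static final String SAMPLE =
        "<Document xmlns='urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08'>"
        + "<FIToFICstmrCdtTrf>"
        + "<GrpHdr><MsgId>ABC123456</MsgId><NbOfTxs>1</NbOfTxs></GrpHdr>"
        + "</FIToFICstmrCdtTrf>"
        + "</Document>";

    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true); // ISO 20022 payloads are namespaced
        Document doc = factory.newDocumentBuilder()
                .parse(new ByteArrayInputStream(SAMPLE.getBytes(StandardCharsets.UTF_8)));

        // Look elements up by local name, regardless of the message version namespace.
        String msgId = doc.getElementsByTagNameNS("*", "MsgId").item(0).getTextContent();
        String nbOfTxs = doc.getElementsByTagNameNS("*", "NbOfTxs").item(0).getTextContent();
        System.out.println("MsgId=" + msgId + ", NbOfTxs=" + nbOfTxs);
    }
}
```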
Role: Technical Writer
Location: Chennai, Hybrid
Budget: Up to 15 LPA (hike based on current CTC)
Experience: 4-8 years
Notice Period: Immediate to 15 days
Job Type: Permanent with Quess Corp
No. of positions: 3
Interview process: L1 virtual interview & L2 face-to-face interview at client location

Job Description:
We are looking for a Technical Writer with proven experience in creating and maintaining software documentation. The ideal candidate should have:
* Strong skills in authoring user manuals, installation guides, API documentation, and online help systems
* Ability to collaborate with software developers, QA, and product managers to gather and structure technical information
* Experience with documentation tools such as MadCap Flare, RoboHelp, Confluence, or similar
* Working knowledge of software development processes and the ability to understand technical concepts
* Excellent written and verbal communication skills
* Familiarity with version control tools (e.g., Git) and Agile methodologies is a plus

If interested, please share your resume to priya@lionandelephants.com & abinaya@lionandelephants.com along with the following details:
* Notice Period
* Current CTC
* Expected CTC
* Total & relevant experience
* Any offers in hand

Thanks & Regards,
LION & ELEPHANTS CONSULTANCY PVT LTD
SINGAPORE | INDIA
Job Description:
Sr. Technical Lead with expertise in React.js, Node.js, PostgreSQL, Azure Cloud, Java, Spring Boot, and DevOps. This role involves designing scalable, high-performance solutions, leading technical teams, and optimizing cloud and DevOps workflows to improve efficiency. The ideal candidate will demonstrate a strong drive for innovation, process automation, and technical leadership, while fostering business relationships through effective communication and collaboration.

Key Responsibilities:
1. Architecture & Innovation
   1. Architect scalable, secure, and high-performance solutions using React.js, Node.js, PostgreSQL, Azure Cloud, Java, Spring Boot, and DevOps.
   2. Drive innovation in architecture, development practices, and automation strategies.
   3. Design and implement containerized applications using Docker and orchestrate them with Kubernetes.
   4. Leverage Azure services for cloud-native application design, hosting, monitoring, and scaling.
   5. Conduct architecture reviews, code reviews (PR reviews), and performance tuning.
2. Development & Integration
   1. Lead full-stack development efforts using Java (Spring Boot), Node.js, and React.js/Angular.js.
   2. Design and optimize PostgreSQL schemas and queries; integrate with NoSQL databases as needed.
   3. Implement CI/CD pipelines and DevOps workflows using Azure DevOps, GitHub Actions, and container technologies.
   4. Champion best practices in software engineering, testing, and deployment.
3. Cloud & Infrastructure
   1. Design and manage Azure infrastructure using ARM templates, Bicep, or Terraform.
Location: Noida

Sol. Arch. SAP SD
Bill Rate: Max 28 LPA
Experience: 10+ years (relevant 8+ years with SD; must have experience with an Indian implementation project)

Sol. Arch. SAP FI
Bill Rate: Max 28 LPA
Experience: 12+ years (relevant 8+ years with FI)

SAP SD Solution Architect Responsibilities:
* Lead the design and implementation of SAP SD solutions for large-scale transformation projects.
* Analyze business requirements and translate them into effective SAP solutions.
* Ensure end-to-end integration with other SAP modules and external systems.
* Collaborate closely with cross-functional teams, including development, testing, and project management.
* Provide guidance on best practices, system architecture, and SAP capabilities.

SAP FI Solution Architect Responsibilities:
* Design, configure, and implement SAP FI solutions tailored to business needs.
* Ensure compliance with statutory and regulatory requirements specific to India.
* Lead financial integration with SD, MM, and other SAP modules.
* Act as the subject matter expert in financial reporting and process optimization.
* Work with stakeholders to support change management and user training initiatives.
Job Title: Big Data Engineer / Developer
Location: Chennai, Hyderabad, Pune, Bangalore, Delhi / Onsite - Hybrid
Employment Type: Full-time / Permanent
Experience: 5-10 years

Job Description:
We are looking for skilled Big Data Engineers using Java Spark, with 5-10 years of experience in Big Data / legacy platforms, who can join immediately. Desired candidates should have experience in the design, development, and optimisation of real-time and batch data pipelines in an enterprise-scale Big Data environment. You will work on building scalable, high-performance data processing solutions, integrating real-time data streams, and building a reliable data platform. Strong troubleshooting, performance tuning, and collaboration skills are key for this role.

Key Responsibilities:
* Develop data pipelines using Java Spark and Kafka.
* Optimize and maintain real-time data pipelines and messaging systems.
* Collaborate with cross-functional teams to deliver scalable data solutions.
* Troubleshoot and resolve issues in Java Spark and Kafka applications.

Qualifications:
* Experience in Java and Apache Spark is a must.
* Knowledge and hands-on experience with distributed computing, real-time data streaming, and big data technologies.
* Strong problem-solving and performance optimisation skills.
* Looking for immediate joiners.
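As a sketch of the core responsibility, below is a minimal Java Spark Structured Streaming job that reads a Kafka topic and writes to the console. The broker address and topic name are placeholders, and the job assumes the spark-sql-kafka connector is on the classpath; a production pipeline would write to a durable sink with checkpointing rather than the console.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToConsole {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-pipeline-sketch")
                .getOrCreate();

        // Read a Kafka topic as an unbounded streaming DataFrame; broker
        // address and topic name are placeholders.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker:9092")
                .option("subscribe", "events")
                .load()
                .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

        // A real pipeline would use a durable sink (Parquet, another topic);
        // console output keeps the sketch self-contained.
        StreamingQuery query = events.writeStream()
                .format("console")
                .outputMode("append")
                .start();

        query.awaitTermination();
    }
}
```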
Job Description:
Role 1: We are seeking experienced Java Microservices Developers with strong expertise in building scalable microservices using Spring Boot. The ideal candidate will have experience working in cloud environments (preferably PCF), DevOps tooling, and CI/CD pipelines.
Role 2: We are looking for a Java Microservices Lead to guide a team of developers and drive microservices-based application development. The candidate must have deep technical expertise in Java, Spring Boot, and cloud deployment, along with a strong understanding of DevOps practices.

Key Responsibilities:
* Develop and maintain microservices using Java and Spring Boot.
* Implement and manage CI/CD pipelines.
* Work with DevOps tools and Unix shell scripting.
* Deploy services to cloud environments (preferably PCF).
* Configure services using YAML and scripting.
* Follow best practices like DDD, BDD, and TDD.
* Troubleshoot and resolve application issues.
* Integrate with external systems and manage authentication using OAuth.
* Continuously improve monitoring and alerting systems.

Mandatory Skills:
* Java, Spring, Spring Boot
* RESTful Web Services / Microservices
* CI/CD, DevOps tools and processes

Desired Skills:
* ReactJS, AWS, PCF, Test-Driven Development

Domain: Financial Services
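A minimal Spring Boot microservice of the kind these roles build might start like the sketch below. The service name, endpoint path, and hard-coded response are illustrative assumptions; a real service would add persistence, validation, OAuth-protected security, and externalized YAML configuration.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

@SpringBootApplication
@RestController
public class AccountServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(AccountServiceApplication.class, args);
    }

    // Minimal REST endpoint; the hard-coded payload stands in for a call
    // into a service/repository layer.
    @GetMapping("/accounts/{id}")
    public Map<String, Object> getAccount(@PathVariable String id) {
        return Map.of("id", id, "status", "ACTIVE");
    }
}
```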
Job Description Shared with Candidates:

Role Overview:
The Salesforce Developer will be responsible for designing, developing, and maintaining applications on the Salesforce platform using Apex, Lightning Web Components (LWC), Aura, and integrations with external systems. The role involves working with Sales/Service Cloud, implementing scalable solutions, and collaborating with global teams.

Key Responsibilities:
* Develop solutions using Apex, LWC, and Aura components
* Customize and configure Salesforce Sales/Service Cloud
* Implement REST/SOAP API integrations with third-party systems
* Debug and resolve issues using logs and system tools
* Conduct unit testing and participate in code reviews
* Stay up to date with the latest Salesforce features and releases
* Collaborate with global stakeholders and cross-functional teams

Candidate Profile:
* 4+ years of hands-on Salesforce development experience
* Strong in Apex, LWC, Aura, Flow, and API integrations
* Experience with REST/SOAP, JSON, XML, and integration patterns
* Good understanding of Salesforce architecture, platform limits, and deployment tools
* Ability to work as an individual contributor
* Salesforce certifications (Platform Developer I/II) preferred
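To illustrate the integration side, here is a minimal Java client that runs a SOQL query through the standard Salesforce REST API (`/services/data/vXX.X/query`). The instance URL and access token are placeholders; a real integration would first obtain the token via one of Salesforce's OAuth flows.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class SalesforceQuery {
    public static void main(String[] args) throws Exception {
        // Instance URL, API version, and access token are placeholders; a real
        // integration obtains the token via an OAuth flow first.
        String instance = "https://example.my.salesforce.com";
        String token = "REPLACE_WITH_ACCESS_TOKEN";
        String soql = URLEncoder.encode(
                "SELECT Id, Name FROM Account LIMIT 5", StandardCharsets.UTF_8);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instance + "/services/data/v58.0/query?q=" + soql))
                .header("Authorization", "Bearer " + token)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```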
Key Responsibilities:
* Design, develop, and maintain large-scale distributed data pipelines using Apache Spark and Scala.
* Write clean, efficient, and maintainable Scala code adhering to industry best practices.
* Implement Spark Core, Spark SQL, and Spark Streaming modules for real-time and batch data processing.
* Collaborate with cross-functional teams to gather and understand data processing requirements.
* Optimize performance of complex queries and processing logic in Hadoop ecosystems.
* Develop and manage workflows using UNIX shell scripting, Hive, Sqoop, and Impala.
* Participate actively in Agile/Scrum ceremonies: daily stand-ups, sprint planning, retrospectives, etc.
* Provide production support and maintenance for existing applications, ensuring high availability and performance.
* Conduct root cause analysis and resolve data pipeline issues in collaboration with upstream/downstream teams.
* Stay updated on the latest Big Data technologies and contribute to continuous improvement initiatives.

Mandatory Skills:
* Strong proficiency in Scala with 8+ years of hands-on experience.
* Solid experience with Apache Spark for building distributed data processing applications.
* In-depth understanding of data structures, algorithms, and design patterns.
* Strong command of SQL and working knowledge of NoSQL databases.

Desired Skills:
* Spark Streaming
* Hadoop and HDFS
* Hive, Sqoop, Impala
* UNIX shell scripting
* Familiarity with ETL concepts and data warehousing
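The posting asks for Scala, but the Spark DataFrame API is nearly identical across languages; the batch-processing sketch below uses Java to stay consistent with the other examples in this digest. The input path, column names, and output location are assumptions for illustration.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class BatchAggregation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("batch-aggregation-sketch")
                .master("local[*]") // local mode for the sketch; drop on a cluster
                .getOrCreate();

        // Input path and schema are assumptions for illustration.
        Dataset<Row> txns = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("/data/transactions.csv");

        // Typical Spark SQL work: filter, group, aggregate, then persist.
        txns.filter(col("amount").gt(0))
            .groupBy(col("account_id"))
            .agg(sum("amount").alias("total_amount"))
            .write()
            .mode("overwrite")
            .parquet("/data/out/account_totals");

        spark.stop();
    }
}
```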
Roles & Responsibilities:
* Monitor production Ab Initio ETL jobs and processes (graphs, plans, etc.) for failures, performance degradation, and data discrepancies.
* Diagnose and resolve complex production issues related to data quality, source system changes, infrastructure, Ab Initio components, and external dependencies.
* Perform root cause analysis (RCA) to identify underlying problems and prevent recurrence.
* Tune Ab Initio graphs and components to optimize execution time, resource utilization, and overall efficiency.
* Analyze and interpret logs, error messages, and system metrics to identify performance issues.
* Implement performance enhancements and best practices.
* Perform data reconciliation and validation checks to identify and correct data inconsistencies.
* Apply patches, upgrades, and configuration changes to Ab Initio environments.
* Develop and implement minor enhancements or bug fixes to existing Ab Initio code as required.
* Maintain and update technical documentation, runbooks, and support procedures.

Mandatory Skills:
* Ab Initio ETL

Desired Skills:
* Knowledge of Unix/SQL/Linux commands
* Shell scripting
We're looking for a skilled Salesforce CPQ Support Engineer with 2-3 years of experience to join our team!
Location: Noida / Gurgaon
Work Mode: Hybrid (2 days in office)
Experience: 2-3 years
Key Skills:
* Salesforce CPQ: bundle configuration, product/price rules, option constraints, discounts, quote management
* Basic knowledge of HTML, JavaScript, CSS
Interview Rounds: 2
Job Summary:
We are seeking an experienced and highly skilled BI Tools Administrator & Architect to manage the administration, architecture, and optimization of enterprise-grade BI platforms including IBM Cognos, Oracle Hyperion, and Tableau Server. The ideal candidate will have hands-on experience in installation, configuration, patching, and security integration, along with strong SQL skills and the ability to collaborate across teams to deliver scalable and efficient BI solutions.

Key Skills & Expertise Required:
* IBM Cognos Analytics - administration, upgrades, patching
* Oracle Hyperion Essbase / Planning - installation, configuration, compliance
* Tableau Server - HA architecture, LDAP, SSL, load balancing
* Patch management & technical upgrades (Cognos, Hyperion, Tableau)
* Security integration (LDAP, SSO, SSL) & performance tuning
* BI architecture & Framework Manager models
* Strong SQL and scripting skills (e.g., Shell, Python, PowerShell preferred)

Roles & Responsibilities:
* BI tools administration
* Architecture design
* Installation & configuration
* Security & compliance
* Patch & upgrade management
* Performance monitoring & optimization
* Collaboration & support
* Automation & monitoring
* Documentation & best practices
* Training & knowledge sharing
Responsibilities include:
* Develop, enhance, document, and maintain application features in .NET Core 5/6+, C#, REST API, T-SQL, and AngularJS/React JS
* Application support & API integrations with third-party solutions/services
* Understand technical project priorities, implementation dependencies, risks, and issues
* Participate and develop code as part of a unified development group, working across the whole technology stack
* Identify, prioritize, and execute tasks in the software development life cycle
* Work with the team to define, design, and deliver new features
* Broad and extensive knowledge of the software development life cycle (SDLC) and development models such as Agile/Scrum, with tools like Jira
* Effective communication skills, technical documentation, leadership, and ownership qualities

Primary Skills:
* Develop high-quality software design and architecture
* 6+ years of development experience in C#, .NET technologies, and SQL, with at least two years working with Azure Cloud Services
* Expertise in C#, .NET Core 5.0/6.0 or higher, Entity Framework, EF Core, microservices, Azure Cloud services, Azure DevOps, and SOA
* Ability to lead, inspire, and motivate teams through effective communication and established credibility
* Guide the team to write reusable, testable, performant, and efficient code
* Proficient in writing unit test cases using xUnit and MSTest
* Build standards-based frameworks and libraries to support a large-scale application
* Expertise in RDBMS including MS SQL Server, with thorough knowledge of writing SQL queries, stored procedures, views, functions, packages, cursors, tables, and objects
* Experience in large-scale software development
* Prior experience in application support & API integrations
* Knowledge of architectural styles and design patterns; experience in designing solutions
* Strong debugging and problem-solving skills

Azure Skills (MANDATORY):
* Azure messaging services - Service Bus or Event Grid, Event Hub
* Azure Storage Account - Blobs, Tables, Queues, etc.
* Azure Functions / Durable Functions
* Azure DevOps - CI/CD pipelines (classic / YAML)

Secondary Skills:
* Good knowledge of JavaScript, React JS, jQuery, Angular, and other front-end technologies
* API Management (APIM)
* Expertise in Microsoft Azure Cloud Services, Application Insights, Azure Monitor, Key Vault, and SQL Azure
* Hands-on experience building and deploying applications using Azure DevOps practices such as Continuous Integration (CI) and Continuous Deployment (CD) with Git, Docker, and Kubernetes, and managing Azure Cloud Services
* Azure Container Apps / Docker / Azure Container Registry
We are seeking a skilled Test Engineer to join our dynamic team for a MuleSoft implementation/integration project. As a Test Engineer, you will play a crucial role in ensuring the quality and reliability of our Salesforce Service Cloud, OMS, HRC AR Implementation, Esker, and SF Sales Assist integrations. You will collaborate closely with cross-functional teams to design, develop, and execute comprehensive test plans.

Responsibilities:
* Test Planning and Strategy: Collaborate with project stakeholders to understand integration requirements. Develop detailed test plans outlining test scenarios, test cases, and acceptance criteria.
* Test Execution: Execute functional, integration, and regression tests for MuleSoft integrations. Conduct thorough testing of Salesforce Service Cloud, OMS, HRC AR Implementation, Esker, and SF Sales Assist.
* Automation: Design and implement automated test scripts to increase efficiency and test coverage. Utilize testing frameworks and tools for test automation.
* Defect Management: Identify, document, and prioritize defects. Work closely with development teams to ensure timely resolution of identified issues.
* Collaboration: Collaborate with developers, business analysts, and other team members to understand business processes and integration points. Participate in regular project meetings and provide testing status updates.
* Reporting: Generate comprehensive test reports for management and project stakeholders. Provide input on the overall quality of the implemented solutions.
* Continuous Improvement: Stay updated on industry best practices and emerging trends in MuleSoft and Salesforce testing. Propose and implement process improvements to enhance the overall testing methodology.

Qualifications we seek in you!
Minimum Qualifications:
* Bachelor's degree in Engineering, Computer Science, Information Systems, or a related STEM field
* Proven experience as a Test Engineer in MuleSoft and Salesforce environments
* Strong understanding of Salesforce Service Cloud, OMS, HRC AR Implementation, Esker, and SF Sales Assist
* Experience in creating and executing test plans, test cases, and automated test scripts
* Proficiency in testing tools and frameworks (e.g., Selenium, JUnit, TestNG)
* Excellent communication and collaboration skills
* Ability to work in a fast-paced, dynamic environment

Preferred Qualifications/Skills:
* Prior experience working with Agile Scrum methodology
* Prior experience testing applications from legacy to the latest technologies: mainframe, Oracle Apps, web-based applications, portals, iOS and Android apps
* MuleSoft automation testing experience
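As a small example of the automation work described above, here is a minimal Selenium WebDriver smoke test in Java. The URL and element locator are assumptions; a real suite would use JUnit/TestNG assertions and a page-object structure rather than a `main` method, and needs a Chrome/ChromeDriver installation available on the machine.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class SmokeTest {
    public static void main(String[] args) {
        // URL and element locator are placeholders for illustration.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://app.example.com/login");
            String title = driver.getTitle();
            boolean loginVisible = driver.findElement(By.id("username")).isDisplayed();
            System.out.println("Title: " + title + ", login visible: " + loginVisible);
        } finally {
            driver.quit(); // always release the browser session
        }
    }
}
```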
Key Responsibilities:
* Handle onboarding of complex entities and remediation of customer profiles.
* Remove KYC restraints while maintaining accurate and updated KYC records.
* Support the KYC CSO to ensure compliance, accuracy, and efficiency.
* Manage multiple tasks across KYC verification, OCDD, data remediation, and follow-ups.
* Communicate effectively with customers and RMs through MS Teams, voice, and email/chat support (voice 80%, email/chat 20%).
* Continuously upskill on complex queues to meet evolving business needs.
* Consistently achieve efficiency and quality targets as per the performance framework.

Role Requirements:
* 3+ years of banking experience in KYC/AML/disputes/fraud/compliance/investigation.
* Strong verbal & written communication and interpersonal skills.
* Proficiency in MS Office Suite, SharePoint, and general computer applications.
* Ability to multitask and manage complex tasks efficiently.
* Flexibility to work early shifts and availability for Work from Office (WFO).
About the Role:
We are looking for a Conversion Lead with deep expertise in SQL performance tuning, ERP data migration, and database optimization. This role involves taking full ownership of data conversion processes in complex ERP environments, ensuring smooth, efficient, and accurate migration across systems.

Key Responsibilities:
* Analyze and optimize SQL queries for performance and scalability.
* Design and implement robust database solutions, including stored procedures, views, and functions.
* Lead end-to-end ERP data migration processes, including data extraction, transformation, loading (ETL), and reconciliation.
* Own real-time issue resolution related to data inconsistencies or migration failures.
* Define and maintain data mapping documents, configuration settings, and migration scripts.
* Set up and manage data migration environments to ensure security, efficiency, and reliability.

Nice to Have (Preferred Skills):
* Experience with Oracle Utilities and data conversion tools like FBDI (File-Based Data Import) & HDL (HCM Data Loader).
* Strong functional understanding of ERP systems, especially data flow between modules and third-party integrations.
* Exposure to BI tools such as OTBI and BI Publisher for reporting and validation.
* Past experience acting as a Subject Matter Expert (SME) in ERP data conversions.
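One common load-performance technique in data conversions is batching inserts into a staging table so the database sees one round trip per batch instead of one per row. The JDBC sketch below illustrates this; the JDBC URL, staging table, and sample rows are assumptions (and it assumes a PostgreSQL driver on the classpath), not the FBDI/HDL tooling mentioned above.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchLoader {
    public static void main(String[] args) throws Exception {
        // Placeholder URL, table, and rows for illustration.
        String url = "jdbc:postgresql://localhost:5432/erp_stage";
        List<String[]> rows = List.of(
                new String[]{"C001", "Acme Ltd"},
                new String[]{"C002", "Globex"});

        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            conn.setAutoCommit(false); // commit once per batch, not per row
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO customer_stage (code, name) VALUES (?, ?)")) {
                for (String[] row : rows) {
                    ps.setString(1, row[0]);
                    ps.setString(2, row[1]);
                    ps.addBatch();
                }
                ps.executeBatch(); // one round trip for the whole batch
            }
            conn.commit();
        }
    }
}
```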
About the Client:
Join one of the world's leading financial institutions, Commonwealth Bank of Australia (CommBank), focused on delivering secure, scalable, and intelligent payment systems with 99.999% uptime. Be part of a global team driving AI adoption, cloud transformation, and enterprise-grade automation within the Payments Enabling Technology & Environments team.

Key Responsibilities:
* Develop scalable, secure enterprise applications using Power Platform (Power Apps, Power Automate), .NET, SQL Server, and Pega.
* Design, build, and deploy cloud-native and hybrid solutions across Azure, AWS, and on-prem environments.
* Create and integrate APIs and automation workflows to optimize business processes.
* Troubleshoot complex system issues across Microsoft, Pega, and cloud stacks.
* Implement CI/CD pipelines using Azure DevOps and GitHub.
* Apply AI, analytics, and observability practices to enhance performance and reliability.
* Collaborate with architects, SMEs, and stakeholders to deliver mission-critical payment systems.

Required Skills & Experience:
* 5-10 years of hands-on development experience with: Power Platform (Power Apps, Power Automate); .NET, SQL Server; Pega (nice to have); Azure / AWS
* Strong understanding of CI/CD tools (Azure DevOps, GitHub)
* Experience with API integration, workflow automation, and cloud/on-prem hybrid architectures
* Familiarity with payments processing, data encryption, and compliance standards
* Exposure to monitoring, AI automation, and observability frameworks
* Strong problem-solving and collaboration skills

Qualifications:
* Bachelor's degree in Computer Science, IT, or a related field
* Microsoft and/or Pega certifications are a plus
Job Overview:
We are seeking an experienced QA Engineer with strong expertise in API automation and networking protocols to join our team. The ideal candidate will have a solid background in API testing, automation frameworks, and network protocols, along with hands-on experience in validating complex integrations and performance testing across NaaS components.

Mandatory Skills:
* Minimum 2.6 years in API automation with a strong understanding of networking protocols.
* Proven ability to test REST/SOAP APIs using tools like Postman, JMeter, and other standard tools.
* Familiarity with networking concepts (TCP/IP, VPN, IPsec, SD-WAN, HA failover).
* Experience developing and maintaining automated test suites using Selenium, Postman, and Jenkins.
* Expertise in TMF APIs (e.g., TMF 638 Service Inventory, TMF 641 Service Order).
* Experience in testing API integration between multiple IT systems.

Key Responsibilities:
* Design, develop, and maintain automated test scripts for API and GUI testing.
* Perform API performance testing and validate integrations between NaaS components (e.g., NaaP, SD-WAN controllers, SASE).
* Conduct testing in Linux and Windows environments.
* Analyse test results, identify defects, and drive resolution.
* Collaborate with cross-functional teams for seamless integration and delivery.
* Document test strategies, plans, and reports for stakeholder review.
* Perform design validation, performance testing, and end-to-end workflow validation.
* Assess API performance, identify bottlenecks, and recommend improvements.
* Lead and mentor QA teams, drive testing strategies, and manage test cycles.

Technical Proficiency:
* Automation tools: Selenium, UFT, Cypress
* API tools: Postman, JMeter
* CI/CD: Jenkins
* Scripting: Bash, PowerShell
* Networking protocols: TCP/IP, VPN, IPsec, SD-WAN, HA failover
* OS: Linux, Windows
* SD-WAN technologies: Meraki, Fortinet (provisioning and policy validation)
* TMF APIs and NaaS architecture understanding
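As a flavour of TMF-style API test automation, here is a minimal JUnit 5 test that calls a service-inventory endpoint with Java's built-in HTTP client and asserts on status and content type. The host and path loosely follow the TMF638 (Service Inventory) shape but are assumptions for illustration, not a real endpoint.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

class ServiceInventoryApiTest {

    // Endpoint shape loosely follows TMF638 (Service Inventory); the host,
    // path, and query parameter are placeholders for illustration.
    private static final String BASE = "https://api.example.com/serviceInventory/v4";

    @Test
    void listServicesReturnsOkAndJson() throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(BASE + "/service?limit=5"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        assertEquals(200, response.statusCode());
        assertTrue(response.headers().firstValue("Content-Type")
                .orElse("").contains("application/json"));
    }
}
```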