
561 SFTP Jobs - Page 13

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

Role: Digital - SAP Cloud Platform Integration (CPI)
Experience: 6-8 years
Location: Hyderabad, Chennai, Ahmedabad

Job description:
- Experience in implementations of SAP PI/PO and SAP Integration Suite.
- SuccessFactors integration and custom interface development in CPI/PO.
- Technical and functional expertise in integrating SAP S/4HANA, SRM and HCM with third-party applications using technologies such as ALE-IDoc, RFC, XI, HTTP, IDoc, JDBC, File/FCC, REST, SOAP, Mail, JMS, SFTP and SFSF adapters in SAP PO/CPI, with appropriate security mechanisms.
- Hands-on experience in Java mapping, UDFs, graphical mapping, XSLT and Groovy scripting.
- Good communication skills, a team-player attitude, and strong logical and analytical thinking.

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

About Markovate: At Markovate, we don't just follow trends, we drive them. We transform businesses through innovative AI and digital solutions that turn vision into reality. Our team harnesses breakthrough technologies to craft bespoke strategies that align seamlessly with our clients' ambitions. From AI consulting and Gen AI development to pioneering AI agents and agentic AI, we empower our partners to lead their industries with forward-thinking precision and unmatched …

Overview: We are seeking a highly experienced and innovative Senior Data Engineer with a strong background in hybrid cloud data integration, pipeline orchestration, and AI-driven data modelling. This role is responsible for designing, building, and optimizing robust, scalable, and production-ready data pipelines across both AWS and Azure platforms, supporting modern data architectures such as CEDM and Data Vault.

Requirements:
- 9+ years of experience in data engineering and data architecture.
- Excellent communication and interpersonal skills, with the ability to engage with teams.
- Strong problem-solving, decision-making, and conflict-resolution abilities.
- Proven ability to work independently and lead cross-functional teams.
- Ability to work in a fast-paced, dynamic environment and handle sensitive issues with discretion and professionalism.
- Ability to maintain confidentiality and handle sensitive information with attention to detail and discretion.
- Strong work ethic and trustworthiness.
- Must be highly collaborative and team-oriented, with a commitment to …

Responsibilities:
- Design and develop hybrid ETL/ELT pipelines using AWS Glue and Azure Data Factory (ADF).
- Process files from AWS S3 and Azure Data Lake Gen2, including schema validation and data profiling.
- Implement event-based orchestration using AWS Step Functions and Apache Airflow (Astronomer).
- Develop and maintain bronze, silver and gold data layers using DBT or Coalesce (an illustrative sketch follows this listing).
- Create scalable ingestion workflows using Airbyte, AWS Transfer Family, and Rivery.
- Integrate with metadata and lineage tools such as Unity Catalog and OpenMetadata.
- Build reusable components for schema enforcement, EDA, and alerting (e.g., MS Teams).
- Work closely with QA teams to integrate test automation and ensure data quality.
- Collaborate with cross-functional teams, including data scientists and business stakeholders, to align solutions with AI/ML use cases.
- Document architectures, pipelines, and workflows for internal stakeholders.

Skills and experience:
- Experience with cloud platforms: AWS (Glue, Step Functions, Lambda, S3, CloudWatch, SNS, Transfer Family) and Azure (ADF, ADLS Gen2, Azure Functions, Event Grid).
- Skilled in transformation and ELT tools: Databricks (PySpark), DBT, Coalesce, and Python.
- Proficient in data ingestion using Airbyte, Rivery, SFTP/Excel files, and SQL Server extracts.
- Strong understanding of data modeling techniques including CEDM, Data Vault 2.0, and dimensional modelling.
- Hands-on experience with orchestration tools such as AWS Step Functions, Airflow (Astronomer), and ADF triggers.
- Expertise in monitoring and logging with CloudWatch, AWS Glue metrics, MS Teams alerts, and Azure Data Explorer (ADX).
- Familiar with data governance and lineage tools: Unity Catalog, OpenMetadata, and schema drift detection.
- Proficient in version control and CI/CD using GitHub, Azure DevOps, CloudFormation, Terraform, and ARM templates.
- Experienced in data validation and exploratory data analysis with pandas profiling, AWS Glue Data Quality, and …

Great to have:
- Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data and AI services.
- Knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica).
- Deep understanding of AI/Generative AI concepts and frameworks (e.g., TensorFlow, PyTorch, Hugging Face, OpenAI APIs).
- Experience with data modeling, data structures, and database design.
- Proficiency with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake).
- Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka).
- Proficiency in SQL and at least one programming language (e.g., Python, …).

What it's like to be at Markovate: At Markovate, we thrive on collaboration and embrace every innovative idea. We invest in continuous learning to keep our team ahead in the AI/ML landscape. Transparent communication is key: every voice at Markovate is valued. Our agile, data-driven approach transforms challenges into opportunities. We offer flexible work arrangements that empower creativity and balance. Recognition is part of our DNA: your achievements drive our success. Markovate is committed to sustainable practices and positive community impact. Our people-first culture means your growth and well-being are central to our mission.

Location: hybrid model, 2 days onsite. (ref:hirist.tech)
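The bronze-to-silver promotion and schema-validation work described in this listing is commonly implemented in PySpark. The following is a minimal, illustrative sketch only; the bucket paths, column names and expected schema are assumptions, not part of the original posting.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Expected schema for the silver layer; column names are illustrative.
expected = StructType([
    StructField("order_id", StringType(), False),
    StructField("amount", DoubleType(), True),
    StructField("event_ts", TimestampType(), True),
])

# Read raw files landed in the bronze zone (path is an assumption).
bronze = spark.read.json("s3://example-lake/bronze/orders/")

# Schema validation: fail fast if mandatory columns are missing.
missing = [f.name for f in expected.fields if f.name not in bronze.columns]
if missing:
    raise ValueError(f"Bronze feed is missing expected columns: {missing}")

# Basic cleansing and standardisation for the silver layer.
silver = (
    bronze
    .select(
        F.col("order_id").cast("string"),
        F.col("amount").cast("double"),
        F.to_timestamp("event_ts").alias("event_ts"),
    )
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)

silver.write.mode("overwrite").format("parquet").save("s3://example-lake/silver/orders/")
```

In practice the same validation step would usually also emit profiling metrics and push an alert (for example to an MS Teams webhook) when the check fails, as the listing describes.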

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities: As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolve them within the defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and apply them in feature development.

Preferred Education: Master's degree.

Required Technical and Professional Expertise:
- 6+ relevant years of experience.
- Mandatory skills: Finacle Environment Management Support (EMT) specialist; installation and configuration of Finacle 10x and 11x.
- Good to have (not mandatory):
- We are seeking a skilled Finacle Environment Management Support (EMT) specialist with a comprehensive understanding of Finacle architecture for both 10x and 11x environments. The candidate will have hands-on experience with Finacle FDM installation and possess a strong background in Finacle environments through various tasks, including setting up new services, debugging, and analyzing logs.
- Finacle 10x and 11x installation, from VM procurement and configuration through Finacle installation via FDM.
- Finacle 10x and 11x environment maintenance and support.

Preferred Technical and Professional Experience:
- Connectivity establishment between systems via CD (Connect:Direct), SSH, SFTP.
- Support and maintain the Finacle environment by setting up new services, performing root cause analysis (RCA), and providing necessary fixes.
- Utilize Unix/Linux, WebSphere, WebLogic, JBoss, shell scripting, Finacle scripting, Oracle DB, and basic Finacle application knowledge.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

We are seeking an experienced GoAnywhere Administrator with over 5 years of expertise in managing and optimizing GoAnywhere Managed File Transfer (MFT) solutions. The ideal candidate will be responsible for ensuring the secure, reliable, and efficient transfer of data across our systems and with our partners. This role requires a deep understanding of MFT processes, security protocols, and compliance requirements.

Key Responsibilities:
- System Administration: Oversee the installation, configuration, and maintenance of GoAnywhere MFT systems to ensure operational efficiency and security.
- Workflow Design and Automation: Develop and manage automated workflows for file transfers, leveraging GoAnywhere's features to streamline operations (an illustrative sketch of such a workflow follows this listing).
- Security Management: Implement and maintain security policies, including encryption, access controls, and secure transfer protocols (SFTP, FTPS, HTTPS).
- User and Role Management: Create and manage user accounts and roles, ensuring appropriate access levels and permissions.
- Monitoring and Troubleshooting: Monitor file transfer activities, resolve issues, and optimize performance. Provide support for incident and problem management.
- Compliance and Reporting: Ensure that all file transfer activities comply with relevant regulations and standards. Generate detailed reports for audits and compliance checks.
- Collaboration: Work closely with IT teams, business units, and external partners to understand requirements and deliver effective file transfer solutions.
- Documentation: Maintain comprehensive documentation of systems, processes, and procedures for internal reference and training purposes.

Mandatory Requirements:
- GoAnywhere Administrator with over 5 years of expertise in managing and optimizing GoAnywhere Managed File Transfer (MFT) solutions.
- Notice period: immediate to 30 days.
- Relevant experience required: 5-10 years.
- Location: Bangalore, Hyderabad, Mumbai, Kolkata, Gurgaon, Chennai, Noida.
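For illustration, the kind of automated transfer-and-archive workflow that an MFT tool such as GoAnywhere manages can be sketched in plain Python with paramiko. This is a conceptual sketch only, not GoAnywhere's own project definition; the host name, key path and folder locations are placeholders.

```python
import os
import shutil
import paramiko

OUTBOUND = "/data/outbound"   # local folder polled for new files (assumption)
ARCHIVE = "/data/archive"     # files are moved here after a successful upload
REMOTE_DIR = "/inbound"       # partner's drop folder (assumption)

def transfer_pending_files() -> None:
    """Upload every file waiting in OUTBOUND over SFTP, then archive it locally."""
    key = paramiko.RSAKey.from_private_key_file("/etc/keys/partner_rsa")
    transport = paramiko.Transport(("sftp.partner.example.com", 22))
    transport.connect(username="mft_user", pkey=key)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        for name in sorted(os.listdir(OUTBOUND)):
            local_path = os.path.join(OUTBOUND, name)
            if not os.path.isfile(local_path):
                continue
            # Upload to a temporary name, then rename, so the partner never
            # picks up a partially written file.
            sftp.put(local_path, f"{REMOTE_DIR}/{name}.tmp")
            sftp.rename(f"{REMOTE_DIR}/{name}.tmp", f"{REMOTE_DIR}/{name}")
            shutil.move(local_path, os.path.join(ARCHIVE, name))
    finally:
        sftp.close()
        transport.close()

if __name__ == "__main__":
    transfer_pending_files()
```

An MFT platform adds the scheduling, retry, audit logging and compliance reporting around this core transfer step, which is what the role above administers.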

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

India

On-site

Source: LinkedIn

Welcome to Veradigm! Our mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information.

Job Description: Expert Software Engineer (Boomi Developer), 8+ years of experience

We are looking for a highly skilled and experienced Dell Boomi Developer with a strong background in integration design, development, deployment, and maintenance using the Dell Boomi platform. The ideal candidate should have over 8 years of hands-on experience in developing integration solutions, working with various Boomi connectors, and effectively handling complex integrations as an individual contributor.

Technical skill sets:
- Design, develop, deploy and support complex integration solutions using Boomi AtomSphere for cloud, on-premises, and hybrid environments.
- Develop and manage APIs (REST and SOAP) for integration solutions, ensuring adherence to best practices in API design and implementation.
- Utilize the Boomi pub/sub model, event-based integrations, Document Flow, Document Cache, and Boomi's monitoring framework to optimize integration performance.
- Proficiency in web languages such as XML, JSON, WSDL, XSD, EDI, HTML, etc., and extensive experience in Boomi API Management.
- Experience with Dell Boomi connectors such as SFDC, SuccessFactors, Mail, Web Service, SFTP, Database, HTTP, Atom Queue, JMS, SharePoint, and more.
- Ability to implement robust error handling and logging mechanisms to ensure high reliability of integration solutions.
- Apply custom scripting in JavaScript and Groovy as required for specialized integration and transformation needs.
- Dell Boomi Integration Developer certification is preferred.
- Exposure to relational databases (MS SQL Server, Oracle, etc.) is an added advantage.
- Ability to produce technical documentation, test scripts, etc. for the integrations developed and maintained.

Preferred exposure:
- Interface design and development for Salesforce/Zuora/Kantata integration.
- Basic knowledge of Salesforce/Zuora applications to aid in building integrations.
- Expertise in requirement analysis, integration design and documentation.
- Ability to grasp new technologies quickly and willingly; adaptability to new challenges.
- Exposure to incident management tools such as CA Service Desk, ServiceNow, JIRA, Freshworks, etc. is an added advantage.

Mandatory:
- Willing to work a 2:30 to 11:30 shift or a late-evening US shift as and when needed.
- Excellent oral and written English communication skills.
- Ability to manage, track and progress the tasks assigned.

Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce. Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Req Number: 94908. Time Type: Full Time.

Are you ready to develop your career in a rapidly growing and successful global company, in a fast-paced role? If you are looking for a position that challenges and inspires you in a successful international company, DSV is your place. The DSV group, headquartered in Denmark, is one of the biggest transport and logistics companies in the world, with more than 75,000 dedicated employees operating in more than 90 countries. You will join a global and determined group driven forward by the desire to grow, and will be part of a dynamic organisation characterized by a high level of professionalism and constant improvement.

Who do we look for? We are seeking a highly talented and support-oriented IT Specialist/Senior IT Specialist for our EDI team to help provide excellent support for our internal and external customers to the highest standards and quality. If you are excited about IT integration, ready to acquire new technical skills, and open to new challenges, then DSV CA&I is the place for you. Our focus is supporting the EDI processes across the organization.

Roles & Responsibilities:
- Perform 2nd-level/3rd-level support on applications within the EDI area.
- Be responsible for support performance and follow-up in line with the SLA.
- Conduct or attend internal trainings.
- Plan and control your own tasks assigned in the ITSM tool.
- Propose solutions to optimize the efficiency of processes and teamwork.
- Ensure support availability during agreed support hours.
- Ensure knowledge sharing and documentation within the team.
- Ensure proactive communication to the line of business on incidents, requests and changes.
- Support other team members whenever needed.
- Participate in ad-hoc integration projects allocated by management and complete them within agreed timeframes.
- Understand the IT organization and the EDI Support team's role in the global IT department.
- Work in a multicultural, worldwide environment in a follow-the-sun schema.

Requirements:
- Good practical knowledge of Sterling Integrator.
- Basic EDI mapping experience with Positional, IDocs, XML, X12 and EDIFACT on the IBM B2Bi Sterling Integrator mapper (a small parsing sketch follows this listing).
- Good understanding of APIs.
- General and good understanding of the EDI system's role in an organization.
- Strong analytical and problem-solving skills.
- Good communication skills in English, both spoken and written.
- Ability to observe and provide constructive feedback.
- Awareness of the IT role and of technical documentation needs in a global organization.
- Ability to work under pressure and multitask.
- Good understanding of Unix shell commands.
- A very good basic understanding of network communication protocols (FTP, SFTP, FTPS, HTTP, HTTPS, others).
- Practical knowledge of using SQL for data analysis.
- Understanding of ITIL principles in an organization; Foundation certification would be a benefit.
- Good interpersonal and communication skills.
- Able to work across teams and departments.
- Proactive, service-oriented and helpful.
- Teamwork in an international environment.
- Service-minded; likes to help others.
- Minimum 2 years of experience in a similar EDI support environment.

Must have:
- Service-minded and customer-driven.
- Effective communication together with active listening.
- Strong analytical thinking skills.
- Good knowledge of logistics business processes, preferable but not obligatory.
- Excellent organizational, interpersonal, time-management and communication skills.
- Must be able to work in a team environment to meet strict deadlines and compliance criteria defined by customers.
- Exercise good judgment to resolve non-standard issues independently.
- Ability to work under pressure and multitask.

Location: Hyderabad (India)

Want to know more and apply? We will be happy to answer any questions you may have regarding the position and about your options in DSV. You are welcome to send an email to rajeswari.byra@dsv.com

DSV – Global Transport and Logistics. DSV is a dynamic workplace that fosters inclusivity and diversity. We conduct our business with integrity, respecting different cultures and the dignity and rights of individuals. When you join DSV, you are working for one of the very best performing companies in the transport and logistics industry. You'll join a talented team of approximately 75,000 employees in over 80 countries, working passionately to deliver great customer experiences and high-quality services. DSV aspires to lead the way towards a more sustainable future for our industry and is committed to trading on nature's terms. We promote collaboration and transparency and strive to attract, motivate and retain talented people in a culture of respect. If you are driven, talented and wish to be part of a progressive and versatile organisation, we'll support you and your need to achieve your potential and forward your career. Visit dsv.com and follow us on LinkedIn, Facebook and Twitter.
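Since the role centres on X12/EDIFACT mapping in IBM Sterling B2B Integrator, a tiny illustration of the X12 envelope structure the mapper works with may help. This is a hedged, standalone sketch: the sample interchange is invented, and the parsing shown is conceptual, not Sterling's mapping engine.

```python
# Sample X12 interchange; contents are invented for illustration only.
RAW = (
    "ISA*00*          *00*          *ZZ*SENDERID       *ZZ*RECEIVERID     "
    "*250101*1200*U*00401*000000001*0*P*>~"
    "GS*IN*SENDERID*RECEIVERID*20250101*1200*1*X*004010~"
    "ST*810*0001~"
    "SE*2*0001~"
    "GE*1*1~"
    "IEA*1*000000001~"
)

def parse_x12(raw: str):
    """Split an X12 interchange into segments using the separators declared in ISA."""
    element_sep = raw[3]                       # the character right after 'ISA'
    # ISA16 is a single character; the character following it terminates every segment.
    segment_term = raw.split(element_sep)[16][1]
    segments = [s for s in raw.split(segment_term) if s.strip()]
    return [seg.split(element_sep) for seg in segments]

segments = parse_x12(RAW)
isa = segments[0]
print("Sender:", isa[6].strip(), "| Receiver:", isa[8].strip())
print("Transaction sets:", [seg[1] for seg in segments if seg[0] == "ST"])
```

A mapper such as the one in Sterling Integrator works on exactly these segment/element positions when translating between X12 or EDIFACT and IDoc, XML or positional formats.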

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Source: LinkedIn

Description:
- Participation in user interviews to understand technical and customer needs.
- Developing front-end website architecture based on Palantir Foundry.
- Designing user interactions on web pages within Palantir Foundry Workshop.
- Developing back-end code logic that leverages semantic object linking (ontologies) within Palantir Foundry Pipeline Builder, Code Workbook, and Ontology Manager.
- Creating servers, databases, and datasets for functionality as needed.
- Ensuring the health of data connections and pipelines (utilizing filesystem, JDBC, SFTP, and webhook connections); an illustrative monitoring sketch follows this listing.
- Ensuring conformance with security protocols and markings on sensitive data sets.
- Ensuring responsiveness of web applications developed on low-code/no-code solutions.
- Ensuring cross-platform optimization for mobile phones.
- Seeing projects through from conception to finished product.
- Meeting both technical and customer needs.
- Staying abreast of developments in web applications and programming languages.
- Leading other engineers to develop features on your projects.
- Developing and training machine learning models.
- Analyzing and preprocessing large datasets.
- Optimizing algorithms for performance and scalability.
- Deploying AI models into production.
- Collaborating on AI-powered features for applications.

Qualifications:
- Bachelor's degree in Computer Science, Management Information Systems, Engineering or a related field and 4 years of experience required.
- Strong knowledge of programming languages and coding principles and procedures.
- Strong knowledge of web development frameworks; Palantir Foundry experience preferred.
- Proficiency with fundamental front-end languages such as HTML, CSS, and JavaScript.
- Familiarity with JavaScript libraries such as Lodash, Math.js, Moment, Numeral, and es6-shim.
- Proficiency with server-side languages for structured data processing: Python, PySpark, Java, Apache Spark, and SparkSQL.
- Proficiency with AI/ML frameworks, data analysis, and deep learning.
- Familiarity with database technology such as MySQL, Oracle, MongoDB, and others preferred.
- Familiarity with analytical tools for business intelligence and data science such as Power BI, Jupyter, and RStudio preferred.
- Strong organizational and project management skills preferred.
- Team leadership experience preferred.
- Strong attention to detail, facilitation, team building, collaboration, organization, and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to work methodically and analytically in a quantitative problem-solving environment.
- Effective written and oral communication skills.
- Demonstrated critical thinking skills.
- Strong knowledge of the Microsoft Office Suite (Word, Excel, and PowerPoint).
- Ability to obtain applicable certifications.

Job: Information Technology. Primary Location: India-Maharashtra-Mumbai. Schedule: Full-time. Travel: No. Req ID: 250771. Job Hire Type: Experienced. #BMI N/A
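One of the duties above is keeping data connections and pipelines healthy and alerting when they are not. A minimal, assumption-laden sketch of that idea follows: it checks how recently files landed in a directory and posts to a webhook if the feed looks stale. The webhook URL, landing path, freshness threshold and payload shape are all placeholders, not Foundry APIs.

```python
from datetime import datetime, timedelta, timezone
from pathlib import Path

import requests

WEBHOOK_URL = "https://example.webhook.office.com/workflows/alert"  # placeholder
LANDING_DIR = Path("/data/landing/orders")                          # placeholder
MAX_AGE = timedelta(hours=6)   # pipeline is "unhealthy" if no file arrived in 6 hours

def newest_file_age(directory: Path) -> timedelta:
    """Return the age of the most recently modified file in the landing directory."""
    newest = max(p.stat().st_mtime for p in directory.iterdir() if p.is_file())
    return datetime.now(timezone.utc) - datetime.fromtimestamp(newest, tz=timezone.utc)

def main() -> None:
    age = newest_file_age(LANDING_DIR)
    if age > MAX_AGE:
        # Post a simple JSON message to the alerting webhook (payload shape is illustrative).
        requests.post(
            WEBHOOK_URL,
            json={"text": f"No new files in {LANDING_DIR} for {age}. Check the SFTP/JDBC feed."},
            timeout=10,
        )

if __name__ == "__main__":
    main()
```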

Posted 2 weeks ago

Apply

0 years

4 - 7 Lacs

Hyderābād

On-site

Source: Glassdoor

Date: May 30, 2025. Job Requisition Id: 61489. Location: Hyderabad, IN. IBG

SAP PI interview questions:

1. Differentiate between PI and CPI.
SAP Process Integration (PI) / Process Orchestration (PO): Deployment: on-premise. Focus: primarily designed for integrating on-premise applications and systems. Scalability: can be highly scalable, particularly for large SAP landscapes. Complexity: can be more complex to manage, especially for large or complex projects. Integration: enables integration between different SAP systems and also with non-SAP systems. Example: used for integrating different SAP modules like SAP CRM, SAP ECC, and SAP SCM.
SAP Cloud Platform Integration (CPI): Deployment: cloud-based. Focus: designed for integrating cloud applications, on-premise applications, and third-party systems. Scalability: offers good scalability and flexibility for integrating cloud-based applications. Ease of use: CPI is generally considered easier to manage and use, with a more intuitive user interface. Integration: facilitates integration between cloud applications, on-premise applications, and third-party systems.

2. How does SAP PI handle complex scenarios involving multiple systems, asynchronous communication, and various integration patterns (e.g., point-to-point, hub-and-spoke)?
Answer: SAP PI utilizes a robust architecture with components like the Integration Engine, Adapter Engine, and Integration Directory to manage complex integrations. The Integration Engine routes messages, the Adapter Engine handles communication with different systems, and the Integration Directory provides a central repository for integration artifacts. Asynchronous communication is supported through message queues, and various integration patterns can be implemented by configuring the Integration Engine and Adapter Engines appropriately.

3. What are the key considerations when designing and implementing a complex SAP PI solution, especially concerning scalability, performance, and security?
Answer: Scalability: ensure the system can handle the expected message volume and system load; this may involve optimizing the Integration Engine and Adapter Engine configurations, as well as using appropriate message queues and infrastructure. Performance: monitor message processing times and identify bottlenecks; optimize mappings, message formats, and adapter configurations to improve performance. Security: implement robust security measures, including user authentication, authorization, and data encryption, to protect sensitive data during transmission and storage; use secure communication protocols and ensure that the PI system is hardened against security threats.

4. How does SAP PI handle data transformations and mappings in complex scenarios involving different data structures and formats?
Answer: SAP PI provides powerful mapping capabilities through the Integration Engine, allowing for complex data transformations and mappings. You can use the mapping tools to transform data between different formats, perform calculations, and enrich data based on business requirements. You can also use external mapping tools or custom mappings to handle more complex scenarios.

5. How do you troubleshoot and monitor a complex SAP PI landscape, including identifying and resolving issues related to message failures, performance bottlenecks, and security breaches?
Answer: SAP PI provides monitoring tools and business logs to track message flows, identify failures, and monitor performance. You can use these tools to track message status, analyze performance metrics, and identify potential issues. For security breaches, you can use security logs and monitoring tools to detect and respond to security events.

6. How does SAP PI integrate with other SAP technologies, such as SAP Business Process Management (BPM) and SAP Cloud Platform Integration (CPI)?
Answer: SAP PI can integrate with SAP BPM for process orchestration, allowing you to model and execute complex business processes that span multiple systems. SAP CPI is the next-generation integration platform, and it can be used to build cloud-based integrations and extend the capabilities of SAP PI.

7. How do you configure an IDoc collection scenario?
For outbound IDoc collection, you need to provide the IDoc package size in the partner profile (WE20) and select the option 'Collect IDocs'. On the PI side, the sender communication channel should be configured to handle 'Multiple IDocs in same XI message'.

8. What are the receiver routing techniques available in SAP PI/PO?
Standard Receiver Determination and Extended (dynamic) Receiver Determination are the main methods to define routing in SAP PI interfaces (a language-agnostic routing sketch follows this listing).

9. What are the different types of user-defined functions?
Single Value, All Values in Context, All Values in Queue.

10. What is the purpose of the EDI Separator?
The EDI Separator is an adapter provided with the B2B Toolkit. It splits bulk (batch) EDI messages into individual EDI messages for processing, and supports the EDI message formats ANSI ASC X12, EDIFACT, Odette, and VDA.

11. What are the standard functions (objects) used in Extended Receiver configuration?
Extended Receiver Determination allows us to dynamically derive the message receivers from a message mapping program (this is different from routing messages using XPath rules). We need to use several standard objects delivered by SAP under the SWCV SAP BASIS and the namespace 'http://sap.com/xi/XI/System': the standard service interface 'ReceiverDetermination' and the message type 'Receiver'.

12. What are data type enhancements and how do you configure them?
Read my article on Data Type Enhancements.

13. How do you search for the PI message of an inbound IDoc if you know the IDoc number?
The PI message ID can be found in the IDoc control record: look it up in the IDoc archive key of the control record, then search for the PI message in the message monitor using that message ID. If you have configured the IDoc Monitor in SAP PI, you can search for the PI message directly using the IDoc number.

14. How do you configure AS2 adapter certificates?
AS2 certificates are installed in the Key Storage of NetWeaver Administrator (NWA). Public certificates of the PI host and third-party systems are exchanged and installed. In PI, keys are installed as a combination of Key Store View and Key Store Entry.

15. How do you set up a SAP ABAP system in the System Landscape Directory?
Usually, SAP technical systems are installed in the SLD by the BASIS team. You need to create the Product, Software Component Version, and Business System of the SAP system.

16. What is the functionality of the Service Registry?
The Service Registry is the central location for web services. It allows us to expose web services of the PI host in accordance with Service-Oriented Architecture (SOA). Read how to configure service endpoints in SR.

17. What is the purpose of local Software Component Versions (SWCVs) and what are their limitations?
Local SWCVs are used to test message mapping programs. Objects in local SWCVs cannot be used in end-to-end integration scenarios or viewed in the Integration Directory (ID).

18. How many SWCVs are required to build an interface with one sender and one receiver?
I prefer to use a three-tier architecture to represent an integration: one SWCV for the sender, one for the receiver, and another for cross-system objects such as message mappings and operation mappings.

19. What is the difference between a Business System and a Business Component?
Previously known as a Business Service, a Business Component is an abstract representation of a system whose attributes are unknown or only partially known. A Business System, on the other hand, represents a known system in the SLD, for example internal systems in the organization landscape. Business Systems require underlying Technical Systems. All SAP systems should be represented as Business Systems.

20. Highlight a few activities in SAP post-go-live knowledge management.
Activities include monitoring system performance, managing message queues, and ensuring data consistency.

21. What is an Adapter Engine? Mention the use of the Advanced Adapter Engine (AAE) in the SAP PI system.
The Adapter Engine handles communication between SAP PI and external systems. The Advanced Adapter Engine (AAE) provides enhanced performance and supports additional adapters. In a sentence: SAP PI/PO is a middleware technology that enables seamless integration and process automation within an organization.

22. Explain synchronous communication in SAP PI. Highlight a few advantages and disadvantages.
Synchronous communication involves real-time data exchange where the sender waits for a response. Advantages include immediate feedback and data consistency. Disadvantages include potential delays and system dependency.

23. Explain asynchronous communication in SAP PI. Highlight a few advantages and disadvantages.
Asynchronous communication involves data exchange without waiting for an immediate response. Advantages include reduced system dependency and improved performance. Disadvantages include potential data inconsistency and delayed feedback.

24. What is the Global Container and what are its uses in SAP XI?
The Global Container is a storage area used to store data that can be accessed across different message mappings and transformations.

25. List the various adapters in the Advanced Adapter Engine and Integration Engine in the PI system.
Adapters include HTTP, SOAP, JDBC, File, and IDoc adapters. They are used for communication between SAP PI and external systems.

26. List the components you can monitor under the Configuration and Monitoring options.
Components include message flows, communication channels, and system performance metrics.

27. How many SAP sessions can you work with for a particular client at a particular time?
You can work on up to six SAP sessions for a particular client at a particular time.

Key Responsibilities:
- Design, develop, and maintain SAP PI/PO integration solutions to support business processes.
- Configure and manage adapters (IDoc, SOAP, REST, JDBC, File, SFTP, etc.) and work with various protocols (XML, JSON, HTTP, FTP, AS2).
- Develop message mappings using graphical mapping, XSLT, and Java-based mappings.
- Implement BPM (Business Process Management) and BRM (Business Rules Management) solutions for workflow automation.
- Troubleshoot and resolve complex integration issues and optimize performance.
- Ensure compliance with security standards, including encryption techniques and authentication mechanisms.
- Collaborate with functional and technical teams to understand integration requirements.
- Work in Agile and DevOps environments, leveraging tools like Jenkins, Git, and CI/CD pipelines for automation.
- Provide application support and maintenance for existing integrations.

Skills Required:
- Strong experience in SAP PI/PO development and support.
- Proficiency in Java scripting for developing user-defined functions (UDFs) and adapter modules.
- Solid understanding of SAP CRM integration and related business processes.
- Hands-on experience with BPM and BRM for workflow automation.
- Knowledge of cloud platforms (AWS, Azure, Google Cloud) and SAP cloud integrations.
- Strong problem-solving skills and the ability to troubleshoot integration issues.
- Excellent communication and teamwork skills.

Preferred Skills:
- Proficiency in Java, JavaScript, and XML transformations.
- Experience working with security standards, encryption techniques, and compliance regulations.
- Familiarity with DevOps tools (Jenkins, Git) and CI/CD practices.
- SAP certification in SAP PO or a related area is a plus.

IBG
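Questions 8 and 11 above concern deriving receivers from message content. Purely as a conceptual illustration of content-based routing (not SAP PI/PO code and not the Extended Receiver Determination API), here is a short Python sketch; the systems, fields and thresholds are invented.

```python
import xml.etree.ElementTree as ET

# Routing rules keyed on payload fields; receiver systems and thresholds are invented examples.
ROUTING_RULES = [
    (lambda order: order.get("country") == "IN", "SAP_S4_INDIA"),
    (lambda order: float(order.get("amount", 0)) > 100000, "APPROVAL_SYSTEM"),
]

def determine_receivers(payload_xml: str) -> list[str]:
    """Derive the list of receiver systems from the message content."""
    root = ET.fromstring(payload_xml)
    order = {child.tag: child.text for child in root}
    receivers = [system for rule, system in ROUTING_RULES if rule(order)]
    # Deduplicate while preserving rule order; fall back to a default receiver.
    return list(dict.fromkeys(receivers)) or ["DEFAULT_ERP"]

sample = "<Order><country>IN</country><amount>250000</amount></Order>"
print(determine_receivers(sample))   # ['SAP_S4_INDIA', 'APPROVAL_SYSTEM']
```

In SAP PI/PO the equivalent logic lives either in XPath conditions on the receiver determination (standard) or in a message mapping that fills the standard 'Receiver' message type (extended/dynamic).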

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Gurgaon

On-site

Source: Glassdoor

Airbnb was born in 2007 when two hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe. Every day, hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.

Systems Architect_BizTech Integrations

The Community You Will Join: Airbnb is a company with a mission to create a world where anyone can belong anywhere, achieved through a unified team adhering to core values. The BizTech department plays a crucial role in this mission by providing reliable internal systems, enterprise technologies, innovative products, and technical support, fostering empowered and inclusive progress. They also create technical breakthroughs and strategies that redefine the concept of belonging anywhere, delivering value to both the business and its people. The Integrations team of BizTech manages integrations and APIs across various SaaS products and internal systems. A Systems Architect within this team focuses on owning the end-to-end architecture of finance integrations and leading delivery of integration projects. This role involves improving processes, enhancing security, and enforcing best practices, while collaborating with functional and partner teams.

The Difference You Will Make: As a Systems Architect at Airbnb, your role is vital for advancing our integration platforms and SaaS integrations across the enterprise. You will work closely with the Finance and Procurement departments to ensure secure and efficient data exchanges, leading over 200 finance integrations within the Oracle Integration framework. Your responsibilities include guiding engineers in creating robust integrations, driving the evolution of Airbnb's integration architecture for future scalability, and managing key integration projects. You must be adept at navigating ambiguity, independently solving problems, and effectively discussing solution strategies. Your proactive problem-solving skills will greatly impact and shape our integration approach, promoting excellence and innovation.

A Typical Day:
- Lead the Oracle Integration practice; drive multiple projects simultaneously, taking accountability and ownership of systems, integrations, and processes.
- Drive cross-functional initiatives; collaborate and drive alignment with key stakeholders.
- Architect and build complex integrations, backend services, and APIs to connect various systems and automate processes.
- Lead design discussions within the team and with partner teams.
- Ensure all deliverables are of top quality and meet functional and non-functional ("ilities") requirements.
- Communicate proactively with partner teams, stakeholders, and within the team; identify and solve for communication gaps on the team, ensuring team members have clear expectations and priorities.
- Contribute to organizational priorities and to defining the team roadmap and strategy.
- Collaborate closely with functional leads, other technical leads, and the team.
- Participate in the full SDLC, including design, implementation, reviews, QA, release, and post-production support.
- Take responsibility for the smooth operation of the month-end financial integrations and reporting, among other aspects of the finance integrations (a reconciliation-check sketch follows this listing).
- Proactively identify gaps in the system and operational processes, along with solutions to address them.
- Tackle ambiguous problems, discuss problem-solving approaches, help unblock teams, and clearly articulate solution trade-offs.
- Help improve practices that impact engineering efficiency, quality, and ways of working.
- Ensure industry best practices are being followed and security is top-notch.
- Educate the team about upcoming features in Oracle launches and prepare for any impacts.

Your Expertise:
- 10+ years of hands-on experience with Oracle technologies, with at least 5+ years dedicated to Oracle Integration technologies, including Oracle Integration Cloud (OIC).
- Strong understanding and working experience in Oracle modules such as FAH, Record to Report (R2R), Procure to Pay (P2P), Planning, AP, AR, ARCS, FCCS.
- Expertise in the architecture, design, and implementation of integrations and APIs to integrate a variety of SaaS apps; therefore, familiar with PGP, SSH, OAuth, HTTPS, SFTP, REST, SOAP.
- Strong experience in relational databases alongside strong SQL proficiency.
- Strong experience in building BICC extracts, BI Publisher, BI reports, reconciliation reports, drill-down reports, OTBI reports, and dashboards.
- Good understanding of Oracle Cloud Infrastructure (OCI) and PaaS architecture, with working experience on Object Storage, Compute instances, ATP, ODI, IAM, and IDCS.
- Experience in building web pages using APEX/VBCS for business data management.
- Experience in programming languages such as Java, with the ability to build from the ground up.
- Experience with data streaming and with messaging infrastructure and storage.
- Experience in CI/CD using tools such as Jenkins and source control such as GitHub.
- A strong desire to mentor the careers and development of other engineers.
- Strong verbal and written communication skills.
- B.Tech in Computer Science or equivalent work-related experience.

Our Commitment to Inclusion & Belonging: Airbnb is committed to working with the broadest talent pool possible. We believe diverse ideas foster innovation and engagement, allow us to attract creatively led people, and help us develop the best products, services and solutions. All qualified individuals are encouraged to apply.
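The month-end responsibilities above typically lean on reconciliation checks between a source ledger extract and what landed in the target system. A minimal sketch of that idea is shown below; the file names and the single 'amount' column are assumptions, not Airbnb's or Oracle's actual data model.

```python
import csv
from decimal import Decimal

def summarize(path: str) -> tuple[int, Decimal]:
    """Return (row count, total amount) for a CSV extract with an 'amount' column."""
    count, total = 0, Decimal("0")
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            count += 1
            total += Decimal(row["amount"])
    return count, total

# Placeholder file names: one extract from the source ledger, one from the target system.
src_count, src_total = summarize("source_ledger_extract.csv")
tgt_count, tgt_total = summarize("target_system_extract.csv")

if (src_count, src_total) == (tgt_count, tgt_total):
    print(f"Reconciled: {src_count} rows, total {src_total}")
else:
    print(
        f"MISMATCH: source {src_count} rows / {src_total} "
        f"vs target {tgt_count} rows / {tgt_total}"
    )
```

Production reconciliation reports add per-ledger and per-period breakdowns, but the core comparison of counts and control totals is the same.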

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurgaon

On-site

Source: Glassdoor

BPO Team Lead

The position requires a positive and customer-friendly attitude. The successful candidate will demonstrate enthusiasm and a keen interest in learning and keeping up to date with relevant new product releases and developments. You will have the ability to use acquired troubleshooting and technical skills to identify the root cause and resolution of issues, and will provide 2nd-level support for Oracle Hospitality applications, with a focus on providing solutions and troubleshooting product defects. This team works closely with our Cloud Operations, Sustaining Engineering and Development teams.

Provide 2nd-line support globally for Oracle Hospitality customers and partners. This includes (and isn't limited to):
- OPERA V5 PMS (Property Management System)
- OPERA Cloud PMS (Property Management System)
- R&A (Report & Analytics)
- OHIP (Hospitality Integration Platform)
- OPERA OXI (Exchange Interface)
- OPERA OEDS (Electronic Distribution Suite, such as OWS, ADS, GDS, HTNG, Kiosk)

Responsibilities:
- Work very closely with other teams (L1, AMS, SE/Dev, CCSM) to deliver quality customer service.
- Keep up to date with new releases and new functionality.
- Identify and report back the root cause and resolution of major incidents to avoid recurring issues.
- Adhere to Global L2 Support standards and processes.
- Actively participate in building the Oracle Knowledgebase.
- Be committed to the delivery of outstanding service to customers.

Special Skills

OPERA V5 PMS (Property Management System):
- OPERA V5 application (all modules: EOD, Front Desk, Reports, Commissions, Profiles, Financial Imbalances, Inventory, Rates and Packages configuration, etc.)
- Know how to generate and analyze a trace
- SQL knowledge
- SFTP configuration
- Export and BOF troubleshooting
- Advanced logical troubleshooting skills in reproducing issues and researching errors

OPERA Cloud PMS (Property Management System):
- OPERA Cloud application (all modules, as listed above)
- Generate, collect and analyze in-memory logs
- SQL knowledge
- Understanding of user access using SSD and OCIM
- Advanced logical troubleshooting skills in reproducing issues and researching errors

R&A (Report & Analytics):
- OPERA Cloud application (all modules, as listed above)
- Generate, collect and analyze in-memory logs
- SQL knowledge
- Advanced logical troubleshooting skills in reproducing issues and researching errors

OHIP (Hospitality Integration Platform):
- OPERA Cloud application (all modules, as listed above)
- Understanding of how APIs work
- Knowledge of Postman and how to use it to test API calls (a scripted equivalent is sketched after this listing)
- Advanced logical troubleshooting skills in reproducing issues and researching errors

OPERA OXI (Exchange Interface):
- OPERA V5 and Cloud application (all modules, as listed above)
- Knowledge of OXI configuration and OXI processors
- Understand how to read XML files
- SQL knowledge
- Knowledge of OXI logs and troubleshooting error messages
- Knowledge of SoapUI and how to use it to test XML messages

OPERA OEDS (Electronic Distribution Suite, such as OWS, ADS, GDS, HTNG, Kiosk):
- OPERA Cloud application (all modules, as listed above)
- Knowledge of OEDS configuration, Business Event queues and configuration, and OEDS services
- SQL knowledge
- Understand how to read XML files
- Knowledge of OEDS logs and troubleshooting error messages
- Knowledge of SoapUI and how to use it to test XML messages

General:
- Proven communication and presentation skills through previous interaction with customers and peers
- Strong experience and methodology in troubleshooting and issue resolution
- Strong experience in tracking customer issues through to resolution and providing regular status updates to customers and internal teams
- Excellent diagnostic and reporting skills when reproducing and precisely documenting issues and defects
- Excellent general IT skills and strong knowledge of current technologies
- Knowledge of the hospitality and IT industry is essential
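The OHIP bullets above mention using Postman to test API calls; the same kind of check can be scripted so it runs unattended. The sketch below is hypothetical: the endpoint, path and token are placeholders, not the real OHIP API.

```python
import requests

BASE_URL = "https://api.example-hospitality.test/v1"   # placeholder endpoint
TOKEN = "replace-with-a-real-bearer-token"             # placeholder credential

def check_reservation_lookup(confirmation_no: str) -> None:
    """Send one test call and report status, latency and a snippet of the body."""
    resp = requests.get(
        f"{BASE_URL}/reservations/{confirmation_no}",
        headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"},
        timeout=15,
    )
    print(f"HTTP {resp.status_code} in {resp.elapsed.total_seconds():.2f}s")
    print(resp.text[:200])
    resp.raise_for_status()   # fail loudly on 4xx/5xx so the check can run in a pipeline

if __name__ == "__main__":
    check_reservation_lookup("12345678")
```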

Posted 2 weeks ago

Apply

3.0 years

3 - 9 Lacs

Gurgaon

On-site

Source: Glassdoor

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

A position in this function performs trading partner maintenance, including setting up MFT connections with internal and external clients and file routing; works problem tickets and performs ongoing system maintenance; requires excellent troubleshooting skills for transaction failures or file transfer issues; supports upgrades and/or migrations; deploys new code across non-production and production environments; and provides on-call production support for the service.

Primary Responsibilities:
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Undergraduate degree or equivalent experience.
- 3+ years of experience with Sterling Integrator in an application administration role, with duties including production monitoring (using various monitoring tools), performance optimization, root cause analysis, and advanced troubleshooting.
- 3+ years of experience with file transfer protocols such as FTPS, SFTP, AS2 and Connect:Direct.
- 2+ years of Linux experience, including Shell and Perl scripting.
- 2+ years of SQL experience.
- 1+ years of EDI experience.
- Experience supporting applications running on Linux.
- Flexibility towards working in shifts.

Preferred Qualifications:
- Experience with various IBM MFT products or tools: IBM Sterling Secure Proxy, IBM Sterling External Authentication Server, IBM Control Center, IBM Connect:Direct.
- RESTful APIs.
- Healthcare or insurance industry experience.
- Experience using PragmaEdge Community Manager (PCM).
- Knowledge of IBM MQ and IBM Transformation Extender.
- Knowledge of cloud technologies such as Azure, DevOps methodologies, and containers.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Hybrid

Source: Naukri

The Operations Engineer will work in collaboration with and under the direction of the Manager of Data Engineering, Advanced Analytics to provide operational services, governance, and incident management solutions for the Analytics team. This includes modifying existing data ingestion workflows, handling releases to QA and Prod, working closely with cross-functional teams, and providing production support for daily issues.

Essential Job Functions:
- Takes ownership of reported customer issues and sees problems through to resolution
- Researches, diagnoses, troubleshoots and identifies solutions to resolve customer issues
- Follows standard procedures for proper escalation of unresolved issues to the appropriate internal teams
- Provides prompt and accurate feedback to customers
- Ensures proper recording and closure of all issues
- Prepares accurate and timely reports
- Documents knowledge in the form of knowledge base tech notes and articles

Other Responsibilities:
- Be part of the on-call rotation
- Support QA and production releases, off-hours if needed
- Work with developers to troubleshoot issues
- Attend daily standups
- Create and maintain support documentation (Jira/Confluence)

Minimum Qualifications and Job Requirements:
- Proven working experience in enterprise technical support
- Basic knowledge of systems, utilities, and scripting
- Strong problem-solving skills
- Excellent client-facing skills
- Excellent written and verbal communication skills
- Experience with Microsoft Azure, including Azure Data Factory (ADF), Databricks, and ADLS (Gen2)
- Experience with system administration and SFTP
- Experience leveraging analytics team tools such as Alteryx or other ETL tools
- Experience with data visualization software (e.g. Domo, Datorama)
- Experience with SQL programming
- Experience automating routine data tasks using various software tools (e.g., Jenkins, Nexus, SonarQube, Rundeck, Task Scheduler)

Posted 2 weeks ago

Apply

7.0 years

6 - 7 Lacs

Chennai

On-site

Source: Glassdoor

Job ID R-221158 Date posted 05/29/2025 Job Title: Senior Consultant - SAP PI PO Grade - D2 Introduction to role Are you an SAP Integration professional with hands-on experience in SAP PI/PO/CPI? Do you have a passion for developing SAP integrations using different technologies? If so, we have an exciting opportunity for you! We are looking for a Senior Consultant who is adept at all aspects of the life cycle of integrations. Join us and be part of a team that drives transformational journeys forward. Accountabilities As a Senior Consultant - SAP PI PO, you will: - Take on the role of SAP PI - PO / BTP Developer with both AM & AD perceptions (DevOps) Provide regular BAU support Implement incremental changes and project work Utilize your skills and capabilities to deliver high-quality integration solutions Essential Skills/Experience Minimum 7+ Years of experience in SAP PI/PO Should have handled at least 4 Projects or Support on SAP PI/PO 7.5 Java Stack Shown at least 1 Project or on SAP CPI or SAP BTP Integration Suite Good understanding in BTP cockpit Better understanding of CPI standard processes Good understanding on CPI message Mapping standard methodologies Good working experience in API management open connectors Good knowledge on groovy script java script - PO to CPI migration skills - Working experience on various pallet options Solid grasp on cloud connector Expertise in various SAP PI/PO Tools – NWDS, ESR, ID, RWB and knowledge on SLD Within SAP PI-PO should have worked on various technical adapters like: FTP, SFTP, JDBC, IDoc, RFC, SOAP, REST, HTTP, Proxy, Mail etc. Should have strong expertise in EDI B2B Integration using B2B Addon adapters AS2 & EDI Separator Expertise in Java Mappings & Graphical mappings (including value mappings and lookups) Should have knowledge in handling security artifacts, encryption, and decryption mechanisms Desirable Skills/Experience SAP PI /PO related Java Knowledge Knowledge in SAP API Management Solid grasp in security materials, session handling, authentication methods, set up Be responsible for providing services for application interface production monitoring, job monitoring and making sure system/interfaces are up and running Experience in Certificates / Data Encryption / Data Signing ITSM & SAP SolMan ChaRM experience At AstraZeneca, we are at the forefront of digital transformation, fusing our digital and data capabilities with the support from the business to make it happen. Our team leverages leading technologies and explores data to make improved decisions, helping the business reach the right outcomes quicker. We challenge, innovate, and break away from the norm to find bold new ways of approaching everyday tasks. Empowerment and collaboration are key as we work together to positively impact patients across the world. Our diverse team of specialists continuously expands their knowledge and develops through a two-way feedback loop and novel roles. Ready to shape the future of digital healthcare with us? Apply now! AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. 
We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements. SAP PI PO - Senior Consultant Posted date May. 29, 2025 Contract type Full time Job ID R-221158 APPLY NOW Why choose AstraZeneca India? Help push the boundaries of science to deliver life-changing medicines to patients. After 45 years in India, we’re continuing to secure a future where everyone can access affordable, sustainable, innovative healthcare. The part you play in our business will be challenging, yet rewarding, requiring you to use your resilient, collaborative and diplomatic skillsets to make connections. The majority of your work will be field based, and will require you to be highly-organised, planning your monthly schedule, attending meetings and calls, as well as writing up reports. Who do we look for? Calling all tech innovators, ownership takers, challenge seekers and proactive collaborators. At AstraZeneca, breakthroughs born in the lab become transformative medicine for the world's most complex diseases. We empower people like you to push the boundaries of science, challenge convention, and unleash your entrepreneurial spirit. You'll embrace differences and take bold actions to drive the change needed to meet global healthcare and sustainability challenges. Here, diverse minds and bold disruptors can meaningfully impact the future of healthcare using cutting-edge technology. Whether you join us in Bengaluru or Chennai, you can make a tangible impact within a global biopharmaceutical company that invests in your future. Join a talented global team that's powering AstraZeneca to better serve patients every day. Success Profile Ready to make an impact in your career? If you're passionate, growth-orientated and a true team player, we'll help you succeed. Here are some of the skills and capabilities we look for. 0% Tech innovators Make a greater impact through our digitally enabled enterprise. Use your skills in data and technology to transform and optimise our operations, helping us deliver meaningful work that changes lives. 0% Ownership takers If you're a self-aware self-starter who craves autonomy, AstraZeneca provides the perfect environment to take ownership and grow. Here, you'll feel empowered to lead and reach excellence at every level — with unrivalled support when you need it. 0% Challenge seekers Adapting and advancing our progress means constantly challenging the status quo. In this dynamic environment where everything we do has urgency and focus, you'll have the ability to show up, speak up and confidently take smart risks. 0% Proactive collaborators Your unique perspectives make our ambitions and capabilities possible. Our culture of sharing ideas, learning and improving together helps us consistently set the bar higher. As a proactive collaborator, you'll seek out ways to bring people together to achieve their best. Responsibilities Job ID R-221158 Date posted 05/29/2025 Job Title: Senior Consultant - SAP PI PO Grade - D2 Introduction to role Are you an SAP Integration professional with hands-on experience in SAP PI/PO/CPI? Do you have a passion for developing SAP integrations using different technologies? If so, we have an exciting opportunity for you! We are looking for a Senior Consultant who is adept at all aspects of the life cycle of integrations. Join us and be part of a team that drives transformational journeys forward. 
Accountabilities As a Senior Consultant - SAP PI PO, you will: - Take on the role of SAP PI - PO / BTP Developer with both AM & AD perceptions (DevOps) Provide regular BAU support Implement incremental changes and project work Utilize your skills and capabilities to deliver high-quality integration solutions Essential Skills/Experience Minimum 7+ Years of experience in SAP PI/PO Should have handled at least 4 Projects or Support on SAP PI/PO 7.5 Java Stack Shown at least 1 Project or on SAP CPI or SAP BTP Integration Suite Good understanding in BTP cockpit Better understanding of CPI standard processes Good understanding on CPI message Mapping standard methodologies Good working experience in API management open connectors Good knowledge on groovy script java script - PO to CPI migration skills - Working experience on various pallet options Solid grasp on cloud connector Expertise in various SAP PI/PO Tools – NWDS, ESR, ID, RWB and knowledge on SLD Within SAP PI-PO should have worked on various technical adapters like: FTP, SFTP, JDBC, IDoc, RFC, SOAP, REST, HTTP, Proxy, Mail etc. Should have strong expertise in EDI B2B Integration using B2B Addon adapters AS2 & EDI Separator Expertise in Java Mappings & Graphical mappings (including value mappings and lookups) Should have knowledge in handling security artifacts, encryption, and decryption mechanisms Desirable Skills/Experience SAP PI /PO related Java Knowledge Knowledge in SAP API Management Solid grasp in security materials, session handling, authentication methods, set up Be responsible for providing services for application interface production monitoring, job monitoring and making sure system/interfaces are up and running Experience in Certificates / Data Encryption / Data Signing ITSM & SAP SolMan ChaRM experience At AstraZeneca, we are at the forefront of digital transformation, fusing our digital and data capabilities with the support from the business to make it happen. Our team leverages leading technologies and explores data to make improved decisions, helping the business reach the right outcomes quicker. We challenge, innovate, and break away from the norm to find bold new ways of approaching everyday tasks. Empowerment and collaboration are key as we work together to positively impact patients across the world. Our diverse team of specialists continuously expands their knowledge and develops through a two-way feedback loop and novel roles. Ready to shape the future of digital healthcare with us? Apply now! AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements. APPLY NOW Explore the local area Take a look at the map to see what’s nearby. Reasons to Join Thomas Mathisen Sales Representative Oslo, Norway Christine Recchio Sales Representative California, United States Stephanie Ling Sales Representative Petaling Jaya, Malaysia What we offer We're driven by our shared values of serving people, society and the planet. 
Our people make this possible, which is why we prioritise diversity, safety, empowerment and collaboration. Discover what a career at AstraZeneca could mean for you.

Lifelong learning
Our development opportunities are second to none. You'll have the chance to grow your abilities, skills and knowledge constantly as you accelerate your career. From leadership projects and constructive coaching to overseas talent exchanges and global collaboration programmes, you'll never stand still.

Autonomy and reward
Experience the power of shaping your career how you want to. We are a high-performing learning organisation with autonomy over how we learn. Make big decisions, learn from your mistakes and continue growing, with performance-based rewards as part of the package.

Health and wellbeing
An energised work environment is only possible when our people have a healthy work-life balance and are supported for their individual needs. That's why we have a dedicated team to ensure your physical, financial and psychological wellbeing is a top priority.

Inclusion and diversity
Diversity and inclusion are embedded in everything we do. We're at our best and most creative when drawing on our different views, experiences and strengths. That's why we're committed to creating a workplace where everyone can thrive in a culture of respect, collaboration and innovation.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

We are looking for a Data Modeller / Data Modeler to lead data architecture efforts across enterprise domains such as Sales, Procurement, Finance, Logistics, R&D, and Advanced Planning Systems (SAP/Oracle). The role involves designing scalable and reusable data models, building data lake foundations, and collaborating with cross-functional teams to deliver robust end-to-end data solutions.

Key Responsibilities
Work with business/product teams to understand processes and translate them into technical specifications. Design logical and physical data models based on Medallion Architecture, EDW, or Kimball methodologies. Source the correct grain of data from true source systems or existing DWHs. Create and manage reusable intermediary data models and physical views for reporting/consumption. Understand and implement Data Governance, Data Quality, and Data Observability practices. Develop business process maps, user journey maps, and data flow/integration diagrams. Design integration workflows using APIs, FTP/SFTP, web services, etc. Support large-scale implementation programs involving multiple projects over extended periods. Coordinate with data engineers, product owners, data modelers, governance teams, and project stakeholders.

Technical Skills
Minimum 5+ years in data-focused projects (migration, upgrade, lakehouse/DWH builds). Strong expertise in Data Modelling: Logical, Physical, Dimensional, and Vault modeling. Experience with enterprise data domains: Sales, Finance, Procurement, Supply Chain, Logistics, R&D. Tools: Erwin or similar data modeling tools. Deep understanding of OLTP and OLAP systems. Familiar with Kimball methodology, Medallion architecture, and modern Data Lakehouse patterns. Knowledge of Bronze, Silver, and Gold layer architecture in cloud platforms. Ability to read existing data dictionaries and table structures and normalize data tables effectively.

Cloud, DevOps & Integration
Familiarity with cloud data platforms (AWS, Azure, GCP) and DevOps/DataOps best practices. Experience with Agile methodologies and participation in Scrum ceremonies. Understanding of end-to-end integration needs and methods (API, FTP, SFTP, web services).

Preferred Experience
Background in Retail, CPG, or Supply Chain domains is a strong plus. Experience with data governance frameworks, quality tools, and metadata management platforms.

Skills: ftp/sftp, physical data models, data modelling, devops, data modeler, data observability, physical data modeling, cloud platforms, apis, erwin, data lakehouse, vault modeling, dimensional modeling, web services, data modeling, data governance, architecture, data quality, retail, cpg, kimball methodology, medallion architecture, olap, supply chain, logical data models, logical data modeling, integration workflows, online transaction processing (oltp)

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

We are looking for a Data Modeller / Data Modeler to lead data architecture efforts across enterprise domains such as Sales, Procurement, Finance, Logistics, R&D, and Advanced Planning Systems (SAP/Oracle). The role involves designing scalable and reusable data models, building data lake foundations, and collaborating with cross-functional teams to deliver robust end-to-end data solutions.

Key Responsibilities
Work with business/product teams to understand processes and translate them into technical specifications. Design logical and physical data models based on Medallion Architecture, EDW, or Kimball methodologies. Source the correct grain of data from true source systems or existing DWHs. Create and manage reusable intermediary data models and physical views for reporting/consumption. Understand and implement Data Governance, Data Quality, and Data Observability practices. Develop business process maps, user journey maps, and data flow/integration diagrams. Design integration workflows using APIs, FTP/SFTP, web services, etc. Support large-scale implementation programs involving multiple projects over extended periods. Coordinate with data engineers, product owners, data modelers, governance teams, and project stakeholders.

Technical Skills
Minimum 5+ years in data-focused projects (migration, upgrade, lakehouse/DWH builds). Strong expertise in Data Modelling: Logical, Physical, Dimensional, and Vault modeling. Experience with enterprise data domains: Sales, Finance, Procurement, Supply Chain, Logistics, R&D. Tools: Erwin or similar data modeling tools. Deep understanding of OLTP and OLAP systems. Familiar with Kimball methodology, Medallion architecture, and modern Data Lakehouse patterns. Knowledge of Bronze, Silver, and Gold layer architecture in cloud platforms. Ability to read existing data dictionaries and table structures and normalize data tables effectively.

Cloud, DevOps & Integration
Familiarity with cloud data platforms (AWS, Azure, GCP) and DevOps/DataOps best practices. Experience with Agile methodologies and participation in Scrum ceremonies. Understanding of end-to-end integration needs and methods (API, FTP, SFTP, web services).

Preferred Experience
Background in Retail, CPG, or Supply Chain domains is a strong plus. Experience with data governance frameworks, quality tools, and metadata management platforms.

Skills: ftp/sftp, physical data models, data modelling, devops, data modeler, data observability, physical data modeling, cloud platforms, apis, erwin, data lakehouse, vault modeling, dimensional modeling, web services, data modeling, data governance, architecture, data quality, retail, cpg, kimball methodology, medallion architecture, olap, supply chain, logical data models, logical data modeling, integration workflows, online transaction processing (oltp)

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

We are looking for a Data Modeller / Data Modeler to lead data architecture efforts across enterprise domains such as Sales, Procurement, Finance, Logistics, R&D, and Advanced Planning Systems (SAP/Oracle). The role involves designing scalable and reusable data models, building data lake foundations, and collaborating with cross-functional teams to deliver robust end-to-end data solutions.

Key Responsibilities
Work with business/product teams to understand processes and translate them into technical specifications. Design logical and physical data models based on Medallion Architecture, EDW, or Kimball methodologies. Source the correct grain of data from true source systems or existing DWHs. Create and manage reusable intermediary data models and physical views for reporting/consumption. Understand and implement Data Governance, Data Quality, and Data Observability practices. Develop business process maps, user journey maps, and data flow/integration diagrams. Design integration workflows using APIs, FTP/SFTP, web services, etc. Support large-scale implementation programs involving multiple projects over extended periods. Coordinate with data engineers, product owners, data modelers, governance teams, and project stakeholders.

Technical Skills
Minimum 5+ years in data-focused projects (migration, upgrade, lakehouse/DWH builds). Strong expertise in Data Modelling: Logical, Physical, Dimensional, and Vault modeling. Experience with enterprise data domains: Sales, Finance, Procurement, Supply Chain, Logistics, R&D. Tools: Erwin or similar data modeling tools. Deep understanding of OLTP and OLAP systems. Familiar with Kimball methodology, Medallion architecture, and modern Data Lakehouse patterns. Knowledge of Bronze, Silver, and Gold layer architecture in cloud platforms. Ability to read existing data dictionaries and table structures and normalize data tables effectively.

Cloud, DevOps & Integration
Familiarity with cloud data platforms (AWS, Azure, GCP) and DevOps/DataOps best practices. Experience with Agile methodologies and participation in Scrum ceremonies. Understanding of end-to-end integration needs and methods (API, FTP, SFTP, web services).

Preferred Experience
Background in Retail, CPG, or Supply Chain domains is a strong plus. Experience with data governance frameworks, quality tools, and metadata management platforms.

Skills: ftp/sftp, physical data models, data modelling, devops, data modeler, data observability, physical data modeling, cloud platforms, apis, erwin, data lakehouse, vault modeling, dimensional modeling, web services, data modeling, data governance, architecture, data quality, retail, cpg, kimball methodology, medallion architecture, olap, supply chain, logical data models, logical data modeling, integration workflows, online transaction processing (oltp)

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

We are looking for a Data Modeller / Data Modeler to lead data architecture efforts across enterprise domains such as Sales, Procurement, Finance, Logistics, R&D, and Advanced Planning Systems (SAP/Oracle). The role involves designing scalable and reusable data models, building data lake foundations, and collaborating with cross-functional teams to deliver robust end-to-end data solutions.

Key Responsibilities
Work with business/product teams to understand processes and translate them into technical specifications. Design logical and physical data models based on Medallion Architecture, EDW, or Kimball methodologies. Source the correct grain of data from true source systems or existing DWHs. Create and manage reusable intermediary data models and physical views for reporting/consumption. Understand and implement Data Governance, Data Quality, and Data Observability practices. Develop business process maps, user journey maps, and data flow/integration diagrams. Design integration workflows using APIs, FTP/SFTP, web services, etc. Support large-scale implementation programs involving multiple projects over extended periods. Coordinate with data engineers, product owners, data modelers, governance teams, and project stakeholders.

Technical Skills
Minimum 5+ years in data-focused projects (migration, upgrade, lakehouse/DWH builds). Strong expertise in Data Modelling: Logical, Physical, Dimensional, and Vault modeling. Experience with enterprise data domains: Sales, Finance, Procurement, Supply Chain, Logistics, R&D. Tools: Erwin or similar data modeling tools. Deep understanding of OLTP and OLAP systems. Familiar with Kimball methodology, Medallion architecture, and modern Data Lakehouse patterns. Knowledge of Bronze, Silver, and Gold layer architecture in cloud platforms. Ability to read existing data dictionaries and table structures and normalize data tables effectively.

Cloud, DevOps & Integration
Familiarity with cloud data platforms (AWS, Azure, GCP) and DevOps/DataOps best practices. Experience with Agile methodologies and participation in Scrum ceremonies. Understanding of end-to-end integration needs and methods (API, FTP, SFTP, web services).

Preferred Experience
Background in Retail, CPG, or Supply Chain domains is a strong plus. Experience with data governance frameworks, quality tools, and metadata management platforms.

Skills: ftp/sftp, physical data models, data modelling, devops, data modeler, data observability, physical data modeling, cloud platforms, apis, erwin, data lakehouse, vault modeling, dimensional modeling, web services, data modeling, data governance, architecture, data quality, retail, cpg, kimball methodology, medallion architecture, olap, supply chain, logical data models, logical data modeling, integration workflows, online transaction processing (oltp)

Posted 2 weeks ago

Apply

1.0 - 2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

About ACA
ACA Group is the leading governance, risk, and compliance (GRC) advisor in financial services. We empower our clients to reimagine GRC and protect and grow their business. Our innovative approach integrates consulting, managed services, and our ComplianceAlpha technology platform with the specialized expertise of former regulators and practitioners and our deep understanding of the global regulatory landscape.

Position Summary
The Connectors Support Analyst will play a key role in supporting ACA's native data connectors and the reliable delivery of data to the ComplianceAlpha platform. This junior-level position focuses on daily monitoring and troubleshooting of data connector logs, primarily using Airbyte, to ensure consistent and accurate data flows. The ideal candidate is detail-oriented, proactive, and eager to learn. This role requires a strong focus on customer service, clear communication, and the ability to follow documentation and checklists with precision. There will be opportunities to grow technical skills, assist with testing, and engage directly with internal teams and clients to resolve issues and enhance data integration processes.

Job Duties
Monitor and review daily logs from Airbyte and ACA's native connectors to identify and report errors. Assist with the onboarding of new data connectors by following established protocols. Collaborate with technical architects, client implementation teams, and infrastructure to maintain, troubleshoot, and improve data connectors. Update internal ticketing systems with issue progress and resolutions for both internal and client-facing cases. Create and maintain up-to-date documentation, runbooks, and troubleshooting guides. Communicate directly with clients and internal stakeholders to troubleshoot and resolve connector-related issues. Provide support for diagnostic analysis and resolution for ACA-owned software. Participate in project-related activities and ad-hoc tasks as assigned.

Required
1-2 years of experience in a technical support, QA, or data operations role. Familiarity with data integration workflows and/or data validation. Basic knowledge of SFTP, SMTP, and AWS services. Strong attention to detail and problem-solving skills. Comfortable reviewing logs or structured data (e.g., JSON). Clear and professional communication skills (written and verbal). Interest in learning new technologies and working in a fast-paced team environment.

Preferred
Experience working with Airbyte or similar ETL/data integration platforms. Exposure to SQL or scripting for data queries. Familiarity with issue tracking tools (e.g., Jira, Zendesk). Experience working directly with clients or end-users.

What Working At ACA Offers
We offer a competitive compensation package where you'll be rewarded based on your performance and recognized for the value you bring to our business. Our Total Rewards package includes medical coverage fully funded by ACA for employees and their family, as well as access to Maternity & Fertility and Wellness programs. ACA also provides Personal Accident Insurance, Group Term Life Insurance, Employee Discount programs and Employee Resource Groups. You'll be granted time off for designated ACA Paid Holidays, Privilege Leave, Casual/Sick Leave, and other leaves of absence to support your physical, financial, and emotional well-being.
What We Commit To
ACA is firmly committed to a policy of nondiscrimination, which applies to recruiting, hiring, placement, promotions, training, discipline, terminations, layoffs, transfers, leaves of absence, compensation and all other terms and conditions of employment. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected status.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

The position in this function performs trading partner maintenance, including setting up MFT connections with internal and external clients and file routing. It works problem tickets and performs ongoing system maintenance, requires excellent troubleshooting skills for transaction failures and file transfer issues, supports upgrades and/or migrations, handles deployment of new code across Non-Prod and Production environments, and provides on-call production support of the service.

Primary Responsibilities
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
Undergraduate degree or equivalent experience. 3+ years of experience with Sterling Integrator in an application administration role, with duties including production monitoring using various monitoring tools, performance optimization, root cause analysis, and advanced troubleshooting. 3+ years of experience with file transfer protocols such as FTPS, SFTP, AS2 and Connect:Direct. 2+ years of Linux experience including Shell and Perl scripting. 2+ years of SQL experience. 1+ years of EDI experience. Experience supporting applications running on Linux. Flexibility towards working in shifts.

Preferred Qualifications
Experience with various IBM MFT products or tools: IBM Sterling Secure Proxy, IBM Sterling External Authentication Server, IBM Control Center, IBM Connect:Direct. RESTful APIs. Healthcare or insurance industry experience. Experience using PragmaEdge Community Manager (PCM). Knowledge of IBM MQ and IBM Transformation Extender. Knowledge of cloud technologies like Azure, DevOps methodologies, and containers.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

1 - 5 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Application Tech Support Practitioner
Project Role Description: Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge.
Must have skills: IBM Sterling B2B Integrator
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

We are seeking a skilled and motivated Sterling File Gateway (SFG) / Sterling Integrator Specialist to join our integration support and operations team. The ideal candidate will bring strong experience in managing secure file transfers, working across mainframe and distributed environments, and supporting Connect:Direct implementations. This role requires technical depth, hands-on troubleshooting, and the ability to collaborate effectively across various teams.

Key Responsibilities:
Manage and support Sterling File Gateway (SFG) and Sterling B2B Integrator environments. Configure and maintain Connect:Direct (C:D) across Mainframe and Distributed (Linux/AIX) platforms. Perform installation and configuration of Connect:Direct and Secure+, including key exchange and encryption policy setup. Monitor and troubleshoot file transfer issues, ensuring secure and timely delivery of business-critical files. Execute and support daily operational tasks including partner onboarding, certificate/key management, and flow setup. Utilize Linux and AIX command-line tools to perform configuration changes, system checks, and file system maintenance. Participate in incident and problem management processes, including root cause analysis and permanent resolution strategies. Collaborate with infrastructure, network, and application teams for end-to-end solution delivery and maintenance.

Required Skills:
5+ years of experience working with Sterling File Gateway (SFG) and Sterling Integrator (SI). Proven experience working with Connect:Direct in Mainframe (z/OS) and Distributed (Linux/AIX) environments. Strong hands-on skills in Linux and AIX command-line administration. Experience in installing and configuring Connect:Direct and setting up Secure+ configurations. Understanding of file transfer protocols (SFTP, FTPS, HTTPS) and related security practices. Good communication and documentation skills to interface with internal teams and external partners.

Preferred Qualifications:
Exposure to monitoring tools and job scheduling systems such as ESP. Familiarity with ticketing tools like ServiceNow or Remedy.

Qualification
15 years full time education

Posted 2 weeks ago

Apply

15.0 - 20.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Infra Tech Support Practitioner
Project Role Description: Provide ongoing technical support and maintenance of production and development systems and software products (both remote and onsite) and for configured services running on various platforms (operating within a defined operating model and processes). Provide hardware/software support and implement technology at the operating system level across all server and network areas, and for particular software solutions/vendors/brands. Work includes L1 and L2 (basic and intermediate level) troubleshooting.
Must have skills: Automation in Systems Integration
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Infra Tech Support Practitioner, you will engage in the ongoing technical support and maintenance of production and development systems and software products. Your typical day will involve addressing technical issues, providing solutions, and ensuring the smooth operation of various platforms. You will work both remotely and onsite, collaborating with team members to troubleshoot and resolve hardware and software challenges while adhering to defined operating models and processes. Your role will be crucial in maintaining the integrity and performance of systems across server and network areas, contributing to the overall efficiency of the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of technical processes and procedures to enhance team knowledge.
- Engage in continuous learning to stay updated with the latest technologies and best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Automation in Systems Integration and MFT Administration (MOVEit)
- Strong understanding of system integration processes and methodologies.
- Experience with troubleshooting and resolving hardware and software issues.
- Familiarity with various operating systems and network configurations.
- Ability to implement and maintain automated solutions for system management.

Additional Information:
- The candidate should have minimum 2 years of experience in Automation in Systems Integration and MFT Administration (MOVEit).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification
15 years full time education

Posted 2 weeks ago

Apply

15.0 - 20.0 years

9 - 14 Lacs

Chennai

Work from Office

Naukri logo

Project Role: User Experience Lead
Project Role Description: Lead prototype work and other software engineering solutions that create an optimized user experience. Translate design concepts to prototype solutions as quickly and tangibly as possible, with a balanced understanding of technical feasibility implications and design intent.
Must have skills: Boomi AtomSphere Integration
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a User Experience Lead, you will lead prototype work and other software engineering solutions that create an optimized user experience. Your typical day will involve collaborating with cross-functional teams to translate design concepts into tangible prototype solutions, ensuring a balance between technical feasibility and design intent. You will engage in discussions to refine user experience strategies and provide guidance to team members, fostering an environment of creativity and innovation while addressing any challenges that arise during the development process.

Key Responsibilities:
  1. Design and implement complex integration solutions using Dell Boomi, with a primary focus on file-based and API-based integrations.
  2. Perform Boomi platform administration, including deployment management, environment configuration, and performance tuning.
  3. Develop and maintain integration logic using Groovy and JavaScript, including data transformations across XML, CSV, and JSON formats.
  4. Work with relational databases, performing read/write operations as part of integration workflows.
  5. Implement data validation logic using Boomi Business Rules and Decision components.
  6. Build and manage batch processes, Atom Queues, JMS messaging, and SFTP integrations.
  7. Handle technical escalations effectively, providing timely resolutions and root cause analysis.
  8. Drive innovation and automation in integration designs to improve efficiency and reusability.
  9. Utilize Boomi API Management capabilities for defining and managing APIs, including security policies and authentication mechanisms.
  10. Implement robust error handling, logging, and exception management strategies to ensure observability and maintainability.
  11. Experience with Managed File Transfer (MFT) tools is a plus.

Professional Experience:
  1. Minimum 7 years of overall IT experience with at least 5 years of hands-on experience working on Dell Boomi integration projects.
  2. Proven expertise in designing and delivering complex process orchestrations using Dell Boomi.
  3. Hands-on experience with Web Services (SOAP/REST), XML, XSLT, and integration-related technologies.
  4. Strong understanding of enterprise integration concepts and patterns (synchronous/asynchronous, pub-sub, request-reply, etc.).
  5. Dell Boomi Process Developer Certification Level I is required; Level II certification is a strong advantage.

Additional Information:
- The candidate should have minimum 5 years of experience in Boomi AtomSphere Integration.
- This position is based in Chennai.
- A 15 years full time education is required.

Qualification
15 years full time education

Posted 2 weeks ago

Apply

10.0 - 15.0 years

3 - 6 Lacs

Pune

Work from Office

Naukri logo

Project Role: Operations Engineer
Project Role Description: Support the operations and/or manage delivery for production systems and services based on operational requirements and service agreement.
Must have skills: Managed File Transfer
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Key Responsibilities
Experience in defining, developing and delivering data transmission solutions, enterprise-level FTP and Fax products like System Center Orchestrator (SCOM/SCORCH) and the RightFax utility. Running, maintaining and administrating all aspects of the Microsoft System Center Orchestrator tool and RightFax servers. Work with internal and external customers to configure and troubleshoot file transfer automations wherever possible; MS Orchestrator experience is highly desirable. Must have a basic understanding of UNIX fundamentals, standard command knowledge and shell scripting. Experience performing maintenance and administrative tasks within an application. Proficient in technical skills or knowledge in position-related areas, and the ability to keep current with that knowledge and trends in areas of expertise. Ability to create and document instructions so that others can follow them. A strong technical aptitude and strong problem-solving skills. Manage work queues and track user issues and requests; complete work queue items according to procedures. Ability to manage multiple tasks including project tasks, deadlines, etc. Ability to interact with vendors for support and information. Excellent communication skills, including verbal, written and presentation, are necessary. Must be customer service oriented, flexible and adaptable.

Technical Experience
Configure, implement and support data transmission solutions, enterprise-level FTP and the RightFax Fax product. Configure secure file transfer (SFTP) definitions to deliver files to the intended destination. Experience with the RightFax application utility and FTP tools like Ipswitch, FileZilla, WinSCP, etc. Should have experience in SQL administration, application monitoring and IAM (Identity and Access Management). 6+ years of experience in Redwood JSCAPE Managed File Transfer. 10+ years of experience in any enterprise MFT tool. 10+ years of experience with databases and basic SQL syntax including inserts, updates, deletes and joins. 10+ years of experience with file transfer protocols (FTP, SFTP/SCP, HTTP, Connect:Direct). Configure SFTP definitions to securely deliver files to the intended destination. 10+ years of experience with Linux/Unix commands. Ability to understand PGP in detail. Basic scripting skills required (shell script). Good to have experience in application maintenance, application patching and certificate management. Identify, isolate, and troubleshoot issues and drive to RCA. Basic networking knowledge including TCP/IP.

Professional Attributes
Good verbal and written communication skills to connect with customers and stakeholders at varying levels of the organization. Technical documentation skills. Able to prioritize and execute tasks in a high-pressure environment. Ready to work in 24*7 shifts.

Educational Qualification
BTech/BE in CS or Electronics & Communication.

Additional Information
Good to have: ITIL V3/V4 Foundation certified, AWS or Azure certification.

Qualification
15 years full time education

Posted 2 weeks ago

Apply

3.0 - 6.0 years

11 - 15 Lacs

Gurugram

Work from Office

Naukri logo

About NCR Atleos
NCR Atleos, headquartered in Atlanta, is a leader in expanding financial access. Our dedicated 20,000 employees optimize the branch, improve operational efficiency and maximize self-service availability for financial institutions and retailers across the globe.

Oracle Fusion/SOA-OSB Development and Administration and OIC (Oracle Integration Cloud)

Position Summary: The Oracle Integration Cloud (OIC) Developer role involves hands-on development and integration work using the Oracle Integration Cloud platform. The position requires strong technical skills in OIC, REST/SOAP web services, and data transformation, along with effective communication abilities to collaborate with functional teams and stakeholders.

Key Areas of Responsibility:
Review functional design documents and translate them into technical specs. Perform code prototypes/POCs. Create tech designs, then build and test OIC interfaces using OIC middleware toolsets. Build integrations on the OIC platform using REST/SOAP services and XML/JSON payloads. Develop and transform data using XSLT. Develop database packages and functions using SQL/PLSQL. Leverage pre-built integrations and various adapters within OIC. Manage security, encryption, and scheduling capabilities in OIC. Handle errors and notifications effectively. Identify, triage, and resolve errors and defects independently or collaboratively.

Skill Set Requirements
Strong experience in designing and developing business processes and SOA-based integration architecture. Understanding of different communication protocols like HTTPS, REST, SOAP, SFTP, etc. Hands-on experience with OIC (Oracle Integration Cloud) development; should have worked on cloud integrations and on-prem Oracle Fusion SOA products (BPEL, OSB, Mediator, Adapters, SOA Server). Developing and deploying composites using BPEL, Mediator, and SOA adapters following standard AIA architecture. Hands-on experience in implementing Proxy Service, Business Service, and proxy pipelines. SOA/middleware architecture knowledge. Core Java and XSLT. Unix shell scripting. Oracle PL/SQL and database knowledge. Oracle Visual/Workflow Builder. Perform certificate renewal changes to ensure connectivity exists with integrating systems. Perform system upgrades, security patching, and migration activities. Excellent analytical and problem-solving skills.

Offers of employment are conditional upon passage of screening criteria applicable to the job.

EEO Statement
NCR Atleos is an equal-opportunity employer. It is NCR Atleos policy to hire, train, promote, and pay associates based on their job-related qualifications, ability, and performance, without regard to race, color, creed, religion, national origin, citizenship status, sex, sexual orientation, gender identity/expression, pregnancy, marital status, age, mental or physical disability, genetic information, medical condition, military or veteran status, or any other factor protected by law.

Statement to Third Party Agencies
To ALL recruitment agencies: NCR Atleos only accepts resumes from agencies on the NCR Atleos preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Atleos employees, or any NCR Atleos facility. NCR Atleos is not responsible for any fees or charges associated with unsolicited resumes.

Posted 2 weeks ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

NVIDIA is hiring a Senior Integration Engineer with an emphasis on developing and scaling Enterprise Integration and API platforms. We are building a developer-focused Unified Integration framework for flawless system/application connectivity to enable self-service, with a focus on datasets for AI application development. Together, we will advance NVIDIA's capacity to build and deploy leading solutions for a broad range of AI-based applications such as autonomous vehicles, healthcare, virtual reality, graphics engines, and visual computing. Together, with NVIDIA partners, we will bring autonomous vehicles to life!

What You'll Be Doing
Develop the roadmap and framework for Enterprise Integrations and the B2B and API ecosystem. Lead and collaborate with multi-functional teams to champion a cohesive Integration & API strategy. Build scalable and distributed integration pipelines that will help power the Unified Enterprise Integration Platform. Architect and design integrations supporting a large volume of applications and systems with real-time enablement, streaming, and event-driven architecture. Be well versed in business processes for Operations, the Order-to-Cash process, Planning, Supply Chain, Finance and Master Data Management integrations, as well as IaaS, PaaS and SaaS. Collaborate with various Product/Engineering teams to comprehend their data and computing requirements (SW, HW, Automobile, AI) and integrate innovative algorithms into production systems. Automate all aspects of measuring, testing, updating, monitoring, and alerting.

What We Need To See
Bachelors (or equivalent experience) or Masters in Computer Science or a related Engineering degree. 12+ years of proven experience with Enterprise Integration/API frameworks, designing and developing software with micro-services, integration patterns and reusable artifacts across heterogeneous applications, systems and external partners; should have expert-level knowledge of at least one primary integration framework such as SAP CPI / Integration Suite, API Management, Advanced Event Mesh, MuleSoft, or an MFT platform such as GoAnywhere. Worked on importing and validating security objects such as certificates, private/public keys and SSH keys for SFTP/FTP, and encryption/decryption using PGP/GPG, with a solid understanding of SOA, eSOA, APIs, web services, IIS, and REST/SOAP. Deep understanding of scripting languages such as PowerShell, and very strong Java skills. Experience in development of iFlows using various adapters (REST, SOAP, OData, SFTP, MQ, RNIF, etc.). Strong understanding of all integration flow components such as Splitter-Gather, Content Modifier, Enricher, Encryption/Signing, Persistence steps, etc. Experience in writing Groovy and JavaScript for handling sophisticated transformation needs, and in using Integration Content Advisor to build EDI integrations. Extensive knowledge of API proxies and security policies such as OAuth 2.0, JWT tokens, Access Control, Quota, Raise Fault, etc. Experience in crafting resources and developing custom adapters in Open Connectors and using them to develop integration flows in CPI. Proficient in building CI/CD frameworks using tools such as Git, Jenkins and JFrog, and Splunk for log monitoring; some experience with message brokers would be advantageous.
Ways To Stand Out From The Crowd
Have worked with multiple cloud, SaaS and EDI systems across the Hi-Tech/Semiconductor industry. Build real-time integrations with a zero-downtime mindset, extending beyond applications to infrastructure and Active Directory. Prior experience demonstrating knowledge of the Order-to-Cash process, including order booking, credit risk assessment, fulfillment, invoicing, revenue recognition, entitlements and renewals. Certification in integration domains.

NVIDIA is widely considered to be one of the technology world's most desirable employers. We have some of the most forward-thinking and hardworking people on the planet working for us. If you're creative and autonomous, we want to hear from you!

JR1997367

Posted 2 weeks ago

Apply

Exploring sftp Jobs in India

The job market for sftp professionals in India is growing steadily, with many companies looking to hire skilled individuals who can manage secure file transfer protocols effectively. If you are a job seeker interested in pursuing a career in sftp, this article will provide you with valuable insights into the job market, salary ranges, career progression, related skills, and interview questions for sftp roles in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

In India, the average salary range for sftp professionals varies based on experience levels. Entry-level positions may offer salaries ranging from INR 3-5 lakhs per annum, while experienced professionals can earn between INR 8-15 lakhs per annum.

Career Path

A typical career progression in the sftp field may include roles such as Junior Developer, Senior Developer, Tech Lead, and eventually moving into management positions such as Project Manager or IT Director.

Related Skills

In addition to sftp expertise, professionals in this field are often expected to have knowledge of networking protocols, cybersecurity principles, Linux operating systems, scripting languages (such as Python or Bash), and cloud computing platforms.
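Because scripting skills come up so often alongside sftp work, here is a minimal, hedged sketch of automating a single SFTP upload in Python with the widely used paramiko library. The hostname, username, key path, and file paths are illustrative placeholders, not references to any real system.

```python
# Minimal sketch of an automated SFTP upload using paramiko.
# The host, username, key path, and file paths are illustrative placeholders.
import os
import paramiko

def upload_file(host, username, key_path, local_path, remote_path):
    """Connect over SSH with key-based auth and upload one file via SFTP."""
    client = paramiko.SSHClient()
    # For a quick sketch we auto-accept unknown host keys; in production you
    # would load a known_hosts file and reject unexpected host keys instead.
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(hostname=host, username=username, key_filename=key_path)
    try:
        sftp = client.open_sftp()
        try:
            sftp.put(local_path, remote_path)
            # Basic integrity check: compare remote and local file sizes.
            if sftp.stat(remote_path).st_size != os.path.getsize(local_path):
                raise IOError("Uploaded file size does not match the local file")
        finally:
            sftp.close()
    finally:
        client.close()

if __name__ == "__main__":
    upload_file("sftp.example.com", "batchuser",
                "/home/batchuser/.ssh/id_rsa",
                "daily_report.csv", "/inbound/daily_report.csv")
```

In practice a script like this is usually wrapped in a scheduler (cron, Airflow, or an MFT tool) and extended with logging and retries, but the core pattern of connect, transfer, verify, and clean up stays the same.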

Interview Questions

  • What is sftp and how does it differ from ftp? (basic)
  • Explain the key components of a secure file transfer protocol. (medium)
  • How would you troubleshoot connectivity issues while using sftp? (medium)
  • Can you explain the process of setting up public key authentication for sftp? (medium) (see the code sketch after this list)
  • What are some best practices for securing sftp connections? (medium)
  • How do you monitor sftp server performance and ensure optimal operation? (medium)
  • Describe a scenario where you had to handle a security breach in an sftp environment. How did you address it? (advanced)
  • What tools or software do you use to automate sftp processes? (medium)
  • How do you handle large file transfers efficiently in sftp? (medium)
  • Can you explain the difference between sftp and scp? (basic)
  • How do you ensure data integrity and confidentiality in sftp transfers? (medium)
  • What are some common challenges you have faced while working with sftp and how did you overcome them? (medium)
  • How do you stay updated with the latest trends and developments in the sftp field? (basic)
  • Have you ever integrated sftp with any other systems or applications? If so, can you describe the process? (medium)
  • What security measures do you implement to protect sensitive data during sftp transfers? (medium)
  • How do you handle errors or failures during file transfers in sftp? (medium)
  • Can you explain the difference between passive and active mode in ftp, and why this distinction does not apply to sftp? (basic)
  • What are some ways to optimize sftp performance for large-scale file transfers? (medium)
  • How do you ensure compliance with regulatory requirements when using sftp for data transfers? (medium)
  • Have you ever worked on automating sftp workflows using scripting languages? If so, can you provide an example? (medium)
  • How do you handle authentication and authorization in an sftp environment? (medium)
  • What considerations do you take into account when designing a secure sftp architecture? (advanced)
  • How do you troubleshoot issues related to file permissions in an sftp setup? (medium)
  • Can you explain the role of encryption in securing sftp connections? (medium)
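For the public key authentication and error-handling questions above, the sketch below shows one common pattern in Python with paramiko: load a private key explicitly, authenticate with it instead of a password, and retry a download only on transient transport errors. The server name, account, key file, and paths are hypothetical placeholders, and a real deployment would pin known host keys rather than auto-accepting them.

```python
# Hedged sketch: key-based SFTP authentication with simple retry handling.
# Host, user, key file, and file paths are placeholders for illustration only.
import time
import paramiko

def download_with_retry(host, username, key_path, remote_path, local_path, attempts=3):
    # Load the private key explicitly (use paramiko.Ed25519Key for ed25519 keys).
    key = paramiko.RSAKey.from_private_key_file(key_path)
    for attempt in range(1, attempts + 1):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin host keys in production
        try:
            client.connect(hostname=host, port=22, username=username, pkey=key, timeout=30)
            sftp = client.open_sftp()
            try:
                sftp.get(remote_path, local_path)
                return  # success, stop retrying
            finally:
                sftp.close()
        except paramiko.AuthenticationException:
            # A rejected key will not succeed on retry, so fail immediately.
            raise
        except (paramiko.SSHException, OSError):
            if attempt == attempts:
                raise
            time.sleep(5 * attempt)  # simple backoff before the next attempt
        finally:
            client.close()
```

On the server side, the matching public key would typically be placed in the account's authorized_keys file; the private key never leaves the client.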

Closing Remark

As you explore sftp job opportunities in India, remember to showcase your expertise in secure file transfer protocols, stay updated with industry trends, and prepare thoroughly for interviews. By honing your skills and demonstrating your capabilities confidently, you can excel in sftp roles and advance your career in the field. Good luck on your job search journey!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies