Home
Jobs
2705 Job openings at Virtusa
About Virtusa

Virtusa is a global provider of digital business strategy, digital engineering, and technology transformation services for global enterprises.

RPA Blue Prism Developer

Gurgaon / Gurugram, Haryana, India

11 - 15 years

INR 11.0 - 15.0 Lacs P.A.

On-site

Full Time

Skill: RPA Blue Prism Developer
Role: T2, T1

Overview
We are seeking an experienced Blue Prism Developer to join our dynamic team. The ideal candidate will have a strong background in Blue Prism, .NET programming, and PowerShell, along with a solid understanding of reusability concepts and error handling. This role suits someone who is innovative, learns new technologies quickly, and can apply their technical knowledge and experience effectively.

Key Responsibilities
• Design, build, and test applications using Blue Prism and one other programming language
• Manage process scheduling and monitor processes via the control room
• Use excellent debugging skills to troubleshoot and resolve issues efficiently
• Implement and manage data gateways
• Provide guidance and support to junior team members
• Ensure adherence to best practices in error handling and reusability
• Work with monitoring tools like Sumo and Apps for optimal performance
• Support and maintain applications, including production support and on-call activities

Required Qualifications
• 6 - 12 years of strong technical expertise in Blue Prism and one programming language
• Blue Prism certification
• Strong understanding of PowerShell capabilities
• Proficiency with SOAP and REST APIs
• Good knowledge of database management (SQL Server) and the ability to write DB queries
• Experience with DevOps and agile ecosystems
• Basic programming skills and a good understanding of advanced Blue Prism functions

Preferred Qualifications
• Exposure to monitoring and alerting tools such as Sumo or Dynatrace
• Domain/functional knowledge in BFSI

Personal Attributes
• Innovative thinker with the ability to learn and use new technologies effectively
• Strong problem-solving skills and excellent debugging capabilities
• Ability to work independently and as part of a team
• Excellent communication and interpersonal skills

Mainframe Senior Lead

Chennai, Tamil Nadu, India

10 - 12 years

INR 10.0 - 12.0 Lacs P.A.

On-site

Full Time

The candidate should have experience in the following:
• Defining, developing, and leading an interactive team to implement scalable application solutions
• Mainframe application development, automation, and support using COBOL, CICS, DB2, JCL, VSAM, MQ-Series, and SQL
• Mainframe performance tuning and capacity planning, including ways to reduce I/Os, CPU time, MSUs, and MIPS, using Strobe, iStrobe, and Platinum Detector
• File formatting and comparison using File-AID; version management using Endevor; debugging and unit testing using InterTest and Abend-AID; job scheduling using Control-M
• Knowledge of MQ is a plus

The candidate should have at least 7+ years of application development experience, preferably in Banking and Financial Services, and experience with a structured SDLC (Software Development Life Cycle) process: analysis, design, development, testing, and production implementation. Also expected: knowledge of QA procedures, guidelines, and controls; critical thinking, problem solving, and business-requirements translation; clear and concise verbal and written communication; and knowledge of development methodologies.

Qualifications: MCA / B.Tech / Any Graduate

Senior Data Engineer

Mumbai, Maharashtra, India

8 - 10 years

INR 8.0 - 10.0 Lacs P.A.

On-site

Full Time

Senior Developer with 8 to 10 years of experience and particular emphasis on Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Good knowledge of data warehouse tools is needed to understand the existing system. The candidate should also have experience with data lakes, Teradata, and Snowflake, and should be proficient in Terraform.
• 8-10 years of experience designing and developing Python and PySpark applications
• Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools
• Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
• Sound knowledge of data lake concepts and the ability to work on data migration projects
• Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues
• Expertise in practices such as Agile, peer reviews, and CI/CD pipelines

Talend + Snowflake

Chennai, Tamil Nadu, India

5 - 8 years

INR 5.0 - 8.0 Lacs P.A.

On-site

Full Time

Talend
• Design, develop, and document existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.

AWS / Snowflake
• Design, develop, and maintain data models using SQL and Snowflake / AWS Redshift-specific features.
• Collaborate with stakeholders to understand the requirements of the data warehouse.
• Implement data security, privacy, and compliance measures.
• Perform data analysis, troubleshoot data issues, and provide technical support to end users.
• Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
• Stay current with new AWS/Snowflake services and features and recommend improvements to existing architecture.
• Design and implement scalable, secure, and cost-effective cloud solutions using AWS / Snowflake services.
• Collaborate with cross-functional teams to understand requirements and provide technical guidance.

Network Engineer

Hyderabad / Secunderabad, Telangana, India

3 - 6 years

INR 3.0 - 6.0 Lacs P.A.

On-site

Full Time

• Develop and execute network testing strategies to validate performance, security, and reliability.
• Perform functional, performance, and security testing of network infrastructure, including OLTs and network management systems such as Axiros.
• Test and troubleshoot fiber optic networks, including fiber optic cables, splicing, connections, and related testing equipment.
• Analyze and resolve network issues related to IP routing, TCP/IP, BGP, OSPF, and EIGRP.
• Automate network testing processes using Python or Java to improve efficiency and accuracy.
• Collaborate with network engineers to enhance network monitoring and management tools.
• Experience configuring and troubleshooting routers, switches, and firewalls.
• Strong analytical and problem-solving skills with a proactive approach to debugging complex network issues.

AWS Data Engineer

Chennai, Tamil Nadu, India

5 - 10 years

INR 5.0 - 10.0 Lacs P.A.

On-site

Full Time

Job Title: Senior Data Engineer

Key Responsibilities
As a Senior Data Engineer, you will:
• Data Pipeline Development: Design, build, and maintain scalable data pipelines using PySpark and Python.
• AWS Cloud Integration: Work with AWS cloud services (S3, Lambda, Glue, EMR, Redshift) for data ingestion, processing, and storage.
• ETL Workflow Management: Implement and maintain ETL workflows using DBT and orchestration tools (e.g., Airflow).
• Data Warehousing: Design and manage data models in Snowflake, ensuring performance and reliability.
• SQL Optimization: Utilize SQL for querying and optimizing datasets across different databases.
• Data Integration: Integrate and manage data from MongoDB, Kafka, and other streaming or NoSQL sources.
• Collaboration & Support: Collaborate with data scientists, analysts, and other engineers to support advanced analytics and machine learning (ML) initiatives.
• Data Quality & Governance: Ensure data quality, lineage, and governance through best practices and tools.

Mandatory Skills & Experience
• Strong programming skills in Python and PySpark.
• Hands-on experience with AWS data services (S3, Lambda, Glue, EMR, Redshift).
• Proficiency in SQL and experience with DBT for data transformation.
• Experience with Snowflake for data warehousing.
• Knowledge of MongoDB, Kafka, and data streaming concepts.
• Good understanding of data architecture, data modeling, and data governance.
• Familiarity with large-scale data platforms.

Essential Professional Skills
• Excellent problem-solving skills.
• Ability to work independently or as part of a team.
• Experience with CI/CD and DevOps practices in a data engineering environment (a plus).

Qualifications
• Proven hands-on experience working with large-scale data platforms.
• Strong background in Python, PySpark, AWS, and modern data warehousing tools such as Snowflake and DBT.
• Familiarity with NoSQL databases like MongoDB and real-time streaming platforms like Kafka.

Java Developer (EFM RFP)

Pune, Maharashtra, India

5 - 7 years

INR 5.0 - 7.0 Lacs P.A.

On-site

Full Time

• 5+ years of experience in software development with an end-to-end understanding of the development process
• Strong hands-on experience in Java, Spring Boot, and Spring Data
• Good experience with MQ
• Good experience with Oracle DB
• Good understanding of DevOps tools and processes
• Able to work independently
• Design and develop scalable and resilient systems using Java or Python, contributing to continual, iterative improvements for product teams
• Execute software solutions: design, development, and technical troubleshooting
• Identify hidden problems and patterns in data and use these insights to drive improvements to coding hygiene and system architecture
• Gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems

Key skills: Java, Spring Boot, SQL, MQ

Java Developer

Hyderabad / Secunderabad, Telangana, India

5 - 7 years

INR 2.5 - 9.5 Lacs P.A.

On-site

Full Time

• Design, develop, and maintain Java applications using the Spring Boot framework.
• Develop and integrate RESTful APIs following microservices architecture and design patterns.
• Work with distributed systems such as Kafka for messaging and event-driven architecture.
• Perform database interactions using Spring JPA, Hibernate, and RDBMS (Oracle, MSSQL).
• Collaborate with DevOps to containerize services using Docker, deploy using Kubernetes, and automate CI/CD pipelines using Jenkins.
• Participate in debugging, testing, and application performance tuning using modern IDEs and tools.

Java Developer (FDN)

Pune, Maharashtra, India

3 - 6 years

INR 3.0 - 6.0 Lacs P.A.

On-site

Full Time

• Design and develop scalable front-end software solutions (e.g., Angular components) using Agile methodology
• Partner with various technology teams to ensure a smooth transition through the development cycle
• Build automated unit test cases
• Leverage CI/CD principles and techniques to deliver quality code
• Work with development teams, project managers, and product managers to develop software solutions
• Strong skills in Angular 14 and above
• Good understanding of Java fundamentals

Java Developer

Bengaluru / Bangalore, Karnataka, India

5 - 9 years

INR 2.5 - 9.5 Lacs P.A.

On-site

Full Time

• 8+ years of software development experience in Java, Spring Boot, and microservices
• Worked with the Spring Boot framework and implemented REST APIs
• Experience in CI/CD automation using Jenkins; familiar with Agile or other rapid application development methods
• Participate in SAFe Agile scrum ceremonies
• Involved in application development, unit testing, and deploying applications in various environments to deliver working software
• Experience with distributed (multi-tiered) systems and relational databases
• Demonstrated experience with object-oriented design and coding in a variety of languages

Key skills: Java, Spring Boot, REST API, Microservices, any database knowledge (SQL, NoSQL), CI/CD

Functional QA Lead

Bengaluru / Bangalore, Karnataka, India

8 - 12 years

INR 8.0 - 12.5 Lacs P.A.

On-site

Full Time

We are looking for a detail-oriented and experienced QA Functional Tester with 8 to 12 years of experience, a strong background in payments messaging systems (e.g., SWIFT, ISO 20022), and hands-on experience with Mainframe environments (e.g., COBOL, JCL, DB2). The ideal candidate will be responsible for creating and executing functional test cases to ensure high-quality delivery of payment processing applications in a Mainframe-based architecture.

• Analyze functional requirements and translate them into comprehensive test scenarios and cases.
• Perform functional, regression, and integration testing for payment systems (SWIFT, ISO 20022, SEPA, ACH, etc.).
• Validate message formats, routing rules, and compliance with payment standards.
• Conduct testing on Mainframe-based applications, including job execution and data validation.
• Use tools such as JIRA, HP ALM/QC, or similar to manage test plans, defects, and reporting.
• Collaborate closely with business analysts, developers, and stakeholders to understand business processes and deliver test coverage.
• Participate in defect triage and provide accurate test reports and metrics.
• Support UAT and post-deployment verification in production-like environments.

• Good hands-on experience in QA functional testing.
• Strong knowledge of the payments domain, especially SWIFT, ISO 20022, SEPA, or domestic clearing protocols.
• Experience testing Mainframe applications (COBOL, JCL, DB2, CICS).
• Familiarity with message validation and file-based interfaces (XML, MX/MT messages, etc.).
• Solid understanding of the test lifecycle and methodologies (Agile/Waterfall).
• Experience with test management tools such as JIRA, HP ALM, or Zephyr.
• Strong SQL/database query skills.
• Excellent communication and documentation skills.

Web API

Gurgaon / Gurugram, Haryana, India

8 - 10 years

INR 8.0 - 10.0 Lacs P.A.

On-site

Full Time

Qualification
• Capability in test automation and setting up frameworks on multiple application types and platforms: web, microservice/API, and mobile
• Knowledge and experience in MS Dynamics, ETL testing, UNIX, and Azure
• Knowledge and experience implementing testing fundamentals such as TDD, BDD, and Scrum
• Hands-on experience with Cucumber and Gherkin
• Experience working in an agile environment, with at least foundational experience of digital infrastructure, configuration management, continuous integration, and automated software releases
• Deep knowledge of and insight into testing best practices and the implementation of unit, functional, integration, and regression testing, tooling, and frameworks covering functional requirements
• Competent in working as part of a Scrum team, shaping user stories and identifying acceptance criteria and key test scenarios
• Skilled in automating functional testing in sprint and maintaining automated regression packages
• Proficient in applying industry best practices and fostering knowledge sharing to provide solutions for complex business problems
• Experienced in working in cross-functional delivery squads with product owners, solution engineers, solution architects, and other quality engineers
• Proven ability to produce and automate appropriate test artefacts, i.e. test plans, approaches, and summary reports

Technical Requirements (minimum of 3+ years of experience in the following)
• Strong commercial experience with Java, Selenium, and Serenity (or equivalent), all within multi-tiered environments
• Experience with test automation of RESTful APIs and manual testing using Postman
• Experience in test automation within a continuous integration environment (CI/CD pipelines, ideally Jenkins)
• Expertise in native mobile app test automation for iOS and Android
• Solid experience using Appium with a UI locator strategy for native and hybrid mobile apps
• Fully proficient with Apple Xcode and Android Studio for setting up test automation
• Manual testing of mobile apps on varied devices using emulators or physical devices

Nice to have
• Strong understanding of user acceptance testing
• Experience in NFT and shift-left performance testing
• Experience of automated software releases, configuration management, and system management in a high-availability cloud environment; containerization experience highly desirable
• Experience working with build tools such as Maven or Gradle
• Experience with compatibility testing tools (ideally BrowserStack)

Key skills
• Computer Science and/or Engineering degrees are preferred; other degree subjects may be considered
• Highly effective communication skills, working with all levels of the organization
• Ability to thrive in a fast-paced, collaborative environment
• Problem-solving ability
• Relentless focus on delivering business value through sound engineering methods and principles

Java architect

Hyderabad / Secunderabad, Telangana, India

7 - 10 years

INR 3.0 - 11.5 Lacs P.A.

On-site

Full Time

• Develop high-quality Java applications using Java 8+ and modern frameworks such as Spring Boot and Spring Cloud
• Work with Camunda BPM or similar BPM tools for process automation and business workflow management
• Design and implement RESTful APIs for efficient backend communication
• Integrate backend systems with frontend React platforms, ensuring seamless communication through REST APIs and WebSockets
• Architect and build microservices-based solutions in cloud environments with containerization (Docker, Kubernetes)
• Perform database design and optimize queries for both SQL and NoSQL databases
• Collaborate in Agile environments, utilizing Git for version control and CI/CD pipelines for continuous integration and delivery
• Troubleshoot and debug complex issues with a focus on performance, scalability, and maintainability
• Mentor junior developers and provide leadership in project development and best practices

Python Lead Developer

Chennai, Tamil Nadu, India

10 - 15 years

INR 10.0 - 15.0 Lacs P.A.

On-site

Full Time

Key Responsibilities
• Design, develop, test, and deploy robust backend solutions using Python
• Build RESTful APIs for both web and mobile applications
• Optimize applications for speed, scalability, and reliability
• Collaborate closely with cross-functional teams to define, design, and ship new features
• Write clean, modular, well-documented, and testable code
• Integrate third-party APIs and services
• Participate in code reviews and mentor junior developers
• Work with CI/CD pipelines, Docker containers, and cloud platforms such as AWS, GCP, or Azure

Required Skills
• 5 to 6 years of experience in Python development
• Proficiency in one or more Python frameworks such as Django, Flask, or FastAPI
• Solid understanding of object-oriented programming, data structures, and algorithms
• Experience working with both SQL and NoSQL databases, including PostgreSQL, MySQL, and MongoDB
• Strong grasp of RESTful API design principles
• Hands-on experience with Git, Docker, and Linux-based development environments
• Familiarity with testing frameworks such as PyTest or unittest

Python

Chennai, Tamil Nadu, India

2 - 5 years

INR 2.0 - 5.0 Lacs P.A.

On-site

Full Time

We are looking for a technically expert candidate who can complete tasks on time and with quality. The candidate should have real-time experience working directly with clients and meeting their expectations.

Primary Skills: Python, Django, Flask, React, Angular, TypeScript

Note: Working from the office a minimum of 3 days a week is required.

Java - Architect

Chennai, Tamil Nadu, India

10 - 14 years

INR 10.0 - 14.0 Lacs P.A.

On-site

Full Time

Looking for a hands-on architect who can design end-to-end solutions for the Deposit Booking / Deposit Preter / Rollover application. Hands-on experience in designing databases for complex transactions with large data volumes is required. Must be able to provide end-to-end design for global applications, building (a) micro UIs using Angular, (b) global Java microservices, and (c) database designs mapped to entity classes. We are looking only for architects who can code and build skeletons, frameworks, and critical APIs on a day-to-day basis; coding frameworks and hand-holding the team in developing code is mandatory. Should have a good understanding of DevOps and ECS infrastructure. Kubernetes / Apigee / Docker / Jenkins experience is mandatory.

AWS Data Engineer

Chennai, Tamil Nadu, India

6 - 8 years

INR 6.0 - 8.0 Lacs P.A.

On-site

Full Time

Sr. AWS Data Engineer (P3 C3 TSTS)

Workmode: Hybrid
Work location: PAN India
Work timing: 2 PM to 11 PM

Primary Skills
• Experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark.
• Extensive AWS experience is mandatory, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres).
• 4+ years of experience working with both relational and non-relational/NoSQL databases is required.
• Strong SQL experience is necessary, demonstrating the ability to write complex queries from scratch; Redshift experience is required along with other SQL database experience.
• Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture.
• Understanding of building an end-to-end data pipeline.

Secondary Skills
• Strong understanding of Kinesis, Kafka, and CDK.
• Experience with Kafka and ECS.
• Strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling.
• Experience in Node.js and CDK.

Responsibilities
• Lead the architectural design and development of a scalable, reliable, and flexible metadata-driven data ingestion and extraction framework on AWS using Python/PySpark.
• Design and implement a customizable data processing framework using Python/PySpark, capable of handling diverse scenarios and evolving data processing requirements.
• Implement data pipelines for data ingestion, transformation, and extraction leveraging AWS cloud services.
• Seamlessly integrate a variety of AWS services, including S3, Glue, Kafka, Lambda, SQL, SNS, Athena, EC2, RDS (Oracle, Postgres, MySQL), and AWS Crawler, to construct a highly scalable and reliable data ingestion and extraction pipeline.
• Facilitate configuration and extensibility of the framework to adapt to evolving data needs and processing scenarios.
• Develop and maintain rigorous data quality checks and validation processes to safeguard the integrity of ingested data.
• Implement robust error handling, logging, monitoring, and alerting mechanisms to ensure the reliability of the entire data pipeline.

Qualifications
Must have
• Over 6 years of hands-on experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark.
• Extensive AWS experience, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres).
• 4+ years of experience working with both relational and non-relational/NoSQL databases.
• Strong SQL experience, with the ability to write complex queries from scratch; strong working experience in Redshift along with other SQL databases.
• Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture.
• Complete understanding of building an end-to-end data pipeline.

Nice to have
• Strong understanding of Kinesis, Kafka, and CDK.
• Strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling.
• Experience in Node.js and CDK.
• Experience with Kafka and ECS.

Java AWS Developer

Hyderabad / Secunderabad, Telangana, India

5 - 8 years

INR 3.0 - 11.0 Lacs P.A.

On-site

Full Time

• Design, develop, and maintain Java microservices-based applications
• Create and maintain test plans and user guides for microservice applications
• Lead and mentor development teams, providing technical guidance
• Understand and contribute to the architecture of Java microservices systems
• Collaborate with cross-functional teams to ensure smooth delivery and integration
• Monitor application performance using tools like Splunk and Dynatrace

Snowflake Consultant

Chennai, Tamil Nadu, India

5 - 10 years

INR 5.0 - 10.0 Lacs P.A.

On-site

Full Time

• Design, develop, and maintain data pipelines and ETL processes using AWS and Snowflake.
• Implement data transformation workflows using DBT (Data Build Tool).
• Write efficient, reusable, and reliable code in Python.
• Optimize and tune data solutions for performance and scalability.
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
• Ensure data quality and integrity through rigorous testing and validation.
• Stay updated with the latest industry trends and technologies in data engineering.

• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Data Engineer or in a similar role.
• Strong proficiency in AWS and Snowflake.
• Expertise in DBT and Python programming.
• Experience with data modeling, ETL processes, and data warehousing.
• Familiarity with cloud platforms and services.
• Excellent problem-solving skills and attention to detail.
• Strong communication and teamwork abilities.

Operation Engineer

Pune, Maharashtra, India

5 - 10 years

INR 5.0 - 10.0 Lacs P.A.

On-site

Full Time

• Participate in service/application releases, scheduled and ad-hoc product maintenance releases (OS and middleware), systems updates, and post-release incident management and reporting
• End-to-end service ownership, with technical, functional, and operational knowledge of the service and its components
• Provide support to production services (e.g., incident/problem/batch/change management, system DR, and role swap)
• Provide technical support to the IT service owner, using your functional and operational knowledge of Linux-based services and components
• Monitor the health of production systems and recover them as quickly as possible when incidents occur
• Analyse and troubleshoot all kinds of production system issues
• Communicate with business users and production support managers to report system status

The ideal candidate for this role will be familiar with all, and expert in some, of the following:

Essential
• Experience of software development using COBOL / DB2 / C
• Unix scripting (shell scripting)
• Control-M scheduler
• Experience of working on and supporting Data Warehouse / MI services
• Technical troubleshooting and service problem determination
• Able to provide support for out-of-hours releases

Desirable
• Experience of software development using SAS (midrange)
• Experience of raising and owning IT change records
• UNIX - Linux (Red Hat 8+), including VIOM and VCS
• Agile tooling: Jira, Confluence, Slack
• CI/CD tools such as Jenkins, Git, GitHub, Nexus
• Knowledge of legacy HSBC systems would be beneficial

In addition to the above, the ideal candidate will:
• Be an approachable, collaborative, and supportive team member
• Take ownership of issues and work autonomously to drive them to resolution
• Have the ability and desire to keep up with current trends and learn new technologies
• Have excellent written and spoken communication skills, with the ability to adapt communication to the audience and the message to be conveyed

Virtusa

Information Technology and Services

Southborough

20,000+ Employees

2705 Jobs

Key People
• Kris Canekeratne, Chairman and CEO
• Sanjay Singh, President and COO


Job Titles Overview