4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
NTT DATA is looking for an AWS DevOps Engineer to join its team in Pune, Maharashtra (IN-MH), India. As an AWS DevOps Engineer, you will be responsible for building and maintaining a robust, scalable, real-time data streaming platform leveraging AWS and Confluent Cloud infrastructure. Your key responsibilities will include developing and building the platform, monitoring performance, collaborating with cross-functional teams, managing code using Git, applying Infrastructure as Code principles using Terraform, and implementing CI/CD practices using GitHub Actions.

The ideal candidate must have strong proficiency in AWS services such as IAM roles, role-based access control (RBAC), S3, Lambda functions, VPC, security groups, RDS, and CloudWatch. Hands-on experience with Kubernetes (EKS) and expertise in managing resources and services such as Pods, Deployments, and Helm charts are required. Expertise in Datadog, Docker, Python, Go, Git, Terraform, and CI/CD tools is essential, as is an understanding of security best practices and familiarity with tools like Snyk, SonarCloud, and CodeScene.

Nice-to-have skills include prior experience with streaming platforms such as Apache Kafka, knowledge of unit testing around Kafka topics, and experience with Splunk integration for logging and monitoring. Familiarity with Software Development Life Cycle (SDLC) principles is a plus.

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100, and is committed to helping clients innovate, optimize, and transform for long-term success. NTT DATA offers diverse experts in more than 50 countries and a robust partner ecosystem. Its services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is dedicated to providing digital and AI infrastructure and is part of the NTT Group, which invests significantly in R&D to support organizations and society in moving confidently into the digital future. Visit us at us.nttdata.com.
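The posting itself contains no code; as a minimal illustrative sketch of the kind of streaming work it describes (not an NTT DATA implementation), the snippet below publishes a JSON event to a Confluent Cloud topic from Python using the confluent-kafka client. The bootstrap endpoint, API key/secret, and topic name are placeholders, not details from the posting.

```python
# Minimal Confluent Cloud producer sketch (placeholder endpoint, credentials, topic).
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<CONFLUENT_API_KEY>",     # placeholder credential
    "sasl.password": "<CONFLUENT_API_SECRET>",  # placeholder credential
})

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface an error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

event = {"order_id": 123, "status": "CREATED"}
producer.produce("orders", value=json.dumps(event).encode("utf-8"),
                 callback=delivery_report)
producer.flush()
```

In a setup like the one described, Terraform would provision the cluster and topics and GitHub Actions would deploy the producing service, but those pieces are outside this sketch.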
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Cloud Engineer at PAC Panasonic Avionics Corporation in Pune, India, you will have the exciting opportunity to modernize our legacy SOAP-based Airline Gateway (AGW) by building a cloud-native, scalable, and traceable architecture using AWS, Python, and DevOps practices. Your role will involve migrating from legacy SOAP APIs to modern REST APIs and implementing CI/CD pipelines, containerization, and automation processes to enhance system performance, reliability, and maintainability. You will play a crucial part in backend development, networking, and cloud-based solutions, contributing to scalable and efficient applications.

Your responsibilities will include designing, building, and deploying cloud-native solutions on AWS; developing and maintaining backend services and web applications using Python; and implementing CI/CD pipelines, automation, and containerization with tools like Docker, Kubernetes, and Terraform. You will use Python for backend development, ensuring scalability, security, and high availability of cloud systems while adhering to AWS best practices. Monitoring and logging solutions for real-time observability, system traceability, and performance tracking will also be part of your role. You will work closely with cross-functional teams to integrate cloud-based solutions, ensure alignment with cloud security and compliance standards, and actively contribute to performance improvements and infrastructure optimization.

Your skills and qualifications should include experience with AWS cloud services, strong backend development experience with Python, proficiency in building and maintaining web applications and backend services, and a solid understanding of Python web frameworks like Flask, Django, or FastAPI. Experience with database integration, DevOps tools, RESTful API design, cloud security, monitoring tools, and cloud infrastructure management is also essential.

If you have a passion for cloud engineering, a strong background in Python development, and the ability to deliver scalable solutions to business challenges, this role is perfect for you. Join our team and be part of the exciting transformation towards cloud-native solutions in the airline industry.

Experience Range: 3 to 5 years

Preferred Skills:
- Experience with airline industry systems or understanding of airline-specific technologies
- AWS certifications, especially AWS Certified Solutions Architect, DevOps Engineer, or Developer
- Familiarity with serverless architectures (AWS Lambda, API Gateway) and microservices
- Strong problem-solving skills and ability to analyze complex technical issues for tailored solutions
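Purely as a hedged illustration of the SOAP-to-REST migration the posting describes (the endpoint, model, and field names are invented, not taken from the AGW), a legacy SOAP operation might be re-exposed as a FastAPI endpoint along these lines:

```python
# Hypothetical REST replacement for a legacy SOAP operation (invented names).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Airline Gateway (REST)")

class FlightStatus(BaseModel):
    flight_number: str
    status: str

# Stand-in for whatever backend the legacy SOAP service would have called.
_FLIGHTS = {"PA100": FlightStatus(flight_number="PA100", status="ON_TIME")}

@app.get("/v1/flights/{flight_number}/status", response_model=FlightStatus)
def get_flight_status(flight_number: str) -> FlightStatus:
    flight = _FLIGHTS.get(flight_number)
    if flight is None:
        raise HTTPException(status_code=404, detail="Unknown flight")
    return flight
```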
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
This role requires you to be adept at troubleshooting, debugging, and working within a cloud environment. You should be familiar with Agile and other development methodologies. Your responsibilities will include creating Lambda functions with all the necessary security measures in place using AWS Lambda. You must demonstrate proficiency in Java and Node.js by developing services and conducting unit and integration testing. A strong understanding of security best practices, such as using IAM roles, KMS, and pseudonymization, is essential. You should be able to define services on SwaggerHub and implement serverless approaches using AWS Lambda, including the AWS Serverless Application Model (SAM).

Hands-on experience with RDS, Kafka, ELB, Secrets Manager, S3, API Gateway, CloudWatch, and EventBridge is required. You should also be able to write unit test cases using the Mocha framework and have experience with encryption and decryption of PII data and files in transit and at rest. Familiarity with the Cloud Development Kit (CDK) and creating SQS/SNS, DynamoDB, and API Gateway resources using CDK is preferred. You will work on a serverless stack involving Lambda, API Gateway, and Step Functions, coding in Java or Node.js. Advanced networking concepts such as Transit Gateway, VPC endpoints, and multi-account connectivity are also part of the role. Strong troubleshooting and debugging skills are essential, along with excellent problem-solving abilities and attention to detail. Effective communication skills and the ability to work in a team-oriented, collaborative environment are crucial for success in this role.

Virtusa is a company that values teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team that focuses on your growth and provides exciting projects, opportunities, and exposure to state-of-the-art technologies throughout your career. Collaboration and fostering excellence are at the core of Virtusa's values, offering a dynamic environment for great minds to thrive and innovate.
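As an illustrative sketch only: the posting's stack is Java and Node.js, but to keep these examples in one language, the snippet below shows KMS-based encryption and decryption of a small PII field with Python and boto3. The key ARN and region are placeholders, not details from the posting.

```python
# Sketch: encrypt/decrypt a small PII field directly with KMS via boto3.
# Key ARN and region are placeholders; the posting's stack is Java/Node.js,
# Python is used here only to keep the illustrations in one language.
import boto3

kms = boto3.client("kms", region_name="ap-south-1")  # region is an assumption
KEY_ID = "arn:aws:kms:ap-south-1:111122223333:key/EXAMPLE"  # placeholder

def encrypt_pii(plaintext: str) -> bytes:
    resp = kms.encrypt(KeyId=KEY_ID, Plaintext=plaintext.encode("utf-8"))
    return resp["CiphertextBlob"]

def decrypt_pii(ciphertext: bytes) -> str:
    resp = kms.decrypt(CiphertextBlob=ciphertext)
    return resp["Plaintext"].decode("utf-8")

token = encrypt_pii("jane.doe@example.com")
print(decrypt_pii(token))
```

Note that direct KMS encryption is limited to 4 KB of plaintext, so larger payloads would typically use envelope encryption with a generated data key instead.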
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
You will be a Cloud Engineer at PAC Panasonic Avionics Corporation based in Pune, India. Your primary responsibility will be to modernize the legacy SOAP-based Airline Gateway (AGW) by building a cloud-native, scalable, and traceable architecture using AWS, Python, and DevOps practices. This will involve migrating from legacy SOAP APIs to modern REST APIs and implementing CI/CD pipelines, containerization, and automation processes to enhance system performance and reliability. Your role will also include backend development, networking, and cloud-based solutions to contribute to scalable and efficient applications.

As a Cloud Engineer, your key responsibilities will include designing, building, and deploying cloud-native solutions on AWS, with a focus on migrating from SOAP-based APIs to RESTful APIs. You will develop and maintain backend services and web applications using Python for integration with cloud services and systems. Implementing CI/CD pipelines, automation, and containerization using tools like Docker, Kubernetes, and Terraform will be crucial aspects of your role. You will also use Python for backend development, including writing API services, handling business logic, and managing integrations with databases and AWS services. Ensuring scalability, security, and high availability of cloud systems will be essential, along with implementing monitoring and logging solutions for real-time observability. Collaboration with cross-functional teams to integrate cloud-based solutions and deliver high-quality, reliable systems will also be part of your duties.

To excel in this role, you should have experience with AWS cloud services and cloud architecture, including EC2, S3, Lambda, API Gateway, RDS, VPC, IAM, and CloudWatch, among others. Strong backend development experience with Python, proficiency in building and maintaining web applications and backend services, and a solid understanding of Python web frameworks like Flask, Django, or FastAPI are required. Experience with database integration, DevOps tools, RESTful API design, and cloud security best practices is essential. Additionally, familiarity with monitoring tools and the ability to manage cloud infrastructure and deliver scalable solutions are crucial skills for this position.

The ideal candidate would have 3 to 5 years of experience and possess additional skills such as experience with airline industry systems, AWS certifications, familiarity with serverless architectures and microservices, and strong problem-solving abilities. If you are passionate about cloud engineering, have a strong background in Python development, and are eager to contribute to the modernization of legacy systems using cutting-edge technologies, we welcome your application for this exciting opportunity at PAC Panasonic Avionics Corporation.
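As with the similar posting above, here is a small hedged sketch of the observability work described: emitting a custom CloudWatch metric from a backend service with boto3. The namespace, metric name, and region are assumptions, not details from the posting.

```python
# Sketch: publish a custom CloudWatch metric for real-time observability.
# Namespace, metric name, and region are invented for illustration.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-west-2")  # region assumed

def record_gateway_latency(milliseconds: float) -> None:
    cloudwatch.put_metric_data(
        Namespace="AGW/Backend",            # hypothetical namespace
        MetricData=[{
            "MetricName": "RequestLatency",  # hypothetical metric
            "Value": milliseconds,
            "Unit": "Milliseconds",
        }],
    )

record_gateway_latency(142.0)
```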
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a member of the IT Security team at Fresenius Digital Technology, you will play a crucial role in the implementation, management, and operation of various security capabilities across different business segments within the Fresenius Group. Your responsibilities will include the deployment and maintenance of Identity Governance and Administration (IGA) solutions to ensure alignment with security, business, and compliance objectives. You will be involved in technical integrations of business applications such as Active Directory, SAP, and cloud platforms, in collaboration with application owners. Implementing best practices for IGA processes, including identity lifecycle management, access reviews, role modeling, and access policies, will also be part of your role.

Your expertise will be essential in troubleshooting and resolving IGA-related incidents and service requests, as well as monitoring and reporting on access risks and policy violations. Collaboration with cross-functional teams comprising business, security, infrastructure, HR, and application teams, both internal and external, will be integral to developing identity security workflows and integrations. Additionally, staying updated with industry trends, emerging technologies, and best practices in identity governance will be crucial to your success in this role.

To excel in this position, you are required to have a minimum of 3 years of experience in Identity Governance and Administration or IAM roles. Hands-on experience with the SailPoint ISC IGA platform is essential, along with a solid understanding of identity governance principles and familiarity with security protocols and authentication standards. You should also have experience integrating IGA tools with cloud, on-premises, and SaaS applications, coupled with strong collaboration, communication, and documentation skills. Preferred qualifications include SailPoint ISC or IdentityNow Engineer/Architect certification, experience with cloud environments such as AWS and Azure, and prior exposure to regulated industries and modern identity ecosystems.

If you are seeking a challenging yet rewarding working environment where your expertise will be valued, Fresenius Digital Technology in Bangalore, India, may be the ideal workplace for you. To apply for this opportunity, please reach out to Amit Kumar at Amit.Singh1@fresenius.com.

*Please note that by applying for this position, you agree that the country-specific labor laws of the respective legal entity will be applicable to the application process.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills.

To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding of and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various other AWS services will be essential. Additionally, your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success in this role. Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of different file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential for interacting with cross-functional teams and documenting best practices.

Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. As a Data Platform developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives.

Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
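The posting lists Apache Spark and AWS data-pipeline skills without code; the sketch below is a minimal, hedged example of the kind of PySpark batch step implied, reading JSON from S3 and writing partitioned Parquet. Bucket paths and column names are placeholders, not details from the posting.

```python
# Minimal PySpark batch-pipeline sketch: read JSON from S3, filter, write Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/trades/")      # placeholder path
clean = (raw
         .filter(F.col("status") == "SETTLED")                # hypothetical column
         .withColumn("ingest_date", F.current_date()))

(clean.write.mode("overwrite")
      .partitionBy("ingest_date")
      .parquet("s3://example-curated-bucket/trades/"))        # placeholder path

spark.stop()
```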
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining the systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibility includes ensuring the accuracy, accessibility, and security of all data.

To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, IAM roles, and more. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential. You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem solving. Continuous learning and staying updated with new technologies are key attributes for success in this role. Design experience across diverse projects where you have led the technical development is advantageous, especially in the Big Data/Data Warehouse domain within financial services. Additional skills in enterprise-level software solution development, knowledge of different file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis are highly valued. Effective communication, collaboration with cross-functional teams, documentation skills, and experience in mentoring team members are also important aspects of this role.

Your accountabilities will include the construction and maintenance of data architecture pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also be expected to contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors if in a leadership role.

Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.
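For the streaming services mentioned above (Kafka, MSK, Kinesis), here is a minimal, hedged Spark Structured Streaming sketch that reads from a Kafka topic and lands the data in S3. The broker address, topic, and paths are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
# Sketch: Spark Structured Streaming read from Kafka/MSK, write to an S3 sink.
# Requires the spark-sql-kafka connector package; all names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-stream").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
          .option("subscribe", "payments")                      # hypothetical topic
          .option("startingOffsets", "latest")
          .load()
          .select(F.col("value").cast("string").alias("payload")))

query = (events.writeStream.format("parquet")
         .option("path", "s3://example-lake/payments/")              # placeholder
         .option("checkpointLocation", "s3://example-lake/_chk/payments/")
         .start())

query.awaitTermination()
```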
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for working with the AWS CDK (using TypeScript) and CloudFormation templates to manage AWS services such as Redshift, Glue, IAM roles, KMS keys, Secrets Manager, Airflow, SFTP, AWS Lambda, S3, and EventBridge. Your tasks will include the following (a CDK sketch follows this posting):

- Executing grants, stored procedures, and queries, and using Redshift Spectrum to query S3
- Defining execution roles and debugging jobs
- Creating IAM roles with fine-grained access, and integrating and deploying services
- Managing KMS keys and configuring Secrets Manager
- Creating Airflow DAGs
- Executing and debugging serverless AWS Lambda functions
- Managing S3 object storage, including lifecycle configuration, resource-based policies, and encryption
- Setting up event triggers using Lambda and EventBridge rules

You should have knowledge of SQL Workbench for Redshift for executing grants and a strong understanding of networking concepts, security, and cloud architecture. Experience with monitoring tools like CloudWatch and familiarity with containerization tools like Docker and Kubernetes would be beneficial. Strong problem-solving skills and the ability to thrive in a fast-paced environment are essential.

Virtusa is a company that values teamwork, quality of life, and professional and personal development. With a global team of 27,000 professionals, Virtusa is committed to supporting your growth by providing exciting projects, opportunities to work with cutting-edge technologies, and a collaborative team environment that encourages the exchange of ideas and excellence. At Virtusa, you will have the chance to work with great minds and unleash your full potential in a dynamic and innovative workplace.
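The sketch referenced above is shown next. The posting calls for CDK in TypeScript; this example uses CDK v2's Python bindings solely to keep one language across these illustrations, and all construct names are invented.

```python
# CDK v2 sketch (Python bindings) wiring an S3 bucket, a Lambda function,
# and a scheduled EventBridge rule. All names are illustrative placeholders.
from aws_cdk import (
    App, Stack, Duration,
    aws_s3 as s3,
    aws_lambda as _lambda,
    aws_events as events,
    aws_events_targets as targets,
)
from constructs import Construct

class ExampleStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        bucket = s3.Bucket(self, "DataBucket", versioned=True)

        fn = _lambda.Function(
            self, "LoaderFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),  # local folder with index.py
        )
        bucket.grant_read(fn)  # CDK generates the fine-grained IAM policy

        rule = events.Rule(self, "HourlyRule",
                           schedule=events.Schedule.rate(Duration.hours(1)))
        rule.add_target(targets.LambdaFunction(fn))

app = App()
ExampleStack(app, "ExampleStack")
app.synth()
```

Running `cdk synth` or `cdk deploy` against a stack like this emits the corresponding CloudFormation template, including the IAM policy produced by the `grant_read` call.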
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Punjab
On-site
This Senior Software Developer role, based in Perth and specializing in React, AWS, and DevOps, calls for hands-on experience with React/Angular applications. You will be responsible for the setup, maintenance, and enhancement of cloud infrastructure for web applications, leveraging your expertise in AWS Cloud services.

Your responsibilities will include understanding and implementing core AWS services and ensuring the application's security and scalability by adhering to best practices. You will be expected to establish and manage the CI/CD pipeline using the AWS CI/CD stack, while also demonstrating proficiency in BDD/TDD methodologies. In addition, the role requires expertise in serverless approaches using AWS Lambda and the ability to write infrastructure as code using tools like CloudFormation. Knowledge of Docker and Kubernetes will be advantageous, along with a strong understanding of security best practices such as IAM roles and KMS.

Furthermore, experience with monitoring solutions like CloudWatch, Prometheus, and the ELK stack will be beneficial in ensuring the performance and reliability of the applications you work on. A good understanding of DevOps practices is also essential for success in this role. If you have any further inquiries or require clarification on any aspect of the job, please do not hesitate to reach out.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You have hands-on experience in AWS Cloud Java development and are an expert in implementing AWS services such as EC2, VPC, S3, Lambda, Route 53, RDS, DynamoDB, ELB/ALB/NLB, ECS, SNS, SQS, CloudWatch, and API Gateway. You also have knowledge of EFS and S3 for file storage and Cognito for authorization. Additionally, you have strong knowledge of containerization and have worked with AWS ECS/ECR. You are proficient in inter-service communication through REST, gRPC, or messaging (SQS, Kafka). You know how to write unit test cases with JUnit and have strong notions of security best practices. Your expertise extends to the AWS CDK and CDK Pipelines for IaC. You are capable of implementing service discovery, load balancing, and circuit breaker patterns. You have an understanding of logging and monitoring services such as AWS CloudTrail, CloudWatch, GuardDuty, and other AWS security services. Experience with CI/CD tools, DevOps implementation, and HA/DR setup is part of your skill set. You possess excellent communication and collaboration skills.

Your responsibilities include hands-on work with technologies such as Java, Spring Boot, REST APIs, JPA, Kubernetes, messaging systems, and Tomcat/JBoss. You develop and maintain microservices using Java, Spring Boot, and Spring Cloud. You design RESTful APIs with clear contracts and efficient data exchange. Ensuring cloud security and compliance with industry standards is part of your routine. You maintain cloud infrastructure using the AWS Cloud Development Kit (CDK) and implement security best practices, including data encryption and adherence to security protocols.

Qualifications required for this role include 6+ years of hands-on experience in AWS Cloud Java development, expertise in implementing AWS services, strong knowledge of containerization, and inter-service communication skills. You must have a solid understanding of security best practices, knowledge of file storage and authorization, and experience writing unit test cases. Familiarity with serverless approaches using AWS Lambda is also essential.

Nice-to-have qualifications include proven expertise in AWS CDK and CDK Pipelines, experience implementing service discovery, load balancing, and circuit breaker patterns, familiarity with logging and monitoring services, and experience with CI/CD tools, DevOps implementation, and HA/DR setup. Excellent communication and collaboration skills are a bonus for working effectively in a team-oriented environment.

About Virtusa: Virtusa embodies values such as teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team of 27,000 people who care about your growth. You will have the opportunity to work on exciting projects, utilize state-of-the-art technologies, and advance your career with us. At Virtusa, great minds come together to nurture new ideas and foster excellence in a collaborative team environment.
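The posting is Java/Spring Boot-centric; purely to illustrate the SQS messaging pattern it mentions in one consistent language, the Python/boto3 sketch below shows a producer publishing an event and a consumer long-polling, processing, and deleting it. The queue URL, region, and payload fields are placeholders.

```python
# Sketch of queue-based inter-service messaging with SQS via boto3.
# Queue URL and region are placeholders; the posting's services are Java-based.
import json
import boto3

sqs = boto3.client("sqs", region_name="ap-south-1")  # region assumed
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/111122223333/orders"  # placeholder

# Producer side: publish an event for a downstream service.
sqs.send_message(QueueUrl=QUEUE_URL,
                 MessageBody=json.dumps({"order_id": 42, "event": "ORDER_PLACED"}))

# Consumer side: long-poll, process, then delete the message.
resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1,
                           WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    print("received:", json.loads(msg["Body"]))
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```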
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Punjab
On-site
The Senior Software Developer role in Perth requires a candidate with good hands-on experience in developing React/Angular-based applications. The ideal candidate should possess a strong understanding of AWS Cloud services and be capable of setting up, maintaining, and enhancing the cloud infrastructure for web applications. It is essential for the candidate to have expertise in core AWS services, along with the ability to implement security and scalability best practices.

Furthermore, the candidate will be responsible for establishing the CI/CD pipeline using the AWS CI/CD stack and should have practical experience in BDD/TDD methodologies. Familiarity with serverless approaches utilizing AWS Lambda, as well as proficiency in writing infrastructure as code using tools like CloudFormation, is required. Additionally, experience with Docker and Kubernetes would be advantageous for this role. A solid understanding of security best practices, including the use of IAM roles and KMS, is essential. The candidate should also have exposure to monitoring solutions such as CloudWatch, Prometheus, and the ELK stack. Moreover, good knowledge of DevOps practices is needed to contribute effectively to the development and deployment processes.

If you have any queries regarding this role, please feel free to reach out.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Hyderabad
Work from Office
The Impact you will have in this role:
The Enterprise Application Support role specializes in maintaining and providing technical support for all applications that are beyond the development stage and are running in the daily operations of the firm. This role works closely with development teams, infrastructure partners, and internal clients to advance and resolve technical support incidents. Three days onsite are mandatory, with two optional remote days (onsite Tuesdays, Wednesdays, and a third day of your choosing). You may be required to work Tuesday through Saturday or Sunday through Thursday on a rotational or permanent basis.

Your Primary Responsibilities:
- Experience with using ITIL Change, Incident, and Problem management processes.
- Assist on Major Incident calls, engaging the proper parties and helping to determine root cause.
- Troubleshoot and debug system components to resolve technical issues in complex and highly regulated environments comprised of ground and cloud applications and services.
- Analyze proposed application designs and provide feedback on potential gaps or recommendations for optimization.
- Hands-on experience with monitoring and alerting processes in distributed, cloud, and mainframe environments.
- Knowledge and understanding of cybersecurity best practices and general security concepts such as password rotation, access restriction, and malware detection.
- Take part in Monthly Service Reviews (MSR) with development partners to go over KPI metrics.
- Participate in Disaster Recovery / Loss of Region events (planned and unplanned), executing tasks and collecting evidence.
- Collaborate both within the team and across teams to resolve application issues and escalate as needed.
- Support audit requests in a timely fashion, providing needed documentation and evidence.
- Plan and execute certificate creation/renewals as needed.
- Monitor dashboards to better catch potential issues and aid in observability (a small monitoring sketch follows this posting).
- Help gather and analyze project requirements and translate them into technical specifications.
- Basic understanding of all lifecycle components (code, test, deploy).
- Good verbal and written communication and interpersonal skills, communicating openly with team members and others.
- Contribute to a culture where honesty and transparency are expected.
- On-call support with a flexible work arrangement.

**NOTE: The Primary Responsibilities of this role are not limited to the details above.**

Qualifications:
- Minimum of 3 years of relevant production support experience.
- Bachelor's degree preferred or equivalent experience.

Talents Needed for Success:

Technical Qualifications (Distributed/Cloud):
- Hands-on experience in Unix, Linux, Windows, and SQL/PL-SQL
- Familiarity working with relational databases (DB2, Oracle, Snowflake)
- Monitoring and data tools experience (Splunk, Dynatrace, ThousandEyes, Grafana, Selenium, IBM Zolda)
- Cloud technologies (AWS services such as S3, EC2, Lambda, SQS, and IAM roles; Azure; OpenShift; RDS Aurora; Postgres)
- Scheduling tool experience (CA AutoSys, Control-M)
- Middleware experience (Solace, Tomcat, Liberty Server, WebSphere, WebLogic, JBoss)
- Messaging queue systems (IBM MQ, Oracle AQ, ActiveMQ, RabbitMQ, Kafka)
- Scripting languages (Bash, Python, Ruby, Shell, Perl, JavaScript)
- Hands-on experience with ETL tools (Informatica Datahub/IDQ, Talend)

Technical Qualifications (Mainframe):
- Mainframe troubleshooting and support skills (COBOL, JCL, DB2, DB2 stored procedures, CICS, SPUFI, File-AID)
- Mainframe scheduling (job abends, predecessor/successor)
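The monitoring sketch mentioned in the responsibilities list above is a hedged illustration only (the region is an assumption, not from the posting): it uses boto3 to list CloudWatch alarms currently in the ALARM state, the kind of quick triage script an on-call engineer might run.

```python
# Sketch: list CloudWatch metric alarms currently in ALARM state for quick triage.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # region assumed

paginator = cloudwatch.get_paginator("describe_alarms")
for page in paginator.paginate(StateValue="ALARM"):
    for alarm in page["MetricAlarms"]:
        print(f'{alarm["AlarmName"]}: {alarm["StateReason"]}')
```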
Posted 2 months ago