
864 Lambda Expressions Jobs - Page 14

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the job portal.

5.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office

Tech Stack: Java + Spring Boot; AWS (ECS, Lambda, EKS); Drools (preferred but optional); APIGEE; API observability, traceability, and security.

Skills Required:
- Strong ability to understand and re-engineer existing codebases, with some domain knowledge.
- Ability to analyze and integrate the new system's various interfaces with the existing APIs.
- Hands-on experience with Java, Spring, Spring Boot, AWS, and APIGEE.
- Familiarity with Drools is an added advantage.
- Ability to write and maintain JIRA stories (10-15% of the time) and keep existing technical specifications updated.
- Takes end-to-end ownership of the project: creates designs, guides the team, and works independently on iterative tasks.
- Proactively identifies and highlights risks during daily scrum calls and provides regular updates.

Mandatory Competencies: Java - Core Java; Others - Microservices (Java); Others - Spring Boot; Cloud - AWS Lambda; Cloud - Apigee; Behavioural - Communication and collaboration

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Required Skills & Qualifications:
- 5-7 years of industry experience building and deploying machine learning models.
- Strong proficiency with machine learning algorithms, including XGBoost, linear regression, and classification models.
- Hands-on experience with AWS SageMaker for model development, training, and deployment.
- Solid programming skills in Python and relevant libraries such as scikit-learn, pandas, and NumPy.
- Strong understanding of model evaluation metrics, cross-validation, hyperparameter tuning, and performance optimization.
- Experience working with structured and unstructured datasets.
- Knowledge of best practices for model deployment and monitoring in a production environment (MLOps desirable).
- Familiarity with tools like Docker, Git, CI/CD pipelines, and AWS ML services.
- Excellent problem-solving skills, critical thinking, and attention to detail.
- Strong communication and collaboration skills.

Nice to Have:
- Experience with additional AWS services such as Lambda, S3, Step Functions, and CloudWatch.
- Exposure to deep learning frameworks like TensorFlow or PyTorch.
- Familiarity with DataOps practices and agile methodologies.

Mandatory Competencies: Data Science - Machine Learning; Python - NumPy; Data Science - Python; Python - Pandas; Data Science - AWS SageMaker
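As a concrete illustration of the cross-validation and evaluation-metric skills listed above, here is a stdlib-only sketch of k-fold cross-validation computing mean accuracy. In practice a role like this would use scikit-learn's `KFold`/`cross_val_score`; the fold-splitting here is simplified (no shuffling) and the toy majority-class model is purely illustrative.

```python
import statistics

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds (simplified: real
    pipelines would shuffle first, as sklearn's KFold can)."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_val_accuracy(X, y, train_fn, predict_fn, k=5):
    """Mean accuracy over k folds: train on k-1 folds, score the held-out one."""
    folds = k_fold_indices(len(X), k)
    scores = []
    for i, test_idx in enumerate(folds):
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        model = train_fn([X[j] for j in train_idx], [y[j] for j in train_idx])
        preds = [predict_fn(model, X[j]) for j in test_idx]
        correct = sum(p == y[j] for p, j in zip(preds, test_idx))
        scores.append(correct / len(test_idx))
    return statistics.mean(scores)

# Toy classifier: always predict the majority class seen in training.
X = [[v] for v in range(10)]
y = [0, 0, 0, 1, 0, 0, 1, 0, 0, 0]
majority_train = lambda Xs, ys: max(set(ys), key=ys.count)
majority_predict = lambda model, x: model
print(cross_val_accuracy(X, y, majority_train, majority_predict, k=5))  # 0.8
```

The same train/score loop generalizes to any model with fit/predict callables, which is exactly the interface scikit-learn estimators expose.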

Posted 1 month ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you?

We are looking for a Software Engineer - Python + AWS.

You'll make a difference by:
- Establishing, maintaining, and evolving continuous integration and deployment (CI/CD) pipelines for existing and new services.
- Collaborating with Engineering and Operations teams to improve automation of workflows, monitoring, infrastructure, code testing, scripting, and deployment, while lowering costs and reducing non-conformance costs.
- Troubleshooting systems and resolving problems across various applications.
- Participating in the on-call rotation.
- Conducting root cause analysis of incidents.
- Implementing enhancements to the monitoring solution to minimize false positives and identify service health regressions.
- Communicating findings verbally and in writing to the application team.
- Generating weekly data reports summarizing the health of the application.

You'll win us over by:
- Holding a BE / B.Tech / MCA / ME / M.Tech qualification with 3-5 years of proven experience.
- Must-have experience in Windows & Linux and in networking & security topics (e.g., IAM, authorization).
- Awareness of DevOps principles, design patterns, enterprise architecture patterns, and microservice architecture, with the ability to learn and use a wide variety of open-source technologies and tools.
- Expertise in, and enthusiasm for, working on large projects in an agile way (SAFe framework).
- Experience with AWS services: serverless services (Lambda, DynamoDB, API Gateway), container services (e.g., ECS, ECR), monitoring services (e.g., CloudWatch, X-Ray), orchestration tools (Kubernetes, Docker), security services (IAM, Secrets Manager), network services (VPC), EC2, Backup, S3, CDK, CloudFormation, and Step Functions.
- Experience with scripting languages: Python, Bash.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at
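For the serverless side of a Python + AWS role like this, a Lambda handler is just a plain function, which is what makes it locally unit-testable before deployment. The sketch below assumes an API Gateway proxy-style event (the `queryStringParameters` shape); the handler name and event contents are illustrative, not tied to this posting.

```python
import json

def lambda_handler(event, context):
    """Minimal API Gateway-style AWS Lambda handler: returns a JSON greeting.
    The event shape mirrors the API Gateway proxy integration."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }

# Because it is an ordinary function, it can be exercised without AWS:
resp = lambda_handler({"queryStringParameters": {"name": "dev"}}, None)
print(resp["statusCode"], resp["body"])  # 200 {"message": "hello dev"}
```

Deployment details (IAM role, runtime, API Gateway routing) would be handled separately, e.g. via CDK or CloudFormation as the posting mentions.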

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office

Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you?

We are looking for a Golang + Angular Developer.

You'll make a difference by:
- Designing, developing, and maintaining robust backend services using Go, including RESTful APIs and microservices.
- Building and maintaining smaller frontend applications in Angular, supporting full-stack feature delivery.
- Operating, monitoring, and troubleshooting existing applications to ensure performance, scalability, and reliability.
- Contributing to the development of complex, composite applications in a distributed system.
- Leading and maintaining CI/CD pipelines, ensuring high code quality through Test-Driven Development (TDD).
- Utilizing container technologies like Docker and orchestration tools like Kubernetes (GitOps experience is a plus).
- Driving innovation by contributing new ideas and PoCs, or participating in internal hackathons.

You'll win us over by:
- Holding a BE / B.Tech / MCA / M.Tech / M.Sc degree with a good academic record.
- 5+ years of experience in software development with a strong focus on Go (Golang).
- Working experience building and maintaining production-grade microservices and APIs.
- Strong grasp of cloud platforms (AWS), including services like Lambda, ECS, and S3.
- Hands-on experience with CI/CD, Git, and containerization (Docker).
- Working knowledge of Angular (intermediate or above) and full-stack technologies.
- Familiarity with distributed systems, message queues, and API design best practices.
- Experience with observability tools for logging, monitoring, and tracing.
- Passion for innovation and building quick PoCs in a startup-like environment.

Personal Attributes:
- Excellent problem-solving and communication skills; able to articulate technical ideas clearly to stakeholders.
- Adaptable to fast-paced environments, with a solution-oriented, startup mindset.
- Proactive and self-driven, with a strong sense of ownership and accountability.
- Actively seeks clarification and asks questions rather than waiting for instructions.

Create a better #TomorrowWithUs! This role, based in Pune, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make optimal use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective processes.

We are looking for a Sr. AWS Cloud Architect.

- Architect and Design: Develop scalable and efficient data solutions using AWS services such as AWS Glue, Amazon Redshift, S3, Kinesis (Apache Kafka), DynamoDB, Lambda, AWS Glue (Streaming ETL), and EMR.
- Integration: Integrate real-time data from various Siemens organizations into our data lake, ensuring seamless data flow and processing.
- Data Lake Management: Design and manage a large-scale data lake using AWS services like S3, Glue, and Lake Formation.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Snowflake Integration: Implement and manage data pipelines to load data into Snowflake, utilizing Iceberg tables for optimal performance and flexibility.
- Performance Optimization: Optimize data processing pipelines for performance, scalability, and cost-efficiency.
- Security and Compliance: Ensure that all solutions adhere to security best practices and compliance requirements.
- Collaboration: Work closely with cross-functional teams, including data engineers, data scientists, and application developers, to deliver end-to-end solutions.
- Monitoring and Troubleshooting: Implement monitoring solutions to ensure the reliability and performance of data pipelines; troubleshoot and resolve any issues that arise.

You'd describe yourself as having:
- Experience: 8+ years of experience in data engineering or cloud solutioning, with a focus on AWS services.
- Technical Skills: Proficiency in AWS services such as AWS API, AWS Glue, Amazon Redshift, S3, Apache Kafka, and Lake Formation; experience with real-time data processing and streaming architectures.
- Big Data Querying Tools: Strong knowledge of big data querying tools (e.g., Hive, PySpark).
- Programming: Strong programming skills in languages such as Python, Java, or Scala for building and maintaining scalable systems.
- Problem-Solving: Excellent problem-solving skills and the ability to troubleshoot complex issues.
- Communication: Strong communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
- Certifications: AWS certifications are a plus.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. Find out more about Siemens careers at

Posted 1 month ago

Apply

4.0 - 6.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Hello talented techie! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you?

We are looking for a Senior DevOps Engineer.

You'll make a difference by:
- Being an SRE L1 Commander, responsible for ensuring the stability, availability, and performance of critical systems and services. As the first line of defense in incident management and monitoring, the role requires real-time response, proactive problem solving, and strong coordination skills to address production issues efficiently.
- Monitoring and Alerting: Proactively supervising system health, performance, and uptime using monitoring tools like Datadog and Prometheus.
- Serving as the primary responder for incidents, tackling and resolving issues quickly to ensure minimal impact on end users.
- Accurately categorizing incidents, prioritizing them based on severity, and escalating to L2/L3 teams when needed.
- Ensuring systems meet Service Level Objectives (SLOs) and maintain uptime as per SLAs.
- Collaborating with DevOps and L2 teams to automate manual processes for incident response and operational tasks.
- Performing root cause analysis (RCA) of incidents using log aggregators and observability tools to identify patterns and recurring issues.
- Following predefined runbooks/playbooks to resolve known issues and documenting fixes for new problems.

You'd describe yourself as:
- An experienced professional with 4 to 6 years of proven experience in SRE, DevOps, or Production Support with monitoring tools (e.g., Prometheus, Datadog).
- Having a solid understanding of Linux/Unix operating systems, basic scripting skills (Python, GitLab actions), and cloud platforms (AWS, Azure, or GCP).
- Familiar with container orchestration (Kubernetes, Docker, Helm charts) and CI/CD pipelines.
- Having exposure to ArgoCD for implementing GitOps workflows and automated deployments for containerized applications.
- Experienced with Monitoring: Datadog; Infrastructure: AWS EC2, Lambda, ECS/EKS, RDS; Networking: VPC, Route 53, ELB; and Storage: S3, EFS, Glacier.
- Having strong analytical skills to resolve production incidents efficiently.
- Having a basic understanding of networking concepts (DNS, load balancers, firewalls).
- Having good communication and interpersonal skills for incident communication and issue resolution.
- Preferred certifications: AWS Certified SysOps Administrator - Associate, AWS Certified Solutions Architect - Associate, or AWS Certified DevOps Engineer - Professional.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at
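The SLO/SLA responsibility above hinges on error-budget arithmetic: a 99.9% availability SLO over a window "budgets" 0.1% of requests as allowed failures. A minimal sketch of that calculation (the function name and its exact policy are illustrative, not from this posting):

```python
def error_budget_remaining(slo_target, total_requests, failed_requests):
    """Fraction of the error budget still unspent for an availability SLO.
    Budget = allowed failures = (1 - slo_target) * total_requests."""
    allowed = (1.0 - slo_target) * total_requests
    if allowed == 0:
        return 0.0  # a 100% SLO leaves no budget at all
    return max(0.0, 1.0 - failed_requests / allowed)

# A 99.9% SLO over 1,000,000 requests allows ~1,000 failures;
# 250 failures spend a quarter of the budget.
print(error_budget_remaining(0.999, 1_000_000, 250))  # ≈ 0.75
```

An L1 responder would typically alert on the budget's burn rate rather than its absolute level, so fast-burning incidents page sooner than slow leaks.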

Posted 1 month ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Mumbai

Work from Office

Skill: Java Microservices, Spring Boot | Experience: 4-6 Yrs | Role: T3

Responsibilities:
- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).

Posted 1 month ago

Apply

2.0 - 5.0 years

6 - 9 Lacs

Pune

Work from Office

Educational Qualification: Bachelor of Engineering, Bachelor of Technology, Bachelor of Computer Applications, Master of Computer Applications, Master of Technology, Master of Engineering, Bachelor of Science, Master of Science

Service Line: Engineering Services

Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data.
- Awareness of the latest technologies and trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.

Technical and Professional Requirements: Primary skills: Amazon Connect, Lex, Lambda, Python. Technology-Communication-IVR, CCT; Technology-Functional Testing-IVR Testing; Technology-Infrastructure-Contact Center-Contact Center model; Technology-Infrastructure-Contact Center-IVR Concepts

Preferred Skills: Technology-Infrastructure-Contact Center-Contact Center model; Technology-Infrastructure-Contact Center-IVR Concepts; Technology-Functional Testing-IVR Testing; Technology-Communication-IVR/CCT
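In an Amazon Connect/Lex stack like the one named above, a Lambda function typically fulfils intents by returning a structured response. The sketch below builds a Lex V2-style "Close" response; the field names follow the Lex V2 format as documented, but the intent name and simplified input event are hypothetical, so verify against the current Lex V2 API reference before relying on them.

```python
def close_response(intent_name, message):
    """Build a Lex V2-style 'Close' fulfillment response: end the
    conversation and mark the intent Fulfilled."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent_name, "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }

def lambda_handler(event, context):
    # Event shape simplified for illustration; real Lex events carry
    # slots, session attributes, and invocation metadata as well.
    intent = event["sessionState"]["intent"]["name"]
    return close_response(intent, f"Handled intent {intent}")

resp = lambda_handler(
    {"sessionState": {"intent": {"name": "CheckBalance"}}}, None)
print(resp["sessionState"]["dialogAction"]["type"])  # Close
```

The same handler can branch on the intent name to route different IVR flows to different fulfilment logic.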

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Gurugram

Work from Office

About the Opportunity
Title: Senior Analyst Programmer - Java/AWS Cloud
Department: FIL India Technology - GPS
Location: Gurgaon, India
Reports To: Project Manager
Level: Level 3

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our Reconciliation & Product Management capability in the GPS business and feel like you're part of something bigger.

About your team: The UK Retail Group is charged with the development and operating model for Fidelity's Wrap platforms for the Personal Investing business and FundsNetwork, including implementation of new tax wrappers, web front end and service layer, interfaces, document composition and production, STP/automation, and migrations from legacy systems. Our objective is to be the supplier of choice for all IP services to our businesses, as well as having the clear mandate to identify the future opportunities that will maintain and extend Fidelity's lead in online financial services. The department is currently expanding in order to satisfy the high level of demand for both short-term tactical project delivery and the execution of a programme of strategic development which will see the complete reshaping of our platform capability and customer experience, through the replacement and enhancement of our existing platform capability which supports our growing direct and intermediated businesses in the region.

About your role: This role is for an experienced Java and AWS Cloud developer (minimum of 4 years' experience) to work within Retail Technology. The successful candidate will be expected to work on Java utilities of the Feed Processing Layer (which source the funds data from PODS/PHUB). This places the candidate in a front-line position with considerable business user interaction. As such, the role requires a high level of flexibility and the ability to work under pressure. Excellent written and verbal communication skills are essential to communicate effectively and appropriately with both business users and systems colleagues at all levels. The role requires close working with colleagues in the UK offices, and as such the candidate may sometimes be required to work UK business hours. On occasion the candidate may also be required to provide on-call support outside normal business hours, and to undertake additional out-of-hours work to cover changes and release implementations. This demanding role would perfectly suit a dynamic individual looking to work in a fast-paced environment to ensure the smooth running of business-critical systems.

About you: You are expected to possess the following skills for this position. The ideal candidate has 4 to 6 years of experience.

Essential Skills:
- Java, React, Spring Boot, JAXB
- AWS services such as CloudFront, S3, Route 53, Cognito, ECS, Lambda, API Gateway, etc.
- Deep knowledge and experience of low/high-level design aspects and writing unit-testable code
- Experience with source control tools such as GitHub and Bitbucket
- Strong interpersonal, communication, and client-facing skills
- Ability to work closely with cross-functional teams

Desirable Skills:
- Understanding of XML
- Understanding of Unix commands
- Business domain: exposure to the finance industry would be an advantage; strong interest and willingness to understand the mutual funds business

Behavioural: Strong interest in technology and its applications. Self-motivation. Team player. Good interpersonal skills.

Posted 1 month ago

Apply

10.0 - 13.0 years

12 - 15 Lacs

Bengaluru

Work from Office

About the Opportunity
Job Type: Application, 31 July 2025
Title: Principal Data Engineer (Associate Director)
Department: ISS
Location: Bangalore
Reports To: Head of Data Platform - ISS
Grade: 7

Department Description: The ISS Data Engineering Chapter is an engineering group comprised of three sub-chapters - Data Engineers, Data Platform, and Data Visualisation - that supports the ISS Department. Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our investment process. These programmes span asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research, and Trading.

Purpose of your role: This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality, and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.

Essential Skills and Experience

Core Technical Skills:
- Expert in leveraging cloud-based data platform capabilities (Snowflake, Databricks) to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services, such as Lambda, EMR, MSK, Glue, and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL. Open to expertise in Java/Scala, but enterprise experience with Python is required.
- Expert in designing, building, and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
- Data security & performance optimization: experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (DynamoDB, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.).

Bonus Technical Skills:
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills:
- Problem-Solving: Leadership experience in problem-solving and technical decision-making.
- Communication: Strong in strategic communication and stakeholder engagement.
- Project Management: Experienced in overseeing project lifecycles, working with project managers to manage resources.

Feel rewarded: For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working, and how you could build your future here, visit careers.fidelityinternational.com.

Posted 1 month ago

Apply

10.0 - 15.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Experience:
- 8+ years of experience in data engineering, specifically in cloud environments like AWS.
- Proficiency in PySpark for distributed data processing and transformation.
- Solid experience with AWS Glue for ETL jobs and managing data workflows.
- Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration.
- Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.

Technical Skills:
- Proficiency in Python and PySpark for data processing and transformation tasks.
- Deep understanding of ETL concepts and best practices.
- Familiarity with AWS Glue (ETL jobs, Data Catalog, and Crawlers).
- Experience building and maintaining data pipelines with AWS Data Pipeline or similar orchestration tools.
- Familiarity with AWS S3 for data storage and management, including file formats (CSV, Parquet, Avro).
- Strong knowledge of SQL for querying and manipulating relational and semi-structured data.
- Experience with data warehousing and big data technologies, specifically within AWS.

Additional Skills:
- Experience with AWS Lambda for serverless data processing and orchestration.
- Understanding of AWS Redshift for data warehousing and analytics.
- Familiarity with data lakes, Amazon EMR, and Kinesis for streaming data processing.
- Knowledge of data governance practices, including data lineage and auditing.
- Familiarity with CI/CD pipelines and Git for version control.
- Experience with Docker and containerization for building and deploying applications.

Responsibilities:
- Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
- ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
- Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
- Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
- Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
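The Extract-Transform-Load pattern this posting keeps returning to reduces to three composable stages. A stdlib-only sketch of the same shape (the sample data, field names, and the dict-based "load" target are illustrative; a Glue/PySpark job would express the transform as DataFrame operations and load to S3 or Redshift instead):

```python
import csv
import io

RAW = """id,amount,currency
1,100.50,USD
2,,USD
3,75.00,EUR
"""

def extract(text):
    """Extract: parse raw CSV into records (stand-in for reading from S3)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and normalise types,
    the same filter/cast step a Glue or PySpark job would apply."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # data-quality rule: skip incomplete records
        out.append({"id": int(r["id"]), "amount": float(r["amount"]),
                    "currency": r["currency"]})
    return out

def load(rows):
    """Load: index by id (stand-in for writing Parquet / COPY to Redshift)."""
    return {r["id"]: r for r in rows}

result = load(transform(extract(RAW)))
print(sorted(result))  # [1, 3]
```

Keeping each stage a pure function is what makes pipelines like this easy to unit-test before wiring them into an orchestrator such as AWS Data Pipeline or Airflow.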

Posted 1 month ago

Apply

12.0 - 17.0 years

16 - 20 Lacs

Hyderabad

Work from Office

- Excellent understanding of AWS service components, with experience across multiple projects.
- Strong Terraform scripting skills.
- Experience creating CI/CD pipelines.
- Good hands-on experience provisioning containers in AWS Container Instances, AKS, etc.
- AWS ECS, Postgres, Lambda, S3, Route 53, SNS, SQS; Python (for Lambda functions).
- Strong Java knowledge required.
- People skills: ability to quickly absorb knowledge as it relates to our application (existing recorded KT sessions will be provided, and the core SME team will be available for any questions or additional guidance).
- Good communication and partnership with others.
- Motivated in what they are working on; if anything is unclear, raises it immediately.
- Ability to problem-solve issues that arise.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai

Work from Office

- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Mumbai

Work from Office

Skill: Java Microservices, Spring Boot | Experience: 4-6 Yrs | Role: T3

Responsibilities:
- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).

Posted 1 month ago

Apply

14.0 - 19.0 years

11 - 16 Lacs

Hyderabad

Work from Office

- 10 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices.
- Architect and manage enterprise-level databases with 24/7 availability.
- Lead efforts on optimization, backup, and disaster recovery planning.
- Ensure compliance; implement monitoring and automation.
- Guide developers on schema design and query optimization.
- Conduct DB health audits and capacity planning.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work on the entire software development lifecycle, from concept and design to testing and deployment.
- Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability.
- Integrate microservices with Kafka for real-time data streaming and event-driven architecture.
- Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
- Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
- Bachelor's or Master's degree in Computer Science or a related field.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Good to have: TM Vault core banking knowledge.
- Strong communication and collaboration skills.

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office

- Overall 8+ years of experience designing, developing, testing, and deploying scalable and resilient microservices using Java and Spring Boot.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work on the entire software development lifecycle, from concept and design to testing and deployment.
- Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability.
- Integrate microservices with Kafka for real-time data streaming and event-driven architecture.
- Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
- Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
- Should be a Java Full Stack Developer.
- Bachelor's or Master's degree in Computer Science or a related field.
- 6+ years of hands-on experience in Java development, with a focus on microservices architecture.
- Proficiency in Spring Boot and other Spring Framework components.
- Extensive experience in designing and developing RESTful APIs.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.

Posted 1 month ago

Apply

9.0 - 14.0 years

11 - 16 Lacs

Hyderabad

Work from Office

8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices.
Responsibilities:
- Architect and manage enterprise-level databases with 24/7 availability.
- Lead efforts on optimization, backup, and disaster recovery planning.
- Ensure compliance; implement monitoring and automation.
- Guide developers on schema design and query optimization.
- Conduct DB health audits and capacity planning.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work on the entire software development lifecycle, from concept and design to testing and deployment.
- Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability.
- Integrate microservices with Kafka for real-time data streaming and event-driven architecture.
- Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
- Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
Skills Required:
- Bachelor's or Master's degree in Computer Science or a related field.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.
- Good to have: TM Vault core banking knowledge.

Posted 1 month ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Hyderabad

Work from Office

8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices.
Responsibilities:
- Architect and manage enterprise-level databases with 24/7 availability.
- Lead efforts on optimization, backup, and disaster recovery planning.
- Design and manage scalable CI/CD pipelines for cloud-native apps.
- Automate infrastructure using Terraform/CloudFormation.
- Implement container orchestration using Kubernetes and ECS.
- Ensure cloud security, compliance, and cost optimization.
- Monitor performance and implement high-availability setups.
- Collaborate with dev, QA, and security teams; drive architecture decisions.
- Mentor team members and contribute to DevOps best practices.
- Integrate microservices with Kafka for real-time data streaming and event-driven architecture.
- Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
- Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
Skills Required:
- Bachelor's or Master's degree in Computer Science or a related field.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.
- Good to have: TM Vault core banking knowledge.

Posted 1 month ago

Apply

8.0 - 13.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Skills Required:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 8+ years of professional experience in Java backend development.
- 2+ years of hands-on experience working with AWS cloud services.
- Expertise in Spring Boot, REST APIs, and microservice design.
- Strong understanding of cloud-native development, containers (Docker), and modern deployment techniques.
- Familiarity with relational and NoSQL databases such as PostgreSQL, MySQL, or DynamoDB.
- Experience with logging, monitoring, and performance tuning in a cloud environment.
- Proficiency in Git and experience with CI/CD tools (e.g., Jenkins, GitHub Actions, AWS CodePipeline).
Responsibilities:
- Develop and maintain robust backend services and APIs using Java (8+) and Spring Boot.
- Build and deploy applications in AWS using services such as Lambda, API Gateway, S3, DynamoDB, RDS, SQS/SNS, and CloudWatch.
- Contribute to the design of microservices and serverless architectures.
- Write efficient, maintainable, and testable code following software engineering best practices.
- Collaborate with product managers, architects, and other developers to deliver high-quality solutions.
- Participate in code reviews, unit testing, integration testing, and deployment processes.
- Help improve development processes and DevOps practices using tools such as Git, Jenkins, Docker, Terraform, or AWS CloudFormation.
- Troubleshoot and resolve application and system issues in a timely manner.
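A Java Lambda sitting behind API Gateway is, at its core, a function from a request map to a response map. The sketch below shows only that shape using the JDK, with a hypothetical GreetHandler name; the real AWS entry point would implement the RequestHandler interface from the aws-lambda-java-core library, which is omitted here to keep the example dependency-free:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Shape of an API Gateway-backed Lambda: request map in, response map out.
// A real deployment implements RequestHandler from aws-lambda-java-core;
// this stand-in keeps the example runnable without AWS dependencies.
class GreetHandler implements Function<Map<String, Object>, Map<String, Object>> {
    @Override
    public Map<String, Object> apply(Map<String, Object> request) {
        Object name = request.getOrDefault("name", "world");
        Map<String, Object> response = new HashMap<>();
        response.put("statusCode", 200);              // API Gateway proxy-style fields
        response.put("body", "Hello, " + name + "!");
        return response;
    }
}
```

Packaging such a handler as a jar and pointing the Lambda configuration at its class is what the "serverless architectures" bullet above refers to.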

Posted 1 month ago

Apply

10.0 - 15.0 years

10 - 15 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work on the entire software development lifecycle, from concept and design to testing and deployment.
- Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability.
- Integrate microservices with Kafka for real-time data streaming and event-driven architecture.
- Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
- Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
Skills Required:
- Must be a Java full stack developer.
- Bachelor's or Master's degree in Computer Science or a related field.
- 8+ years of hands-on Java full stack experience: Java 11+, Spring Boot, Angular/React, REST APIs, Docker, Kubernetes, microservices.
- Proficiency in Spring Boot and other Spring Framework components.
- Extensive experience in designing and developing RESTful APIs.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Good to have: TM Vault core banking knowledge.
- Strong communication and collaboration skills.

Posted 1 month ago

Apply

8.0 - 13.0 years

3 - 7 Lacs

Mumbai

Work from Office

Skill: Java + AWS. Experience: 6-9 Yrs. Role: T2.
Skills Required:
- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Hands-on experience with AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Knowledge of containerization (Docker) and orchestration tools (ECS/Kubernetes) is a plus.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).
Responsibilities:
- Develop and maintain robust backend services and RESTful APIs using Java and Spring Boot.
- Design and implement microservices that are scalable, maintainable, and deployable in AWS.
- Integrate backend systems with AWS services including but not limited to Lambda, S3, DynamoDB, RDS, SNS/SQS, and CloudFormation.
- Collaborate with product managers, architects, and other developers to deliver end-to-end features.
- Participate in code reviews, design discussions, and agile development processes.

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

8 to 12 years of experience in information technology with an emphasis on application development, with demonstrated experience throughout the entire development lifecycle. In-depth knowledge of the services industry and its IT systems. Practical cloud-native experience. Experience in Computer Science, Engineering, Mathematics, or a related field and expertise in technology disciplines. Java full stack development: the ability to create medium-to-large Java web applications from start to finish independently. This includes, but is not limited to, the following: client interaction, validating requirements, system design, frontend/UI development, interaction with a Java EE application server, web services, experience with the various Java EE APIs, development builds, application deployments, integration/enterprise testing, and support of applications within a production environment. Experience with Java/J2EE with a deep understanding of the language and core APIs, web services, multi-threaded or concurrent programming, XML, design patterns, and Service-Oriented Architecture. Experience implementing microservices using Spring Boot and event-driven architecture. Work with a team that develops smart and scalable solutions and provides a solid experience for our users. Develop an understanding of our products and the problems we are attempting to solve. Analyze infrastructure problems/constraints, inefficiencies, process gaps, and risk and regulatory issues, and engineer software or automation solutions. Work in partnership with infrastructure engineers and architects to understand and identify operational improvements.
Tech skills:
- Java: APIs, microservices
- UI: React, JavaScript
- AWS: ECS, Postgres, Lambda, S3, Route53, SNS, SQS
- Infrastructure-as-Code concepts
- Testing: JUnit, AFT (Selenium/Cucumber/Gherkin), Blazemeter performance testing
- Python (for Lambda functions)
People skills:
- Ability to quickly absorb knowledge as it relates to our application (existing recorded KT sessions will be provided, and a core SME team will be available for questions or additional guidance)
- Good communication and partnership with others
- Motivated in what they are working on
- If not clear or unsure about something, immediately raise it
- Ability to problem-solve issues that arise

Posted 1 month ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Hyderabad

Work from Office

4 years of hands-on experience in Java full stack development (Java, Spring Boot).
Responsibilities:
- Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work on the entire software development lifecycle, from concept and design to testing and deployment.
- Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability.
- Integrate microservices with Kafka for real-time data streaming and event-driven architecture.
- Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
- Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
Skills Required:
- Must be a Java full stack developer.
- Bachelor's or Master's degree in Computer Science or a related field.
- Proficiency in Spring Boot and other Spring Framework components.
- Extensive experience in designing and developing RESTful APIs.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Skills Required:
- 6+ years of experience with Java Spark.
- Strong understanding of distributed computing, big data principles, and batch/stream processing.
- Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena.
- Experience with Data Lake architectures and handling large volumes of structured and unstructured data.
- Familiarity with various data formats.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
Responsibilities:
- Design, develop, and optimize large-scale data processing pipelines using Java Spark.
- Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments.
- Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements.
- Ensure performance tuning, fault tolerance, and reliability of distributed data processing systems.
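The batch-processing work described above follows the classic map/reduce shape that Spark's Java API (JavaRDD/Dataset) distributes across a cluster. As a hedged, JDK-only illustration of that shape (word count over a parallel stream; Spark itself and its cluster dependencies are deliberately not used here):

```java
import java.util.Arrays;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// Word count expressed as map/reduce over a parallel stream.
// Spark applies the same flatMap -> groupBy -> count shape across
// cluster partitions; this version shows the idea in a single process.
class WordCount {
    static Map<String, Long> count(String text) {
        return Arrays.stream(text.toLowerCase().split("\\W+"))
                .parallel()                // local analogue of distributed partitions
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
    }
}
```

In Spark the split step would be a flatMap over an RDD of lines and the tally a reduceByKey, but the transformation pipeline reads almost identically.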

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

1. Around 5+ years of hands-on experience in Java-based application development with integration into Kafka messaging systems.
2. Mandatory development experience implementing Spring, Spring Boot, and microservices for at least one year.
3. Preferred: candidates with hands-on experience with Apache Kafka (producers, consumers, and stream processors).
4. Familiarity with Kafka internals such as brokers, ZooKeeper, topics, and partitions.
5. Familiarity with tools like Kafka Connect, Kafka Streams, and Schema Registry.
6. Very strong hands-on experience with Java 8 features: generics, exception handling, the Collections API, functional interfaces, multithreading, lambda expressions, the Stream API, etc.
7. Mandatory knowledge of deploying microservices in an ECS environment (Kubernetes, Docker, Light speed, etc.). Knowledge of and experience with JUnit is a must.
8. Experience writing Oracle PL/SQL queries.
9. Good to have: Angular, CSS, banking domain, capital markets.
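The Java 8 features called out in point 6 fit together in one short pipeline: a lambda expression implements a built-in functional interface, and a Stream applies it. A minimal sketch (the class and method names are illustrative, not from any posting):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Small illustration of the listed Java 8 features:
// a functional interface, a lambda expression, and a Stream pipeline.
class Java8Features {
    // Predicate<T> is a built-in functional interface;
    // the lambda below is its single-method implementation.
    static final Predicate<Integer> IS_EVEN = n -> n % 2 == 0;

    static List<Integer> doubledEvens(List<Integer> numbers) {
        return numbers.stream()
                .filter(IS_EVEN)   // lambda-backed predicate
                .map(n -> n * 2)   // lambda transformation
                .collect(Collectors.toList());
    }
}
```

Generics appear here too: Predicate&lt;Integer&gt; and List&lt;Integer&gt; are the parameterized types the pipeline operates on.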

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
