Jobs
Interviews

74 Eventbridge Jobs - Page 2

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

1.0 - 2.0 years

6 - 10 Lacs

Mumbai, Hyderabad, Chennai

Work from Office

Your Role
- Work with Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC).
- Full life cycle Oracle EPM Cloud implementation.
- Create forms, OIC integrations, and complex business rules.
- Understand dependencies and interrelationships between the various components of Oracle EPM Cloud.
- Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities to enhance current processes across the entire Financials ecosystem.
- Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization.
- Create and maintain system documentation, both functional and technical.

Your Profile
- Experience implementing EDMCS modules.
- Proven ability to collaborate with internal clients in an agile manner, leveraging design thinking approaches.
- Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

What You'll Love About Capgemini
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders.
- Comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- The opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
Location: Hyderabad, Chennai, Mumbai, Pune, Bengaluru

Posted 2 weeks ago

Apply

5.0 - 8.0 years

14 - 18 Lacs

Noida

Work from Office

As a Cloud & DevOps Engineer, you will be responsible for implementing and managing AWS infrastructure for applications using AWS CDK and building GitHub Actions pipelines to deploy containerized applications on ECS/EKS.

Must-Have:
- Hands-on experience with AWS CDK for provisioning infrastructure
- Solid understanding of key AWS services: ECS (Fargate), API Gateway, ALB/NLB, IAM, S3, KMS, Security Groups
- Strong experience with cloud platform engineering and DevOps
- Proficiency in building GitHub Actions workflows for build, containerization, and deployment
- Strong knowledge of Docker, the container lifecycle, and CI/CD practices
- Understanding of basic networking (VPC, subnets, security groups)
- Good understanding of OAuth implementation
- Familiarity with artifact and image management (ECR, GitHub Packages)
- Comfortable working in Agile or DevOps-centric environments

Good-to-Have:
- Experience with CDK Pipelines and multi-stage deployments
- Exposure to GitHub Actions secrets and OIDC-based role assumption
- Scripting skills in Python, Bash, or shell for automation tasks
- Familiarity with AWS CodeBuild or CodePipeline as alternatives
- Knowledge of container orchestration in AWS (ECS and EKS) for future migration planning
- Understanding of compliance/security frameworks and audit requirements

Mandatory Competencies: DevOps/Configuration Mgmt - Docker; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Development Tools and Management - CI/CD; DevOps/Configuration Mgmt - Cloud Platforms - AWS; DevOps/Configuration Mgmt - GitLab, GitHub, Bitbucket
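Many of the roles on this page pair AWS Lambda with EventBridge. As a hedged illustration (not part of this posting, and with made-up names), a minimal Python Lambda handler for an EventBridge scheduled event, invoked locally with a sample payload, might look like:

```python
import json

def handler(event, context):
    """Handle an Amazon EventBridge scheduled event.

    Scheduled events arrive with source "aws.events"; a real handler
    would kick off a deployment, cleanup, or maintenance task here.
    """
    if event.get("source") != "aws.events":
        raise ValueError("unexpected event source")
    return {
        "statusCode": 200,
        "body": json.dumps({"detail_type": event.get("detail-type")}),
    }

# Local invocation with a sample EventBridge scheduled-event payload
sample_event = {
    "source": "aws.events",
    "detail-type": "Scheduled Event",
    "detail": {},
}
print(handler(sample_event, None))
```

In production the same function would be deployed behind an EventBridge rule (for example via CDK) rather than called directly.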

Posted 2 weeks ago

Apply

5.0 - 8.0 years

8 - 13 Lacs

Noida

Work from Office

- Core Java + Spring + Spring Batch + Hibernate
- Spring FTL, JavaScript/jQuery
- Oracle DB
- Security Groups, VPC controls, ingress and egress rules
- Spring master/slave knowledge
- SFTP file transfer, PGP encryption/decryption
- Ansible templates knowledge (good to have)
- Git for source code maintenance
- Solution designing

Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Beh - Communication and collaboration; Programming Language - Java - Spring Framework; Middleware - API Middleware - Microservices; Database - Oracle - PL/SQL Packages; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Programming Language - Java Full Stack - Angular Components and Design Patterns; Programming Language - Java - Java Multithreading

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are looking for a skilled .NET with AWS Developer to join our team. The ideal candidate should have a strong background in .NET Core, C#, and AWS services, along with experience in developing and integrating applications using CI/CD pipelines. In this role, you will be involved in the full development lifecycle, from requirements analysis to deployment and maintenance.

Your responsibilities will include developing and integrating requirements using a CI/CD code pipeline with GitHub, participating in the full development lifecycle, serving as a technical expert on projects, writing technical specifications, supporting and maintaining software functionality, evaluating new technologies, analyzing and revising code, participating in software design meetings, and consulting with end users.

Required skills: proficiency in .NET Core, C#, and the AWS SDK; experience with NoSQL databases like MongoDB and AWS DynamoDB; experience with JIRA, the Microsoft .NET Framework and its supported programming languages; and a strong understanding of AWS services such as EC2, ECS, Lambda, SNS, SQS, EventBridge, DynamoDB, and CloudWatch. You should also have experience in backend development using C# and .NET Core, and in version control using Git with Copilot.

Preferred skills: UI development experience with Angular 8+, and working experience with Confluence, Lucid, and ServiceNow.

Tools and technologies: GitHub Desktop, Visual Studio Code, Visual Studio IDE (Professional 2022 with GitHub Copilot), plus Teams and Outlook for communication.

Soft skills: strong communication skills in English and the ability to complete tasks within the team's estimates.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Tech Lead with over 5 years of experience in TypeScript and Node, you will lead the design and development of high-quality APIs using technologies such as REST and OpenAPI, as well as API gateways like Kong. Your expertise in AWS services such as serverless, Lambda, Aurora, and API Gateway will be instrumental in creating efficient and scalable solutions. Experience with authentication protocols like OIDC, OAuth2, and JWT is essential, along with proficiency in relational databases like MySQL and PostgreSQL. Programmatic testing using tools like Jest will also be part of your responsibilities.

At GlobalLogic, we prioritize a culture of caring where people come first. You will experience an inclusive environment that fosters acceptance, belonging, and meaningful connections with your teammates, managers, and leaders. Continuous learning and development are key aspects of your journey at GlobalLogic, with numerous opportunities to enhance your skills and advance your career. You will have the chance to work on interesting and impactful projects that challenge your problem-solving skills and creativity, enabling you to contribute to cutting-edge solutions that shape the world.

We believe in the importance of work-life balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve the right balance. As a high-trust organization, integrity is fundamental to our values. By joining GlobalLogic, you are becoming part of a trustworthy, reliable, and ethical company that values truthfulness, candor, and integrity in all aspects of its operations.

GlobalLogic, a Hitachi Group Company, is a digital engineering partner to leading global companies, driving innovation and transformation through intelligent products, platforms, and services. Since 2000, we have been at the forefront of the digital revolution, collaborating with clients to redefine industries and create innovative digital experiences. Join us and be part of a team that engineers impact and shapes the future of digital technology.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Noida, India

Work from Office

Responsibilities:
1. Design and manage cloud-based systems on AWS.
2. Develop and maintain backend services and APIs using Java.
3. Basic knowledge of SQL and the ability to write SQL queries.
4. Good hands-on knowledge of Dockerfiles and multi-stage Docker builds.
5. Implement containerization using Docker and orchestration with ECS/Kubernetes.
6. Monitor and troubleshoot cloud infrastructure and application performance.
7. Collaborate with cross-functional teams to integrate systems seamlessly.
8. Document system architecture, configurations, and operational procedures.

Strong hands-on knowledge needed: ECS, ECR, NLB, ALB, ACM, IAM, S3, Lambda, RDS, KMS, API Gateway, Cognito, CloudFormation.

Good to have:
- Experience with AWS CDK for infrastructure as code.
- AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer).
- Python

Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL; Beh - Communication; DevOps/Configuration Mgmt - Cloud Platforms - AWS

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Noida

Work from Office

Key Responsibilities:
- Develop and maintain responsive web applications using the Angular framework.
- Integrate front-end applications with AWS backend services.
- Collaborate with UX/UI designers and backend developers in Agile teams.
- Create engaging and interactive web interfaces using HTML, CSS, and JavaScript.
- Optimize web performance and ensure cross-browser compatibility.
- Integrate APIs and backend systems to enable seamless data flow.

Required Skills:
- Strong proficiency in Angular and TypeScript.
- Experience with RESTful APIs and integration with AWS services.
- Knowledge of HTML, CSS, and JavaScript.
- Knowledge of version control systems like Git.
- Background in financial applications is a plus.

Mandatory Competencies: User Interface - JavaScript; DevOps/Configuration Mgmt - Git; Beh - Communication and collaboration; User Interface - Angular - Angular Components and Design Patterns; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; UX - Adobe XD; Agile - SCRUM; User Interface - HTML/CSS; User Interface - TypeScript

Posted 2 weeks ago

Apply

5.0 - 9.0 years

10 - 15 Lacs

Noida

Work from Office

1. C#, Microsoft SQL Server or Azure SQL, Azure Cosmos DB, Azure Service Bus, Azure Function Apps, Auth0, WebSockets
2. Strong development experience in C# and .NET Core technologies built up across a range of different projects
3. Experience developing APIs which conform as closely as possible to REST principles in terms of resources, sub-resources, responses, and error handling
4. Experience of API design and documentation using OpenAPI 3.x YAML/Swagger
5. Some familiarity with AWS, especially Elasticsearch, would be beneficial but is not mandatory
6. Azure certifications an advantage
7. HTML5, Angular 14 or later, NodeJS, CSS

Mandatory Competencies: Programming Language - .Net Full Stack - Angular; Programming Language - .Net - .NET Core; Programming Language - .Net Full Stack - HTML/CSS; Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - Azure - Serverless (Function App, Logic App); Programming Language - C#; Middleware - API Middleware - Microservices; User Interface - Node.js

Posted 2 weeks ago

Apply

6.0 - 7.0 years

6 - 11 Lacs

Noida

Work from Office

Responsibilities:
- Data Architecture: Develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
- AWS Data Services: Expertise in using AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, and the EventBridge Scheduler.
- Data Warehousing: Design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
- Data Lakes: Build and manage data lakes on AWS using AWS S3 and other relevant services.
- Data Pipelines: Design and develop efficient data pipelines to extract, transform, and load data from various sources.
- Data Quality: Implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
- Cloud Optimization: Optimize data engineering solutions for performance, cost-efficiency, and scalability on the AWS cloud.
- Team Leadership: Mentor and guide data engineers, ensuring they adhere to best practices and meet project deadlines.

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- 6-7 years of experience in data engineering roles, with a focus on AWS cloud platforms.
- Strong understanding of data warehousing and data lake concepts.
- Proficiency in SQL and at least one programming language (Python/PySpark).
- Good to have: experience with big data technologies like Hadoop, Spark, and Kafka.
- Knowledge of data modeling and data quality best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a team.

Preferred Qualifications: AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect certifications.

Mandatory Competencies: Big Data - Pyspark; Data on Cloud - Azure Data Lake (ADL); Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Database - SQL Server - SQL Packages; Data Science and Machine Learning - Python
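The extract-transform-load work described above can be sketched, independent of any AWS service, as three composable steps. The field names and aggregation below are illustrative, not from the posting:

```python
import csv
import io

def extract(raw_csv: str):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalize types and drop incomplete records
    (a simple data-quality rule)."""
    out = []
    for row in rows:
        if not row.get("amount"):
            continue  # incomplete row: fails the quality check
        out.append({"region": row["region"].strip().upper(),
                    "amount": float(row["amount"])})
    return out

def load(rows):
    """Load: aggregate per region; a real pipeline would write to
    Redshift or S3 instead of returning an in-memory dict."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = "region,amount\nnorth,10.5\nsouth,\nnorth,4.5\n"
print(load(transform(extract(raw))))  # {'NORTH': 15.0}
```

In an AWS Glue job the same shape appears with DynamicFrames and S3 sources/sinks in place of the in-memory structures.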

Posted 2 weeks ago

Apply

4.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office

We are seeking a dedicated and proactive Support Manager to lead our maintenance and support team and ensure timely resolution of client issues. The ideal candidate will be responsible for managing daily support operations, maintaining service quality, and acting as the primary point of escalation for all production-critical issues and defects.

Key Responsibilities:
- Resource management: coverage, availability, capability
- Oversee support team performance and ticket resolution timelines
- Manage escalations and ensure customer satisfaction
- Collaborate with other support/development teams to resolve recurring issues
- Monitor KPIs and prepare regular support performance reports
- Act as the primary escalation point
- Identify, document, and mitigate risks, assumptions, issues, and dependencies (RAID) for the project
- Drive improvements in support processes and tools

Requirements:
- Proven experience in technical application maintenance and support projects, and in a production support leadership role
- Strong understanding of RAID management and issue escalation handling
- Strong leadership, problem-solving, and communication skills
- Familiarity with support tools (e.g., Jira, ServiceNow)
- Ability to work effectively under pressure in a fast-paced environment
- Good to have: technical knowledge or hands-on experience in Java, Spring Boot, .NET, Python, Unix/Linux systems, and AWS

Mandatory Competencies: App Support - L1, L2, L3 Support; BA - Project Management; Programming Language - Java - Core Java (Java 8+); Programming Language - .Net Full Stack - JavaScript; Beh - Communication and collaboration; Operating System - Linux; Operating System - Unix; Middleware - API Middleware - Microservices; Data Science and Machine Learning - Python; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate

Posted 2 weeks ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Noida

Work from Office

Must-Have Skills: Expertise in AWS CDK, AWS services (Lambda, ECS, S3), and PostgreSQL database management. Strong understanding of serverless architecture and event-driven design (SNS, SQS).

Nice to Have: Knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.); experience with cost optimization strategies in AWS.

Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL

Posted 2 weeks ago

Apply

5.0 - 9.0 years

8 - 12 Lacs

Noida

Work from Office

Must-Have Skills: Expertise in AWS CDK, AWS services (Lambda, ECS, S3), and PostgreSQL database management. Strong understanding of serverless architecture and event-driven design (SNS, SQS).

Nice to Have: Knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.); experience with cost optimization strategies in AWS.

Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL; Beh - Communication and collaboration; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Development Tools and Management - CI/CD; Cloud - AWS - ECS
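The event-driven design with SNS and SQS required above is essentially publish/subscribe fan-out: a topic copies each message to every subscribed queue. A hedged in-memory sketch of the pattern (no AWS calls; all names are made up):

```python
from collections import deque

class Topic:
    """SNS-like topic: fan out each published message to every
    subscribed SQS-like queue."""

    def __init__(self):
        self.queues = []

    def subscribe(self, queue):
        self.queues.append(queue)

    def publish(self, message):
        # Each subscriber gets its own copy of the message to consume.
        for q in self.queues:
            q.append(message)

orders = Topic()
billing_q, shipping_q = deque(), deque()
orders.subscribe(billing_q)
orders.subscribe(shipping_q)
orders.publish({"order_id": 1, "status": "PAID"})

print(len(billing_q), len(shipping_q))  # 1 1
```

The point of the pattern is decoupling: billing and shipping consumers drain their own queues at their own pace, and the publisher never knows who is listening.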

Posted 2 weeks ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Noida

Work from Office

Responsibilities:
- Data Architecture: Develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
- AWS Data Services: Expertise in using AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, and the EventBridge Scheduler.
- Data Warehousing: Design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
- Data Lakes: Build and manage data lakes on AWS using AWS S3 and other relevant services.
- Data Pipelines: Design and develop efficient data pipelines to extract, transform, and load data from various sources.
- Data Quality: Implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
- Cloud Optimization: Optimize data engineering solutions for performance, cost-efficiency, and scalability on the AWS cloud.

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- 4-5 years of experience in data engineering roles, with a focus on AWS cloud platforms.
- Strong understanding of data warehousing and data lake concepts.
- Proficiency in SQL and at least one programming language (Python/PySpark).
- Good to have: experience with big data technologies like Hadoop, Spark, and Kafka.
- Knowledge of data modeling and data quality best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a team.

Preferred Qualifications: AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect certifications.

Mandatory Competencies: Big Data - Pyspark; Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - SQL Server - SQL Packages; Data Science and Machine Learning - Python

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office

- Strong experience in Java 1.8 or above
- Strong experience in AWS cloud
- Experience in developing front-end screens with the Angular framework
- Knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, JSON, jQuery)
- Experience with databases
- Ability to pick up new technologies
- Willingness to learn and understand the business domain
- Ability to meet client needs without sacrificing deadlines and quality
- Ability to work effectively within a global team
- Excellent communication and teamwork skills

Mandatory Competencies: Fundamental Technical Skills - Spring Framework/Hibernate/JUnit, etc.; Beh - Communication; Programming Language - Java - Core Java (Java 8+); Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Database Programming - SQL

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 11 Lacs

Noida

Work from Office

- 5+ years of experience in data engineering with a strong focus on AWS services
- Proven expertise in Amazon S3 for scalable data storage; AWS Glue for ETL and serverless data integration; and DataSync, EMR, and Redshift for data warehousing and analytics
- Proficiency in SQL, Python, or PySpark for data processing
- Experience with data modeling, partitioning strategies, and performance optimization
- Familiarity with orchestration tools like AWS Step Functions, Apache Airflow, or Glue Workflows
- Strong understanding of data lake and data warehouse architectures
- Excellent problem-solving and communication skills

Mandatory Competencies: Beh - Communication; ETL - AWS Glue; Big Data - Pyspark; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Programming Language - Python - Python Shell; Database - Database Programming - SQL
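The partitioning strategies called for above usually mean Hive-style date partitions encoded in S3 key prefixes, so engines like Glue, Athena, or Spark can prune partitions on date filters. A small sketch (bucket layout and dataset names are made up):

```python
from datetime import date

def partition_key(dataset: str, day: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 object key, e.g.
    sales/year=2024/month=01/day=05/part-0.parquet.

    Zero-padded month/day keep prefixes lexicographically sortable,
    which matters for range scans over date partitions.
    """
    return (f"{dataset}/year={day.year:04d}/month={day.month:02d}/"
            f"day={day.day:02d}/{filename}")

print(partition_key("sales", date(2024, 1, 5), "part-0.parquet"))
```

A query filtering on `year = 2024 AND month = 1` then only lists objects under that prefix instead of scanning the whole dataset.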

Posted 2 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Pune

Work from Office

Key Responsibilities:
- Design and develop scalable applications using Python and AWS services
- Debug and resolve production issues across complex distributed systems
- Architect solutions aligned with business strategies and industry standards
- Lead and mentor a team of India-based developers; guide career development
- Ensure technical deliverables meet the highest standards of quality and performance
- Research and integrate emerging technologies and processes into the development strategy
- Document solutions in compliance with SDLC standards using defined templates
- Assemble large, complex datasets based on functional and non-functional requirements
- Handle operational issues and recommend improvements to the technology stack
- Facilitate end-to-end platform integration across enterprise-level applications

Required Skills:
- Technical: Python; data engineering; debugging and troubleshooting; system integration
- Cloud & architecture: AWS (EC2, EKS, Glue, Lambda, S3, EMR, RDS, API Gateway); Step Functions, CloudFront, EventBridge, ARRFlow, Airflow (MWAA), QuickSight
- Tools & processes: Terraform, CI/CD pipelines; SDLC, documentation templates

Qualifications:
- 10+ years of software development experience, preferably in financial/trading applications
- 5+ years of people management and mentoring experience
- Proven track record in technical leadership and architecture planning
- Expertise in developing applications using Python and the AWS stack
- Strong grasp of Terraform and automated CI/CD processes
- Exceptional multitasking and prioritization capabilities

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 10 years of experience and be proficient in setting up, configuring, and integrating API gateways in AWS. Your expertise should include API frameworks, XML/JSON, REST, and data protection across software design, build, test, and documentation. Experience with AWS services such as Lambda, S3, CloudFront (CDN), SQS, SNS, EventBridge, API Gateway, Glue, and RDS is required, and you should be able to articulate and implement projects using these services effectively. Your role will involve improving business processes through effective integration solutions.

Location: Bangalore, Chennai, Pune, Mumbai, Noida
Notice Period: Immediate joiners only

If you meet the requirements above, please apply by filling out the form with your full name, email, phone, and cover letter, and uploading your CV/resume (PDF, DOC, and DOCX formats accepted). By submitting the form, you agree to the storage and handling of your data by this website.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer & Community Banking team, you serve as a seasoned member of an agile team, designing and delivering trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

You will execute creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Your role includes creating secure, high-quality production code and maintaining algorithms that run synchronously with appropriate systems. You will produce architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development. Additionally, you will gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. You will also build microservices that run on the bank's internal cloud and the public cloud platform (AWS) and collaborate with teams in multiple regions and time zones. You are expected to participate in scrum team stand-ups, code reviews, and other ceremonies, contributing to task completion and blocker resolution within your team.

Required qualifications, capabilities, and skills include formal training or certification in software engineering concepts and 3+ years of applied experience in Java, AWS, and Terraform. You should have experience with technologies like Java 11/17, Spring/Spring Boot, Kafka, and relational/non-relational databases such as Oracle, Cassandra, DynamoDB, and Postgres. A minimum of 3 years of hands-on experience building secure microservices on the public cloud platform (AWS) is required. Hands-on experience with AWS services like EKS, Fargate, SQS/SNS/EventBridge, Lambda, S3, EBS, DynamoDB/Aurora Postgres, and Terraform scripts is essential, and experience with DevOps concepts for automated build and deployment is crucial.

Preferred qualifications, capabilities, and skills include familiarity with modern front-end technologies and exposure to cloud technologies.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Gurugram

Work from Office

AWS/DevOps Analyst

The analyst should have more than 6 years of IT experience, including:
- Setting up and maintaining ECS solutions
- Designing and building AWS solutions with VPC, EC2, WAF, ECS, ALB, IAM, KMS, ACM, Secrets Manager, S3, CloudFront, etc.
- Experience with SNS, SQS, EventBridge
- Setting up and maintaining RDS, Aurora DB, Postgres DB, DynamoDB, Redis
- Setting up AWS Glue jobs and AWS Lambda
- Setting up CI/CD using Azure DevOps
- Source code management with GitHub
- Building and maintaining cloud-native applications
- Container technologies like Docker
- Configuring logging and monitoring solutions like CloudWatch and OpenSearch
- Building, releasing, and managing system configuration using Terraform and Terragrunt
- Ensuring necessary system security using best-in-class security solutions
- Recommending process and architecture improvements
- Ability to troubleshoot distributed systems

Interpersonal skills:
- Strong communication and collaboration skills
- The ability to be a team player
- Good analytical and problem-solving skills
- Understanding of Agile methodologies
- The ability to train other people in procedural and technical topics

Mandatory Skills: Cloud App Dev Consulting. Experience: 5-8 years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

18 - 22 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Job Title: AWS Data Engineer
Experience Required: 5+ years

Interested? Send your resume to aditya.rao@estrel.ai and include your updated resume, current CTC, expected CTC, notice period/availability (immediate joiners only), and LinkedIn profile.

Job Overview: We are seeking a skilled and experienced data engineer with a minimum of 5 years of experience in Python-based data engineering solutions, real-time data processing, and AWS cloud technologies. The ideal candidate will have hands-on expertise in designing, building, and maintaining scalable data pipelines, implementing best practices, and working within CI/CD environments.

Key Responsibilities:
- Design and implement scalable and robust data pipelines using Python and frameworks like Pytest and PySpark.
- Work extensively with AWS services such as AWS CDK, S3, Lambda, DynamoDB, EventBridge, Kinesis, CloudWatch, AWS Glue, and Lake Formation.
- Implement data governance and data security protocols, including handling of sensitive data and encryption practices.
- Develop microservices and APIs using FastAPI, GraphQL, and Pydantic.
- Design and maintain solutions for real-time streaming and event-driven architectures.
- Follow SDLC best practices, ensuring code quality through test-driven development (TDD) and robust documentation.
- Use GitLab for version control and manage deployment pipelines with CI/CD.
- Collaborate with cross-functional teams to align data architecture and services with business objectives.

Required Skills:
- Proficiency in Python 3.6+
- Experience with Python frameworks: Pytest, PySpark
- Strong knowledge of AWS tools and services
- Experience with FastAPI, GraphQL, and Pydantic
- Expertise in real-time data processing, eventing, and microservices
- Good understanding of data governance, security, and Lake Formation
- Familiarity with GitLab, CI/CD pipelines, and TDD
- Strong problem-solving and analytical skills
- Excellent communication and team collaboration skills

Preferred Qualifications:
- AWS certification(s) (e.g., AWS Certified Data Analytics - Specialty, Solutions Architect)
- Experience with DataZone, data cataloging, or metadata management tools
- Experience in high-compliance industries (e.g., finance, healthcare) is a plus
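Event-driven architectures on EventBridge, as described above, route events to targets by matching event patterns. A simplified, illustrative version of the basic exact-match semantics (a real EventBridge rule supports many more operators, such as prefix and numeric matching):

```python
def matches(pattern: dict, event: dict) -> bool:
    """Return True if `event` satisfies `pattern` under basic
    EventBridge-style semantics: every pattern field must exist in
    the event, leaf pattern values are lists of acceptable literals,
    and nested dicts are matched recursively."""
    for key, expected in pattern.items():
        if key not in event:
            return False
        if isinstance(expected, dict):
            if not isinstance(event[key], dict) or not matches(expected, event[key]):
                return False
        elif event[key] not in expected:
            return False
    return True

# Hypothetical rule: only order events that were created or paid.
rule = {"source": ["orders.service"],
        "detail": {"status": ["CREATED", "PAID"]}}
evt = {"source": "orders.service",
       "detail": {"status": "PAID", "id": 42}}
print(matches(rule, evt))  # True
```

Fields absent from the pattern (like `id` above) are ignored, which is what lets many narrow rules share one event bus.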

Posted 3 weeks ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Zirakpur

Work from Office

- AWS services (Lambda, Glue, S3, DynamoDB, EventBridge, AppSync, OpenSearch)
- Terraform
- Python
- React/Vite
- Unit testing (Jest, Pytest)
- Software development lifecycle
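Unit testing with Pytest, as listed above, typically means plain `test_*` functions with bare assert statements. A tiny hedged example (the helper under test is invented for illustration; Pytest would discover the test functions automatically, and they also run directly):

```python
def event_bus_name(env: str) -> str:
    """Hypothetical helper: derive a per-environment EventBridge
    bus name, rejecting unknown environments."""
    if env not in {"dev", "staging", "prod"}:
        raise ValueError(f"unknown environment: {env}")
    return f"orders-bus-{env}"

def test_event_bus_name_prod():
    assert event_bus_name("prod") == "orders-bus-prod"

def test_event_bus_name_rejects_unknown():
    try:
        event_bus_name("qa")
    except ValueError:
        pass  # expected: "qa" is not a known environment
    else:
        raise AssertionError("expected ValueError")

# Invoke directly here; under pytest these run via test discovery.
test_event_bus_name_prod()
test_event_bus_name_rejects_unknown()
```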

Posted 3 weeks ago

Apply

12.0 - 17.0 years

14 - 19 Lacs

Bengaluru

Work from Office

YOUR IMPACT: We are seeking a highly skilled and experienced Level 3 Site Reliability Engineer (SRE) to join our Cloud Operations team. This role is critical in driving advanced engineering initiatives to ensure infrastructure reliability, scalability, and automation across multi-cloud environments. As an L3 SRE, you will lead complex cloud support operations, troubleshoot infrastructure as code, implement observability frameworks, and guide junior SREs while helping shape future architectural direction. This role demands hands-on expertise in AWS, Azure, or GCP, advanced scripting, and deep observability integrationcontributing directly to uptime, automation maturity, and strategic improvements to cloud infrastructure. WHAT YOU NEED TO SUCCEED: Cloud Infrastructure & Architecture Architect and maintain scalable, resilient systems across AWS, Azure, and GCP. Lead cloud adoption and migration strategies while ensuring minimal disruption and high reliability. Implement security and governance controls including VPC, Security Groups, Route53, ACM, and Security Hub. Perform deep infrastructure troubleshooting and root cause analysis, especially with IaC-based deployments. Infrastructure as Code (IaC) & Configuration Management Design and manage infrastructure using Terraform, Terragrunt, and CloudFormation. Oversee configuration management using tools like AWS SSM, SaltStack, and Packer. Review and remediate issues within Git-based CI/CD workflows for IaC and service deployment. Observability & Monitoring Build and maintain monitoring/alerting pipelines using CloudWatch, EventBridge, SNS, and Hund.io. Develop custom observability tooling for end-to-end visibility and proactive issue detection. Lead incident response and contribute to post-incident reviews and reliability reports. Automation, Scripting & CI/CD Develop and maintain automation tools using Bash, Python, Ruby, or PHP. Integrate deployment pipelines into secure, scalable CI/CD processes. 
Automate vulnerability assessments and compliance scans against ISO 27001 standards.

Containerization & Microservices Support
Lead container platform deployments using EKS, ECS, ECR, and Fargate.
Guide engineering teams in Kubernetes resource optimization and troubleshooting.

Database & Storage Management
Provide advanced operational support for RDS, PostgreSQL, and Elasticsearch.
Monitor database performance and ensure availability across distributed systems.

Mentorship & Strategy
Mentor L1 and L2 SREs on technical tasks and troubleshooting best practices.
Contribute to cloud architecture planning, operational readiness, and process improvements.
Help define and track Key Performance Indicators (KPIs) for system uptime, MTTR, and automation coverage.

WHAT YOU BRING (Qualifications & Skills):
7-12 years of experience in Site Reliability Engineering or DevOps roles.
Advanced expertise in multi-cloud environments (AWS, Azure, GCP).
Strong Linux and Windows administration background (Fedora, Debian, Microsoft).
Proficiency in Terraform, Terragrunt, CloudFormation, and configuration management tools.
Hands-on experience with monitoring tools like CloudWatch, SNS, EventBridge, and third-party integrations.
Advanced scripting skills in Python, Bash, Ruby, or PHP.
Knowledge of container platforms including EKS, ECS, and Fargate.
Familiarity with Vulnerability Management, ISO 27001, and audit-readiness practices.
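The alerting pipelines described above are typically driven by EventBridge rules that filter events by pattern before routing them to targets such as SNS. As a rough, illustrative sketch only (not part of the posting, and greatly simplified compared to the real EventBridge matching semantics), the core idea of pattern-based filtering looks like this:

```python
# Illustrative sketch: a simplified version of EventBridge-style
# event-pattern matching (exact-value lists and nested fields only).
# Real EventBridge also supports prefix, numeric, and anything-but operators.

def matches(pattern: dict, event: dict) -> bool:
    """Return True if the event satisfies every field in the pattern."""
    for key, expected in pattern.items():
        if key not in event:
            return False
        value = event[key]
        if isinstance(expected, dict):
            # Nested field, e.g. "detail": {...} - recurse into it
            if not isinstance(value, dict) or not matches(expected, value):
                return False
        else:
            # A list of allowed values: OR semantics
            if value not in expected:
                return False
    return True

# Hypothetical alarm event, shaped like what CloudWatch publishes to EventBridge
pattern = {
    "source": ["aws.cloudwatch"],
    "detail-type": ["CloudWatch Alarm State Change"],
    "detail": {"state": {"value": ["ALARM"]}},
}
event = {
    "source": "aws.cloudwatch",
    "detail-type": "CloudWatch Alarm State Change",
    "detail": {"state": {"value": "ALARM", "reason": "threshold crossed"}},
}
print(matches(pattern, event))  # True: this event would trigger the rule
```

In a real pipeline, a matching rule would forward the event to an SNS topic or Lambda target rather than returning a boolean; the sketch only shows the filtering step.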

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

maharashtra

On-site

If you are a software engineering leader ready to take the reins and drive impact, we've got an opportunity just for you. As a Director of Software Engineering at JPMorgan Chase within the Asset and Wealth Management LOB, you will lead a data technology area and drive impact within teams, technologies, and deliveries. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex initiatives, serving as a primary decision maker for your teams and a driver of engineering innovation and solution delivery. The current role focuses on delivering data solutions for some of the Wealth Management businesses.

Job responsibilities
Leads engineering and delivery of data and analytics solutions
Makes decisions that influence teams' resources, budget, tactical operations, and the execution and implementation of processes and procedures
Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership and maintainability
Delivers technical solutions that can be leveraged across multiple businesses and domains
Influences and collaborates with peer leaders and senior stakeholders across the business, product, and technology teams
Champions the firm's culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills
Experience managing data solutions across a large, global consumer community in the Financial Services domain
Experience hiring, developing, and leading cross-functional teams of technologists
Experience handling multiple, global stakeholders across business, technology, and product
Appreciation of the data product: modeling, sourcing, quality, lineage, discoverability, access management, visibility, purging, etc.
Experience researching and upgrading to the latest technologies in the continuously evolving data ecosystem
Practical hybrid cloud-native experience, preferably AWS
Experience using current technologies such as GraphQL, Glue, Spark, Snowflake, SNS, SQS, Kinesis, Lambda, ECS, EventBridge, QlikSense, etc.
Experience with Java and/or Python programming languages
Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field

Preferred qualifications, capabilities, and skills
Comfortable being hands-on as required to drive solutions and solve challenges for the team
Exposure to and appreciation of the continuously evolving data science space
Exposure to the Wealth Management business

Posted 3 weeks ago

Apply

2.0 - 7.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Your Role
Experience in Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC).
Full life cycle Oracle EPM Cloud implementation.
Experience in creating forms, OIC integrations, and complex Business Rules.
Understand dependencies and interrelationships between various components of Oracle EPM Cloud.
Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities to enhance current processes within the entire Financials ecosystem.
Collaborate with FP&A to facilitate the Planning, Forecasting, and Reporting process for the organization.
Create and maintain system documentation, both functional and technical.

Your Profile
Proven ability to collaborate with internal clients in an agile manner, leveraging design-thinking approaches.
Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

What you'll love about Capgemini
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work.
Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. You can also participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.

About Capgemini

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 27 Lacs

Pune

Remote

Work Hours: Partial overlap with US PST

Key Responsibilities
Rapidly prototype MVPs and innovative ideas within tight timelines.
Own end-to-end development and deployment of applications in non-production AWS environments.
Collaborate with cross-functional teams to deliver scalable web solutions.

Technical Expertise
1. Front-End Development
Proficiency in React, AWS S3, and AWS CloudFront.
Experience building medium to large websites (15-20+ pages).
2. Back-End & Serverless Architecture
Strong understanding of microservices architecture.
Experience with the AWS serverless stack: Lambda (Node.js), Cognito, API Gateway, EventBridge, Step Functions.
Familiarity with AWS Aurora MySQL and DynamoDB (preferred but not mandatory).
3. DevOps & CI/CD
Proficiency in AWS SAM, CloudFormation, or AWS CDK.
Experience with AWS CodePipeline or equivalent tools (e.g., GitHub Actions).

Requirements
Experience & Qualifications
5-10 years of software development experience.
Minimum 3 years of hands-on experience with React and AWS technologies.
Fast learner with the ability to adapt in a dynamic environment.
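The back-end stack above pairs API Gateway with Lambda via proxy integration: the handler receives the HTTP request as an event object and returns a status/headers/body structure. As a hypothetical sketch only (the posting specifies Node.js runtimes; the proxy-integration contract is the same shape in any runtime, shown here in Python for brevity):

```python
import json

# Hypothetical Lambda handler for an API Gateway proxy integration.
# Event and response shapes follow the standard proxy-integration contract.

def handler(event, context):
    """Echo back the request path and a query parameter as JSON."""
    params = event.get("queryStringParameters") or {}
    body = {
        "path": event.get("path", "/"),
        "name": params.get("name", "world"),
    }
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# Local invocation with a sample proxy event (no AWS account needed)
sample_event = {"path": "/hello", "queryStringParameters": {"name": "dev"}}
response = handler(sample_event, None)
print(response["statusCode"], response["body"])
```

Because the handler is a plain function of an event dict, it can be unit-tested locally and then deployed through SAM, CloudFormation, or CDK as the listing describes.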

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies