5.0 - 9.0 years
16 - 31 Lacs
noida, pune, gurugram
Work from Office
Job Description
- Design, implement, and maintain data pipelines for processing large datasets, ensuring data availability, quality, and efficiency for machine learning model training and inference.
- Collaborate with data scientists to streamline the deployment of machine learning models, ensuring scalability, performance, and reliability in production environments.
- Develop and optimize ETL (Extract, Transform, Load) processes, ensuring data flows from various sources into structured data storage systems.
- Ensure effective model monitoring, versioning, and logging to track performance and metrics in a production setting.
- Ensure data security, integrity, and compliance with data governance policies.
- Perform troubleshooting and root cause analysis on production-level machine learning systems.

Skills: Glue, PySpark, AWS services, strong SQL. Nice to have: Redshift, knowledge of SAS datasets.
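A minimal sketch of the extract-transform-load shape this role describes, in pure Python rather than Glue/PySpark so it runs without a Spark runtime; the field names and the null-key quality check are illustrative assumptions, not the employer's pipeline.

```python
def extract(rows):
    """Simulate reading raw records from a source (e.g. an S3 landing zone)."""
    return list(rows)

def transform(rows):
    """Normalise records and drop those failing a basic data-quality check."""
    cleaned = []
    for row in rows:
        if row.get("customer_id") is None:  # quality gate: reject null keys
            continue
        cleaned.append({
            "customer_id": row["customer_id"],
            "amount": round(float(row.get("amount", 0)), 2),
        })
    return cleaned

def load(rows, target):
    """Append validated records to the target store (a list stands in for a table)."""
    target.extend(rows)
    return len(rows)

raw = [{"customer_id": 1, "amount": "10.456"}, {"customer_id": None, "amount": "5"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 1: the null-key record was rejected by the quality gate
```

In a Glue job the same three stages would typically be DynamicFrame reads, PySpark transformations, and a sink write, with the quality gate expressed as a filter.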
Posted Date not available
10.0 - 15.0 years
35 - 40 Lacs
bengaluru
Hybrid
Role & responsibilities

Skills and experience you possess:
- Scripting languages, e.g., Bash, PowerShell, Python, Groovy
- Experience as a software developer in one of the following: React, NodeJS, C++, C#
- Containerization: Docker, Kubernetes
- Infrastructure as Code: one of Terraform, CloudFormation, or Azure templates
- Experience with a CI/CD tool, e.g., Jenkins, TeamCity, GitHub Actions, CircleCI
- Serverless computing, such as Google App Engine or AWS Lambda
- Experience with cloud infrastructure, e.g., AWS, GCP, Azure
- Basic understanding of networking (subnets, DHCP, DNS, SSL)
- Experience with a monitoring and alerting platform
- Experience working alongside software engineers
- Understanding of basic security principles and best practices
- You are a lifelong learner who stays current with industry trends and discovers new ways to improve

Your team:
- Works with engineers in the US, Bulgaria, and India across time zones
- Part of the DevOps Community of Practice to share tools and best practices and solve problems
- Led by a US-based Tech Lead and a US-based Product Manager

Examples of projects you might work on:
- CI/CD pipelines for developers across environments
- Custom developer productivity tools, such as scripts to spin up particular services
- Infrastructure as code for service deployment
- Automating local development environments
- Monitoring of cloud services for performance and alerting
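A small sketch of the "scripts to spin up particular services" idea from the project list above: compose a `docker run` command for a named service from a config table. The service names, images, and ports are invented for illustration; a real tool would execute the command via `subprocess`.

```python
import subprocess  # used only in the commented execution line below

SERVICES = {
    "api": {"image": "example/api:latest", "port": 8080},    # hypothetical service
    "cache": {"image": "redis:7", "port": 6379},
}

def docker_run_command(name):
    """Build (but do not execute) the docker run argv for a known service."""
    svc = SERVICES[name]
    return [
        "docker", "run", "-d",
        "--name", name,
        "-p", f"{svc['port']}:{svc['port']}",
        svc["image"],
    ]

if __name__ == "__main__":
    cmd = docker_run_command("cache")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually start the container
```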
7.0 - 12.0 years
10 - 20 Lacs
hyderabad, pune, bengaluru
Work from Office
Hi Team,

We are looking for candidates with Python and AWS experience. Please respond only if you have at least 4 years of experience in both Python and AWS. Mail your updated CV to Kanishk.mittal@thehrsolution.in.

Job Summary: As a Python Developer with AWS, you will be responsible for developing cloud-based applications, building data pipelines, and integrating with various AWS services. You will work closely with DevOps, Data Engineering, and Product teams to design and deploy solutions that are scalable, resilient, and efficient in an AWS cloud environment.

Notice Period: Immediate to 45 days

Key Responsibilities:
- Python Development: Design, develop, and maintain applications and services using Python in a cloud environment.
- AWS Cloud Services: Leverage AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway to build scalable solutions.
- Data Pipelines: Develop and maintain data pipelines, including integrating data from various sources into AWS-based storage solutions.
- API Integration: Design and integrate RESTful APIs for application communication and data exchange.
- Cloud Optimization: Monitor and optimize cloud resources for cost efficiency, performance, and security.
- Automation: Automate workflows and deployment processes using AWS Lambda, CloudFormation, and other automation tools.
- Security & Compliance: Implement security best practices (e.g., IAM roles, encryption) to protect data and maintain compliance within the cloud environment.
- Collaboration: Work with DevOps, Cloud Engineers, and other developers to ensure seamless deployment and integration of applications.
- Continuous Improvement: Participate in the continuous improvement of development processes and deployment practices.

Required Qualifications:
- Python Expertise: Strong experience in Python programming, including libraries such as Pandas, NumPy, and Boto3 (the AWS SDK for Python), and frameworks such as Flask or Django.
- AWS Knowledge: Hands-on experience with AWS services such as S3, EC2, Lambda, RDS, DynamoDB, CloudFormation, and API Gateway.
- Cloud Infrastructure: Experience designing, deploying, and maintaining cloud-based applications on AWS.
- API Development: Experience designing and developing RESTful APIs, integrating with external services, and managing data exchanges.
- Automation & Scripting: Experience with automation tools and scripts (e.g., using AWS Lambda, Boto3, CloudFormation).
- Version Control: Proficiency with version control tools such as Git.
- CI/CD Pipelines: Experience building and maintaining CI/CD pipelines for cloud-based applications.

Preferred Qualifications:
- Familiarity with serverless architectures using AWS Lambda and other AWS serverless services.
- AWS certification (e.g., AWS Certified Developer - Associate, AWS Certified Solutions Architect - Associate) is a plus.
- Knowledge of containerization tools like Docker and orchestration platforms such as Kubernetes.
- Experience with Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation.

Skills & Attributes:
- Strong analytical and problem-solving skills.
- Ability to work effectively in an agile environment.
- Excellent communication and collaboration skills for working with cross-functional teams.
- Focus on continuous learning and staying up to date with emerging cloud technologies.
- Strong attention to detail and a commitment to high-quality code.
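A sketch of the API-integration skill the role asks for: following cursor pagination until a REST API signals the last page. The page shape and `fetch_page` stub are assumptions; in a real client the stub would be an HTTP call (urllib, requests, or a Boto3 paginator for AWS APIs).

```python
# Stubbed pages keyed by cursor; None means "first page".
PAGES = {
    None: {"items": [1, 2], "next_cursor": "a"},
    "a": {"items": [3], "next_cursor": "b"},
    "b": {"items": [4, 5], "next_cursor": None},
}

def fetch_page(cursor):
    """Stand-in for GET /items?cursor=..., returning one page of results."""
    return PAGES[cursor]

def fetch_all():
    """Follow next_cursor links until the API signals the last page."""
    items, cursor = [], None
    while True:
        page = fetch_page(cursor)
        items.extend(page["items"])
        cursor = page["next_cursor"]
        if cursor is None:
            return items

print(fetch_all())  # [1, 2, 3, 4, 5]
```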
2.0 - 5.0 years
15 - 25 Lacs
pune
Work from Office
Company Name: Karini AI
Job Title: Generative AI Engineer
Location: Wakad, Pune (Onsite)
Employment Type: Full-time

Job Overview: Karini AI is looking for a highly motivated and skilled Generative AI Engineer to join our AI team. The ideal candidate should have expertise in Large Language Models (LLMs), Machine Learning (ML), and Deep Learning. We are looking for problem solvers, collaborative team players, and passionate individuals eager to work on cutting-edge AI technologies. If you aspire to be part of an innovative team and accelerate your career in AI, this is the perfect opportunity for you!

Roles & Responsibilities:
- Collaborate with researchers, engineers, and domain experts to design, develop, and optimize agentic AI applications.
- Fine-tune LLMs on large-scale datasets to enhance performance and accuracy.
- Develop compelling demos and lead proof-of-concept (PoC) projects.
- Stay updated on the latest research advancements in generative AI and implement best practices.
- Deploy large-scale machine learning models in cloud-based environments.

Mandatory Technical Skills:
- Hands-on experience in generative AI and deep learning.
- Advanced knowledge of prompt engineering.
- Strong experience with PyTorch and working with GPUs.
- Proficiency in Python (intermediate to advanced level).
- Understanding of model fine-tuning and distributed computing in the cloud.

Eligibility Criteria:
- Bachelor's degree in Computer Science, AI, or a related field (4-year program).
- Master's degree in Artificial Intelligence or a related domain (preferred).
- Proven experience working as an AI engineer with a strong foundation in generative AI.
- Proficiency in deep learning frameworks and NLP techniques.
- Experience with cloud-based infrastructure for deploying machine learning models.

Key Skills: Generative AI, Deep Learning, Machine Learning, Natural Language Processing (NLP), PyTorch, Python, Cloud Computing, LLM Fine-Tuning, Agentic AI, LangGraph.
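A minimal sketch of the prompt-engineering skill listed above: assembling a few-shot prompt for an LLM from an instruction, worked examples, and a new query. The template wording is an invented example, not a Karini AI artifact.

```python
def build_few_shot_prompt(instruction, examples, query):
    """Return a prompt string: instruction, worked examples, then the new query."""
    parts = [instruction.strip(), ""]
    for inp, out in examples:
        parts.append(f"Input: {inp}")
        parts.append(f"Output: {out}")
        parts.append("")
    parts.append(f"Input: {query}")
    parts.append("Output:")  # the model completes from here
    return "\n".join(parts)

prompt = build_few_shot_prompt(
    "Classify the sentiment of each review as positive or negative.",
    [("Great product!", "positive"), ("Broke after a day.", "negative")],
    "Works exactly as described.",
)
print(prompt)
```

The same structure is what LLM APIs consume as a single user message; frameworks like LangGraph wrap this kind of templating in reusable nodes.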
8.0 - 13.0 years
15 - 30 Lacs
bengaluru
Hybrid
Job Role: React Native / Mobile Application Developer

Key Responsibilities:
1. Translate designs and wireframes into high-quality code.
2. Work creatively and analytically in problem-solving.
3. Operate independently and make decisions with little direct supervision.
4. Pay great attention to code quality and coding standards.
5. Maintain a good grasp of architectural principles and patterns.

Technical Experience:
- Proficiency in one or more front-end development technologies, including iOS and React Native.
- Proficiency in backend development using the AWS stack, including Lambda, EC2, ElastiCache, S3, Redshift, Kinesis, RDS, DynamoDB, Glacier, ELB, Route 53, API Gateway, CloudWatch, and CloudFormation.
- Strong core JavaScript, TypeScript, JSX, CSS, and HTML.
- 5+ years of JavaScript development experience.
- Minimum of 5-8 years of experience with React Native, with strength in the React Native UI framework.
- Experience in mobile application development.
- Good hold on React JS concepts.
- Strong in React Native design patterns, Redux, and the Context API with Hooks.
- Experience with unit testing frameworks such as Jest and UI testing frameworks such as Enzyme.

Professional Attributes:
1. Good verbal and written communication skills to connect with customers at varying levels of the organization; strong active listening, multitasking, interpersonal, and organizational skills.
2. Logical problem-solving skills and the ability to identify solutions based on written procedures, guidelines, and processes.

Key Skills: React Native, ReactJS, JavaScript, CSS, HTML, AJAX, jQuery, XML, AWS
5.0 - 8.0 years
8 - 12 Lacs
noida
Work from Office
Our employee value proposition (EVP) is about Being Your Best as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version.

Key Tasks & Responsibilities:
- Development of highly performant public-facing REST APIs and associated system integrations in an Azure-hosted environment.
- Documentation of APIs conforming to the OpenAPI 3.x framework.
- Participating in code reviews, design workshops, story/ticket elaboration, etc.
- Review existing legacy implementations and input from the architecture team, and aid in designing and building appropriate solutions on the new platform following best practices.
- Ensure that the new platform is developed, tested, and hosted in the most secure, scalable manner.
- Aid in the automation of testing and deployment of all deliverables.

Required Skills:
- C#, Microsoft SQL Server or Azure SQL, Azure Cosmos DB, Azure Service Bus, Azure Function Apps, Auth0, WebSockets.
- Strong development experience in C# and .NET Core technologies built up across a range of different projects.
- Experience developing APIs that conform as closely as possible to REST principles in terms of resources, sub-resources, responses, and error handling.
- Experience in API design and documentation using OpenAPI 3.x YAML (Swagger).
- Experience in development, deployment, and support within an Azure environment, with an understanding of security and authorisation concepts.
- Performance tuning of APIs and Azure Functions.
- Ability and willingness to learn quickly and adapt to a fast-changing environment, with a strong interest in continuous improvement and delivery.
- Strong problem-solving skills and a good understanding of best practices and the importance of test automation processes.
- Some familiarity with AWS, especially Elasticsearch, would be beneficial but not mandatory.

Education and Professional Membership:
- Educated to degree level or equivalent, preferably in Computer Science or a related subject.
- Azure certifications an advantage.

Mandatory Competencies:
- Programming Language - .Net - .NET Core
- Beh - Communication and collaboration
- Programming Language - Other Programming Language - C#
- Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
- Database - Database Programming - SQL
4.0 - 8.0 years
15 - 25 Lacs
hyderabad, pune, chennai
Hybrid
We are looking for an AWS Data Engineer (permanent role).

Experience: 4 to 8 years
Location: Hyderabad / Chennai / Noida / Pune
Notice Period: Immediate

Skills:
- Expertise in data warehousing and ETL design and implementation.
- Hands-on experience with a programming language such as Python.
- Good understanding of Spark architecture along with its internals.
- Hands-on experience using AWS services such as Glue (PySpark), Lambda, S3, and Athena.
- Experience with Snowflake is good to have.
- Hands-on experience implementing different loading strategies such as SCD1 and SCD2, table/partition refresh, insert update, and swap partitions.
- Experience in parallel loading and dependency orchestration.
- Awareness of scheduling and orchestration tools.
- Experience with RDBMS systems and concepts.
- Expertise in writing complex SQL queries and developing database components, including creating views, stored procedures, triggers, etc.
- Create test cases and perform unit testing of ETL jobs.
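A sketch of the SCD Type 2 loading strategy the skills list mentions: when a tracked attribute changes, expire the current dimension row and insert a new current one. The column names (`customer_id`, `city`, `valid_from`/`valid_to`, `is_current`) are illustrative assumptions; in Glue this logic would run over DataFrames or a MERGE statement.

```python
from datetime import date

def apply_scd2(dim, incoming, today):
    """Merge incoming records into the dimension table using SCD2 semantics."""
    current = {r["customer_id"]: r for r in dim if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["customer_id"])
        if cur is None:  # brand-new key: insert as current
            dim.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        elif cur["city"] != rec["city"]:  # changed attribute: expire old row, insert new
            cur["valid_to"] = today
            cur["is_current"] = False
            dim.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        # unchanged rows are left untouched (no new version)
    return dim

dim = [{"customer_id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
apply_scd2(dim, [{"customer_id": 1, "city": "Chennai"}], date(2024, 6, 1))
print([(r["city"], r["is_current"]) for r in dim])  # [('Pune', False), ('Chennai', True)]
```

SCD1 would instead overwrite `city` in place, keeping no history; SCD2 preserves the full change timeline at the cost of extra rows.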
7.0 - 12.0 years
2 - 6 Lacs
chennai
Work from Office
Job Title: Database Administrator
Experience: 7-14 years
Location: Chennai

Job Description: We are looking for a highly skilled Database Administrator (DBA) to manage, maintain, and optimize our databases across multiple platforms. The ideal candidate will have extensive experience with AWS RDS, Microsoft SQL Server, and MongoDB, along with a strong understanding of database security, performance tuning, and high-availability architectures. This role is crucial in ensuring data integrity, security, and efficiency for our SaaS applications while meeting HIPAA and other healthcare compliance requirements.

Key Responsibilities

Database Management & Administration:
- Design, configure, and maintain AWS RDS (PostgreSQL, MySQL, SQL Server), Microsoft SQL Server, and MongoDB databases.
- Ensure high availability, performance, and scalability of all databases.
- Implement backup and disaster recovery strategies, including point-in-time recovery (PITR) and failover mechanisms.
- Monitor and optimize database performance using tools like AWS CloudWatch, SQL Profiler, and MongoDB Atlas Performance Advisor.
- Manage database provisioning, patching, and version upgrades in production and non-production environments.

Security & Compliance:
- Enforce data security best practices, including encryption, access controls (IAM, RBAC), and compliance with HIPAA and other healthcare regulations.
- Perform regular security audits and vulnerability assessments using tools like AWS Security Hub and Tenable.
- Implement and maintain database auditing, logging, and monitoring to detect and prevent unauthorized access.

Optimization & Automation:
- Analyze and optimize query performance, indexing strategies, and database schema design.
- Automate database maintenance tasks using Terraform, AWS Lambda, PowerShell, or Python scripts.
- Work with DevOps to integrate CI/CD pipelines for database changes (e.g., Flyway, Liquibase).
- Optimize storage and resource utilization in AWS to reduce costs while maintaining performance.

Collaboration & Support:
- Work closely with DevOps, Engineering, and Security teams to ensure database reliability and security.
- Provide guidance and best practices to developers on database design, indexing, and query performance tuning.
- Support application teams with troubleshooting, query optimization, and data modeling.
- Participate in an on-call rotation for database-related incidents and outages.

Required Qualifications & Experience:
- 5+ years of experience as a Database Administrator in a SaaS or cloud environment.
- Strong expertise in AWS RDS (PostgreSQL, MySQL, or SQL Server).
- Proficiency in Microsoft SQL Server, including T-SQL, SSMS, and high-availability configurations.
- Experience with NoSQL databases like MongoDB (Atlas preferred).
- Deep understanding of performance tuning, query optimization, indexing strategies, and partitioning.
- Familiarity with Terraform, AWS CloudFormation, or other Infrastructure-as-Code (IaC) tools.
- Experience with backup and disaster recovery strategies in AWS and on-prem environments.
- Knowledge of database replication, clustering, and high-availability architectures.
- Proficiency in scripting (Python, PowerShell, Bash) for automation.
- Strong knowledge of security best practices (IAM, RBAC, data encryption, audit logging).
- Familiarity with healthcare compliance requirements (HIPAA, HITRUST) is a plus.

Preferred Skills & Certifications:
- AWS Certified Database - Specialty
- Microsoft Certified: Azure Database Administrator Associate
- MongoDB Certified DBA Associate
- Experience with AI/ML-driven database performance optimization tools
- Exposure to data warehousing and analytics (Redshift, Snowflake, or BigQuery)

Other Duties: Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at any time, with or without advance notice. Any changes may be for an indeterminate time frame.
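A sketch of the kind of Python maintenance automation this DBA role describes: deciding which database snapshots a retention policy allows deleting. The policy here (keep everything from the last 7 days plus Sunday snapshots from the last 28) is an invented example, not the employer's policy; a real script would list and delete RDS snapshots via Boto3.

```python
from datetime import date, timedelta

def snapshots_to_delete(snapshot_dates, today):
    """Return snapshot dates not protected by the retention policy, oldest first."""
    keep = set()
    for d in snapshot_dates:
        age = (today - d).days
        if age < 7:                          # daily window: keep the last week
            keep.add(d)
        elif age < 28 and d.weekday() == 6:  # weekly window: Sundays only
            keep.add(d)
    return sorted(set(snapshot_dates) - keep)

today = date(2024, 6, 30)  # a Sunday, chosen so the example is deterministic
snaps = [today - timedelta(days=n) for n in range(30)]
doomed = snapshots_to_delete(snaps, today)
print(len(doomed))  # 20 of the 30 daily snapshots fall outside the policy
```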
5.0 - 10.0 years
7 - 12 Lacs
noida
Work from Office
Drives the overall software development lifecycle, including working across functional teams to transform requirements into features, managing development teams and processes, and conducting software testing and maintenance. Specific areas of focus include translating user requirements into technical specifications, writing code, and managing the preparation of design specifications. Supports system design, provides advice on security requirements, and debugs business systems and service applications. Applies deep knowledge of algorithms, data structures, and programming languages to develop high-quality technology applications and services, including tools, standards, and relevant software platforms based on business requirements.

- Translates user needs into technical specifications by understanding, conceptualizing, and facilitating technical requirements from the PO/user.
- Analyzes, develops, tests, and implements new software programs, and documents the entire software development lifecycle execution.
- Performs preventative and corrective maintenance, troubleshooting, and fault rectification of system and core software components.
- Ensures that code/configurations adhere to security, logging, error handling, and performance standards and non-functional requirements.
- Evaluates new technologies for fit with the program/system/ecosystem and the associated upstream and downstream impacts on process, data, and risk.
- Follows release management processes and standards and applies version controls.
- Assists in interpreting and documenting client requirements.
- Focus is primarily on a business/group within BMO; may have a broader, enterprise-wide focus.
- Provides specialized consulting, analytical, and technical support.
- Exercises judgment to identify, diagnose, and solve problems within given rules.
- Works independently and regularly handles non-routine situations.
- Broader work or accountabilities may be assigned as needed.
- Experience with event-driven design architecture.

Qualifications:
- Foundational proficiency: creative thinking; building and managing relationships; emotional agility.
- Intermediate proficiency: cloud computing; microservices; technology business requirements definition, analysis, and mapping; adaptability; verbal and written communication skills; analytical and problem-solving skills.
- Advanced proficiency: programming applications integration; system development lifecycle; system and technology integration.
- Typically 5-10 years of relevant experience and a post-secondary degree in a related field of study, or an equivalent combination of education and experience.

Technology Required: Java Spring Boot framework, OpenShift, Python, NodeJS, Ansible, Apache Kafka/Spark/Hadoop/HDFS, Oracle databases, Linux/Unix/Windows, Oracle, IBM WebSphere/HIS, microservices, cloud computing (AWS), AWS Lambda/SNS/SQS/DynamoDB/Redshift/CDK, event-driven architecture, test-driven development, Agile/Scrum SDLC, JSON and XML data notations, knowledge of the ISO 20022 standard, ServiceNow.

Mandatory Competencies:
- Programming Language - Java - Core Java (Java 8+)
- Fundamental Technical Skills - Programming, Multithreading, Collections
- Database - Database Programming - SQL
- Programming Language - Java - Spring Framework
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
- Middleware - Message Oriented Middleware - Messaging (JMS, ActiveMQ, RabbitMQ, Kafka, SQS, ASB, etc.)
- Architecture - Architectural Patterns - Microservices
- Programming Language - Java - OOPS Concepts
6.0 - 8.0 years
8 - 12 Lacs
noida
Work from Office
Full-stack developer with 6-8 years of experience designing and developing robust, scalable, and maintainable applications applying object-oriented design principles.

- Strong experience with Spring frameworks (Spring Boot, Spring Batch, Spring Data, etc.), Hibernate, and JPA.
- Strong experience in microservices architecture and implementation.
- Strong knowledge of HTML, CSS, JavaScript, and React.
- Experience with SOAP web services, REST web services, and the Java Messaging Service (JMS) API.
- Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS).
- Good experience with AWS services: S3, Lambda, SQS, SNS, DynamoDB, IAM, and API Gateway.
- Hands-on experience in SQL and PL/SQL, with the ability to write complex queries.
- Hands-on experience with REST APIs.
- Experience with version control systems (e.g., Git).
- Knowledge of web standards and accessibility guidelines.
- Knowledge of CI/CD pipelines and experience with tools such as JIRA, Splunk, SONAR, etc.
- Strong analytical and problem-solving abilities.
- Good experience in JUnit testing and mocking techniques.
- Experience in SDLC processes (Waterfall/Agile), Docker, Git, and SonarQube.
- Excellent communication and interpersonal skills; ability to work independently and as part of a team.

Mandatory Competencies:
- Programming Language - Java - Core Java (Java 8+)
- Programming Language - Java Full Stack - HTML/CSS
- Programming Language - Java - Spring Framework
- Programming Language - Java - Hibernate
- Programming Language - Java Full Stack - JavaScript
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
- DevOps/Configuration Mgmt - Git, Docker
- Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.
- Beh - Communication and collaboration
- Middleware - API Middleware - API (SOAP, REST), WebServices (REST, SOAP), Microservices
- Middleware - Java Middleware - Spring Boot
- Programming Language - Java Full Stack - Spring Framework
- User Interface - Other User Interfaces - React
- Development Tools and Management - CI/CD
- Cloud - AWS - AWS S3, S3 Glacier, AWS EBS
- Agile - SCRUM
- Database - Oracle - PL/SQL Packages
5.0 - 8.0 years
7 - 15 Lacs
hyderabad
Work from Office
Hi, please find the job description below.

Key Responsibilities:
- Design and implement microservices-based backend solutions using AWS technologies.
- Develop and deploy serverless applications using AWS Lambda, API Gateway, DynamoDB, SQS, SNS, and other AWS services.
- Create and maintain RESTful APIs or GraphQL endpoints.
- Write clean, scalable, and well-documented code.
- Optimize application performance and ensure high availability and reliability.
- Implement security and data protection best practices.
- Collaborate with frontend developers, DevOps, and QA to deliver integrated solutions.
- Participate in code reviews, sprint planning, and architecture discussions.
- Monitor and troubleshoot production systems, ensuring incident resolution and uptime.

Required Skills & Qualifications:
- 3+ years of backend development experience.
- 2+ years of working with AWS services, particularly in a serverless environment.
- Proficiency in Node.js, Python, or Java for backend development.
- Solid understanding of microservices design principles and API development.
- Hands-on experience working with SQL and NoSQL databases.
- Experience with AWS Lambda, API Gateway, DynamoDB, S3, SQS, SNS, and CloudWatch.
- Familiarity with CI/CD pipelines and Infrastructure as Code (IaC) using CDK, SAM, or Terraform.
- Knowledge of IoT data and related technologies is a plus.
- Experience with Docker and container orchestration (e.g., ECS, EKS) is a plus.
- Strong understanding of REST, JSON, and asynchronous messaging patterns.
- Knowledge of security best practices in cloud-based environments.
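A minimal sketch of the serverless pattern described above: a Python AWS Lambda entry point for an API Gateway proxy integration. The event fields follow the standard proxy format; the `/health` and `/echo` routes are invented for illustration.

```python
import json

def lambda_handler(event, context):
    """Return an API Gateway proxy-style response for two sample routes."""
    method = event.get("httpMethod")
    path = event.get("path")
    if method == "GET" and path == "/health":
        body = {"status": "ok"}
    elif method == "POST" and path == "/echo":
        body = json.loads(event.get("body") or "{}")  # echo the JSON payload back
    else:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(body)}

# Local invocation with a hand-built event; in AWS, API Gateway supplies this.
resp = lambda_handler({"httpMethod": "GET", "path": "/health"}, None)
print(resp)
```

Because the handler is a plain function taking a dict, it can be unit-tested without any AWS infrastructure, which is one of the main reasons teams keep routing logic this thin.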
3.0 - 6.0 years
4 - 8 Lacs
hyderabad, bengaluru
Work from Office
Your Profile
- You would be working on Amazon Lex and Amazon Connect.
- AWS services such as Lambda, CloudWatch, DynamoDB, S3, IAM, and API Gateway; Node.js or Python.
- Contact center workflows and customer experience design.
- NLP and conversational design best practices.
- CI/CD processes and tools in the AWS ecosystem.

Your Role
- Experience designing, developing, and maintaining voice bots and chatbots using Amazon Lex.
- Experience integrating Lex bots with Amazon Connect for a seamless customer experience.
- Experience with Amazon Connect contact flows, queues, routing profiles, and Lambda integrations.
- Experience developing and deploying AWS Lambda functions for backend logic and Lex fulfillment.

What you'll love about working here
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
- You will get comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
Location: Chennai (ex Madras), Bangalore, Hyderabad, Mumbai, Pune
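A sketch of the Lex fulfillment Lambda work described above. The response dict follows the Lex V2 Lambda format as I understand it (a `sessionState` with a `Close` dialog action plus a `messages` list); the `CheckOrderStatus` intent and message text are hypothetical.

```python
def close(intent_name, message):
    """Build a Lex V2-style 'Close' response ending the dialog with a fulfilled intent."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent_name, "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }

def lambda_handler(event, context):
    """Route the recognized intent to backend logic and return the bot's reply."""
    intent = event["sessionState"]["intent"]["name"]
    if intent == "CheckOrderStatus":  # hypothetical intent name
        return close(intent, "Your order is out for delivery.")
    return close(intent, "Sorry, I can't help with that yet.")

# Simulated Lex event; in production Lex supplies the full event with slots etc.
event = {"sessionState": {"intent": {"name": "CheckOrderStatus"}}}
print(lambda_handler(event, None)["messages"][0]["content"])
```

The same handler, attached to an Amazon Connect contact flow through the bot, is what turns a recognized caller utterance into a backend lookup and a spoken reply.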
2.0 - 5.0 years
4 - 8 Lacs
bengaluru
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your Role
- Should have developed or worked on at least one Gen AI project.
- Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing, and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, and S3.
- Good knowledge of cloud compute services and load balancing.
- Good knowledge of cloud identity management, authentication, and authorization.
- Proficiency with cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions.
- Experience using cloud data integration services for structured, semi-structured, and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.

Your Profile
- Good knowledge of infrastructure capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infrastructure investment versus performance and scaling.
- Able to contribute to architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on the cloud.
- Must understand networking, security, design principles, and best practices in the cloud.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies, such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
3.0 - 6.0 years
4 - 8 Lacs
hyderabad, bengaluru
Work from Office
Your Profile You would be working on Amazon Lex and Amazon Connect. AWS services such as Lambda, CloudWatch, DynamoDB, S3, IAM, and API Gateway, Node.js or Python. Contact center workflows and customer experience design. NLP and conversational design best practices. CI/CD processes and tools in the AWS ecosystem. Your Role Experience in design, develop, and maintain voice and chatbots using Amazon Lex. Experience in Lex bots with Amazon Connect for seamless customer experience. Experience in Amazon Connect contact flows, queues, routing profiles, and Lambda integrations. Experience in develop and deploy AWS Lambda functions for backend logic and Lex fulfillment. What youll love about working here You can shape yourcareer with us. We offer a range of career paths and internal opportunities within Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. You will have theopportunity to learn on one of the industry"s largest digital learning platforms, with access to 250,000+ courses and numerous certifications. About Capgemini Location - Chennai (ex Madras), Bangalore, Hyderabad, Mumbai, Pune
Posted Date not available
2.0 - 5.0 years
4 - 8 Lacs
bengaluru
Work from Office
Capgemini Invent Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build whats next for their businesses. Your Role Should have developed/Worked for atleast 1 Gen AI project. Has data pipeline implementation experience with any of these cloud providers - AWS, Azure, GCP. Experience with cloud storage, cloud database, cloud data warehousing and Data lake solutions like Snowflake, Big query, AWS Redshift, ADLS, S3. Has good knowledge of cloud compute services and load balancing. Has good knowledge of cloud identity management, authentication and authorization. Proficiency in using cloud utility functions such as AWS lambda, AWS step functions, Cloud Run, Cloud functions, Azure functions. Experience in using cloud data integration services for structured, semi structured and unstructured data such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, Dataproc. Your Profile Good knowledge of Infra capacity sizing, costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs performance and scaling. Able to contribute to making architectural choices using various cloud services and solution methodologies. Expertise in programming using python. Very good knowledge of cloud Dev-ops practices such as infrastructure as code, CI/CD components, and automated deployments on cloud. Must understand networking, security, design principles and best practices in cloud. What you will love about working here We recognize the significance of flexible work arrangements to provide support. Be it remote work, or flexible work hours, you will get an environment to maintain healthy work life balance. At the heart of our mission is your career growth. 
Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted Date not available
5.0 - 10.0 years
15 - 25 Lacs
kolkata
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant-Data Engineer, AWS + Python, Spark, Kafka for ETL! Responsibilities Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as AWS. Build data pipelines by building ETL processes (Extract-Transform-Load). Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. 
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems. Understand the current application infrastructure and suggest cloud-based solutions which reduce operational cost and require minimal maintenance while providing high availability with improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way. Coordinate with release management and other supporting teams to deploy changes in the production environment. Qualifications we seek in you! Minimum Qualifications Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, Redshift. Experience with Databricks is an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. 
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/Skills Master’s degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering & Cloud certifications, Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
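The Extract-Transform-Load responsibilities above can be sketched in miniature. This is plain illustrative Python, not the role's actual stack (which would use Spark/Glue writing to Redshift or S3), and the field names and in-memory "warehouse" are hypothetical:

```python
# Minimal ETL sketch: extract raw rows, transform (validate/normalise),
# and load into a target store. All names are illustrative.

def extract(raw_rows):
    """Parse raw CSV-like strings into dicts (extract)."""
    for row in raw_rows:
        order_id, amount, country = row.split(",")
        yield {"order_id": order_id, "amount": float(amount), "country": country}

def transform(records):
    """Drop invalid rows and normalise fields (transform)."""
    for r in records:
        if r["amount"] <= 0:
            continue  # reject bad data rather than loading it
        r["country"] = r["country"].strip().upper()
        yield r

def load(records, warehouse):
    """Append cleaned records to the target store (load)."""
    for r in records:
        warehouse.append(r)
    return warehouse

warehouse = load(transform(extract(["1,19.99, us", "2,-5.00,de", "3,42.00,in "])), [])
```

The same three-stage shape carries over when the stages become a Glue job, a Spark transformation, and a Redshift `COPY`.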
Posted Date not available
5.0 - 8.0 years
7 - 11 Lacs
noida
Work from Office
Tech Stack Java + Spring Boot AWS (ECS, Lambda, EKS) Drools (preferred but optional) Apigee API observability, traceability, and security Skills Required: Strong ability to understand existing codebases and reengineer them, with domain knowledge to some extent. Capability to analyze and integrate the new system's various interfaces with the existing APIs. Hands-on experience with Java, Spring, Spring Boot, AWS, and Apigee. Familiarity with Drools is an added advantage. Ability to write and maintain JIRA stories (10-15% of the time) and keep existing technical specifications updated. Should take end-to-end ownership of the project, create designs, guide the team, and work independently on iterative tasks. Should proactively identify and highlight risks during daily scrum calls and provide regular updates. Mandatory Competencies Java - Core Java Others - Microservices Java Others - Spring Boot Cloud - AWS Lambda Cloud - Apigee Beh - Communication and collaboration
Posted Date not available
4.0 - 7.0 years
12 - 22 Lacs
pune
Work from Office
Role Overview: We are seeking a skilled Data Engineer to design and implement scalable data pipelines using AWS Lambda that push structured and semi-structured data into a PostgreSQL data store. The role also requires experience in data modeling, Looker dashboard development, and strong SQL/database expertise to support reporting and analytics needs across the organization. Roles and Responsibilities Required Skills: 3–5 years of experience in Data Engineering or Backend Engineering roles Strong experience with AWS Lambda and serverless data architecture Proficient in Python or Node.js for writing Lambda functions Solid experience with PostgreSQL – schema design, optimization, and advanced SQL Proven expertise in data modeling for analytics and reporting Hands-on experience with Looker (LookML, dashboards, data exploration) Familiarity with AWS services like S3, CloudWatch, API Gateway, and IAM Excellent debugging, problem-solving, and communication skills Key Responsibilities: Design, develop, and deploy serverless data ingestion pipelines using AWS Lambda Write and optimize Lambda functions to clean, transform, and push data into PostgreSQL Develop and maintain scalable, efficient data models supporting analytical workloads Create LookML models and build dashboards in Looker to enable self-service analytics Maintain database integrity, indexing, and performance optimization Collaborate with product, engineering, and analytics teams to understand data needs Build robust error-handling, logging, and retry mechanisms for data pipelines Ensure data governance, quality, and security best practices are followed Good to Have: Experience integrating third-party APIs or webhooks into Lambda functions Familiarity with data warehousing concepts (e.g., Snowflake, Redshift, or BigQuery) Exposure to CI/CD for data pipelines using tools like GitLab or Jenkins Understanding of modern data stack tools (Fivetran, dbt, Airflow, etc.)
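A minimal sketch of the Lambda-to-PostgreSQL pattern this role describes. The table and column names are hypothetical, and the database connection is deliberately omitted so the transform and SQL-building logic stays self-contained; a real handler would execute the statement with psycopg2 or a similar driver:

```python
# Lambda-style handler: validate an incoming event and prepare a
# parameterised INSERT for PostgreSQL. Table/columns are illustrative.
import json

TABLE = "events"  # hypothetical target table
COLUMNS = ("source", "payload", "received_at")

def build_insert(record):
    """Return (sql, params) with %s placeholders; values are never
    string-formatted into the SQL itself (avoids injection)."""
    sql = f"INSERT INTO {TABLE} ({', '.join(COLUMNS)}) VALUES (%s, %s, %s)"
    params = (record["source"], json.dumps(record["payload"]), record["received_at"])
    return sql, params

def handler(event, context=None):
    """Lambda entry point: clean the event, then (in production) execute."""
    record = {
        "source": event.get("source", "unknown"),
        "payload": event.get("payload", {}),
        "received_at": event["received_at"],  # required field; fail loudly if absent
    }
    sql, params = build_insert(record)
    # In production: cursor.execute(sql, params); conn.commit()
    return {"statusCode": 200, "body": json.dumps({"rows": 1})}

resp = handler({"source": "webhook", "payload": {"id": 7},
                "received_at": "2024-01-01T00:00:00Z"})
```

Keeping the SQL-building pure (no I/O) is what makes the "robust error-handling, logging, and retry mechanisms" bullet practical to unit-test.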
Posted Date not available
5.0 - 8.0 years
10 - 20 Lacs
hyderabad
Hybrid
Job Description: We are looking for a skilled Software Developer with 5+ years of experience to join our dynamic engineering team. The ideal candidate will have strong expertise in React, TypeScript, Node.js, PostgreSQL, and AWS, with a proven track record in B2B SaaS or Fintech environments. Key Responsibilities: Design and build scalable web applications using React and TypeScript. Develop and manage robust backend APIs with Node.js. Optimize and maintain PostgreSQL databases. Deploy and monitor applications on AWS cloud infrastructure. Collaborate with cross-functional teams to deliver product features. Ensure high code quality, security, and performance standards. Troubleshoot production issues across frontend, backend, and infrastructure layers. Required Skills: React (Hooks, State Management, Component Architecture) TypeScript (Interfaces, Type Safety, Generics) Node.js (REST APIs, Performance Tuning) PostgreSQL (Schema Design, SQL Optimization) AWS (EC2, Lambda, RDS, S3, CloudFormation) Git (Version Control) Preferred Skills (Good to Have): Microservices or Serverless Architecture CI/CD Pipelines Docker/Kubernetes Third-party API Integration & Payment Gateways DevOps & Infrastructure-as-Code experience
Posted Date not available
5.0 - 10.0 years
7 - 12 Lacs
hyderabad
Work from Office
ABOUT THE TEAM The GenAI Product Engineering team accelerates AI innovation by transforming cutting-edge research into user-centric applications. As a Full Stack Developer, you will bridge the gap between AI models and end-users by building intuitive interfaces, internal tools, and rapid prototypes. Your work will enable stakeholders to interact with GenAI capabilities, validate ideas, and scale proofs-of-concept (POCs) into production-ready features. At ABC, we love entrepreneurs because we are entrepreneurs. We roll our sleeves up, we act fast, and we learn together. WHAT YOU'LL DO Develop internal tools, admin dashboards, and lightweight POCs using React.js and Node.js to showcase GenAI capabilities. Integrate backend APIs (REST/GraphQL) with frontend interfaces, ensuring seamless data flow for AI-driven features. Rapidly translate wireframes into production-ready code during agile sprints, prioritizing usability and performance. Collaborate with ML engineers to embed AI logic into UIs, enabling real-time interactions with models like GPT-4 and Claude. Optimize application performance using AWS/Azure cloud services, reducing latency by 40% in high-traffic scenarios. Implement automated testing pipelines (Jest, Cypress) to maintain code quality across 50+ internal tools. WHAT YOU'LL NEED 5 years of full-stack development experience with React.js, Node.js, and TypeScript. Proficiency in REST/GraphQL API design and integration with databases (PostgreSQL, MongoDB). Experience building admin panels and dashboards using modern frameworks (Next.js, Chakra UI). Familiarity with cloud platforms (AWS Lambda, Azure Functions) and CI/CD tools (Jenkins, GitHub Actions). Ability to thrive in fast-paced agile environments, delivering production-ready code within 2-week sprints. Portfolio demonstrating rapid prototyping of AI/ML-powered applications. AND IT'S NICE TO HAVE Exposure to GenAI toolchains (LangChain, OpenAI API) or vector databases (Pinecone, FAISS). 
Certifications in AWS Certified Developer or Azure Fundamentals. Experience with AI observability tools (Weights & Biases, MLflow).
Posted Date not available
5.0 - 7.0 years
15 - 30 Lacs
kochi, bengaluru, thiruvananthapuram
Work from Office
Job Title: AWS Serverless Developer Location: Kochi Experience: 5-9 years Work Mode: Hybrid Job Summary: We are seeking an experienced AWS Serverless Developer with strong proficiency in TypeScript and expertise in building scalable cloud-native applications using AWS services. The ideal candidate will be hands-on with AWS Lambda, API Gateway, and DynamoDB, and will have a deep understanding of serverless architecture and best practices. Key Responsibilities: Design, develop, and maintain serverless applications on AWS. Write clean, scalable, and efficient code in TypeScript. Build and manage APIs using API Gateway and Lambda functions. Design and manage data persistence using DynamoDB. Implement and maintain CI/CD pipelines and IaC (Infrastructure as Code) for AWS environments. Work closely with product managers, architects, and other developers to deliver high-quality solutions. Participate in code reviews and ensure adherence to best practices. Monitor application performance and troubleshoot production issues. Required Skills: Strong hands-on experience with AWS serverless architecture. Proficient in TypeScript. Solid experience with AWS Lambda, API Gateway, and DynamoDB. Familiar with AWS tools and services such as CloudWatch, IAM, and CloudFormation or CDK. Good understanding of RESTful API design and microservices architecture. Strong problem-solving and debugging skills. Nice to Have: Experience with DevOps practices in AWS environments. Familiarity with GraphQL. Prior experience in Agile/Scrum environments. Education: Bachelor's degree in Computer Science, Engineering, or related field. Required Skills: AWS Cloud, React.js, Terraform, Jira
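The API Gateway → Lambda proxy contract behind the "build and manage APIs" duties is language-agnostic JSON; here is a sketch in Python for brevity (the role itself asks for TypeScript). The in-memory table is a hypothetical stand-in for DynamoDB, and the path-parameter and response fields follow the standard proxy integration shape:

```python
# Lambda proxy integration sketch: read a path parameter from the
# API Gateway event, look up an item, return a proxy-format response.
import json

FAKE_TABLE = {"42": {"id": "42", "name": "widget"}}  # stand-in for DynamoDB

def handler(event, context=None):
    item_id = (event.get("pathParameters") or {}).get("id")
    item = FAKE_TABLE.get(item_id)
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item),  # proxy integration requires a string body
    }

ok = handler({"pathParameters": {"id": "42"}})
missing = handler({"pathParameters": {"id": "99"}})
```

The TypeScript version is structurally identical: same event keys in, same `statusCode`/`headers`/`body` object out.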
Posted Date not available
5.0 - 9.0 years
7 - 11 Lacs
hyderabad
Work from Office
Keyloop bridges the gap between dealers, manufacturers, technology suppliers and car buyers. We empower car dealers and manufacturers to fully embrace digital transformation. How? By creating innovative technology that makes selling cars better for our customers, and buying and owning cars better for theirs. We use cutting-edge technology to link our clients’ systems, departments and sites. We provide an open technology platform that’s shaping the industry for the future. We use data to help clients become more efficient, increase profitability and give more customers an amazing experience. Want to be part of it? The purpose of this role is to deliver technical and people leadership and direction for platform and application infrastructure that supports strategic initiatives. Advise on tactical and strategic issues relating to established technology and future adoption calling upon internal and external technical advances. Carry out complex design, development, testing, documentation, code review and analysis of various software applications and technical specifications, complying with established methodologies. Creatively enhance existing product lines with new capabilities and features. Focus on product enhancements and solutions for complex problems and perform a pivotal role in the integration of developed software utilizing user interfaces and data. Engineer sophisticated new solutions for large-scale go-to-market system offerings, encompassing Keyloop's entire product line. Be a subject matter expert in technical leadership, direction, and design, and mentor others whilst fostering collaboration and innovation throughout the Product & Engineering community. Key Responsibilities Develop trusting & collaborative relationships with your team members and highly empower them, provide clear direction, and lead to outstanding performance. Provide mentorship and facilitate professional development. Performance Management for team members. 
Leave Management. Guide Team Development: Lead and mentor the development team, fostering a collaborative and productive environment Maintain Quality Standards: Establish and enforce best practices and coding standards to ensure high-quality software development Technical Problem-Solving: Analyze and resolve technical issues, ensuring smooth operation and performance of applications Code Reviews: Conduct regular code reviews to ensure code quality and adherence to standards Documentation: Prepare and maintain documentation on the status, operation, and maintenance of software Collaboration: Work closely with other departments, such as QA and operations, to ensure seamless integration and deployment of applications Security Audits: Conduct security audits to identify and address potential vulnerabilities Continuous Improvement: Stay updated with the latest industry trends and technologies to continuously improve the team's skills and the project's quality Owning NFR requirements. Essential Skills and Qualifications Minimum 8 years of relevant work experience. .NET Core/.NET, ASP.NET Core, C# (8 or above), AWS, caching, experience in RDBMS such as SQL Server/MySQL/PostgreSQL, REST API, MVC/MVVM frameworks, Apigee (added advantage). Event-driven architecture, AWS Lambda experience, GraphQL (added advantage). REST principles and Swagger, IdentityServer (added advantage). Fluency with object-oriented design patterns. Experience with ReactJS and JavaScript with ES6 features. Commercial experience with React, including building reusable components, and expertise in using hooks and context. Experience with Redux desirable but not essential. Experience with a CSS-in-JS library such as styled-components or Emotion. Experience with modern CSS including flexbox, grid and responsive application design. Experience with accessibility (a11y) and internationalization (i18n). Exposure to create-react-app tooling such as eslint, jest, babel and webpack would be useful. 
Experience with Unit Testing (frontend and backend) as well as Integration Testing & Contract Testing (Cypress and PACT) Experience in building web services and strong knowledge of REST Strong knowledge of the DOM and CSS (HTML5 & CSS3) Experience with Git, AWS and CI/CD Good understanding of web security Good understanding and prior experience of the Agile process (Scrum or Kanban) Why join us? We’re on a journey to become market leaders in our space – and with that comes some incredible opportunities. Collaborate and learn from industry experts from all over the globe. Work with game-changing products and services. Get the training and support you need to try new things, adapt to quick changes and explore different paths. Join Keyloop and progress your career, your way. An inclusive environment to thrive We’re committed to fostering an inclusive work environment. One that respects all dimensions of diversity. We promote an inclusive culture within our business, and we celebrate different employees and lifestyles – not just on key days, but every day. Be rewarded for your efforts We believe people should be paid based on their performance, so our pay and benefits reflect this and are designed to attract the very best talent. We encourage everyone in our organisation to explore opportunities which enable them to grow their career through investment in their development, but equally by working in a culture which fosters support and unbridled collaboration. Keyloop doesn’t require academic qualifications for this position. We select based on experience and potential, not credentials. We are also an equal opportunity employer committed to building a diverse and inclusive workforce. We value diversity and encourage candidates of all backgrounds to apply.
Posted Date not available
5.0 - 10.0 years
7 - 12 Lacs
kolkata
Work from Office
We are seeking a Senior DevOps Engineer with at least 5 years of hands-on experience in building, managing, and optimizing scalable infrastructure and CI/CD pipelines. The ideal candidate will play a crucial role in automating deployment workflows, securing cloud environments and managing container orchestration platforms. You will leverage your expertise in AWS, Kubernetes, ArgoCD, and CI/CD to streamline our development processes, ensure the reliability and scalability of our systems, and drive the adoption of best practices across the team. Key Responsibilities: Design, implement, and maintain CI/CD pipelines using GitHub Actions and Bitbucket Pipelines. Develop and manage Infrastructure as Code (IaC) using Terraform for AWS-based infrastructure. Set up and administer SFTP servers on cloud-based VMs using chroot configurations and automate file transfers to S3-backed Glacier. Manage SNS for alerting and notification integration. Ensure cost optimization of AWS services through billing reviews and usage audits. Implement and maintain secure secrets management using AWS KMS, Parameter Store, and Secrets Manager. Configure, deploy, and maintain a wide range of AWS services, including but not limited to: Compute Services o Provision and manage compute resources using EC2, EKS, AWS Lambda, and EventBridge for compute-driven, serverless and event-driven architectures. Storage & Content Delivery o Manage data storage and archival solutions using S3 and Glacier, and content delivery through CloudFront. Networking & Connectivity o Design and manage secure network architectures with VPCs, Load Balancers, Security Groups, VPNs, and Route 53 for DNS routing and failover. Ensure proper functioning of network services like TCP/IP and reverse proxies (e.g., NGINX). Monitoring & Observability o Implement monitoring, logging, and tracing solutions using CloudWatch, Prometheus, Grafana, ArgoCD, and OpenTelemetry to ensure system health and performance visibility. 
Database Services o Deploy and manage relational databases via RDS for MySQL, PostgreSQL, Aurora, and healthcare-specific FHIR database configurations. Security & Compliance o Enforce security best practices using IAM (roles, policies), AWS WAF, Amazon Inspector, GuardDuty, Security Hub, and Trusted Advisor to monitor, detect, and mitigate risks. GitOps o Apply excellent knowledge of GitOps practices, ensuring all infrastructure and application configuration changes are tracked and versioned through Git commits. Architect and manage Kubernetes environments (EKS), implementing Helm charts, ingress controllers, autoscaling (HPA/VPA), and service meshes (Istio); troubleshoot advanced issues related to pods, services, DNS, and kubelets. Apply best practices in Git workflows (trunk-based, feature branching) in both monorepo and multi-repo environments. Maintain, troubleshoot, and optimize Linux-based systems (Ubuntu, CentOS, Amazon Linux). Support the engineering and compliance teams by addressing requirements for HIPAA, GDPR, ISO 27001, SOC 2, and ensuring infrastructure readiness. Perform rollback and hotfix procedures with minimal downtime. Collaborate with developers to define release and deployment processes. Manage and standardize build environments across dev, staging, and production. Manage release and deployment processes across dev, staging, and production. Work cross-functionally with development and QA teams. Lead incident postmortems and drive continuous improvement. Perform root cause analysis and implement corrective/preventive actions for system incidents. Set up automated backups/snapshots, disaster recovery plans, and incident response strategies. Ensure on-time patching. Mentor junior DevOps engineers. Required Qualifications: Bachelor's degree in Computer Science, Engineering, or equivalent practical experience. 5+ years of proven DevOps engineering experience in cloud-based environments. Advanced knowledge of AWS, Terraform, CI/CD tools, and Kubernetes (EKS). 
Strong scripting and automation mindset. Solid experience with Linux system administration and networking. Excellent communication and documentation skills. Ability to collaborate across teams and lead DevOps initiatives independently. Preferred Qualifications: Experience with infrastructure as code tools such as Terraform or CloudFormation. Experience with GitHub Actions is a plus. Certifications in AWS (e.g., AWS DevOps Engineer, AWS SysOps Administrator) or Kubernetes (CKA/CKAD). Experience working in regulated environments (e.g., healthcare or fintech). Exposure to container security tools and cloud compliance scanners. Perks and benefits: Health insurance Hybrid working mode Provident Fund Parental leave Yearly Bonus Gratuity
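The "scripting and automation mindset" asked for above often comes down to small, testable helpers. One sketch, under the assumption that a flaky deployment step should be retried with exponential backoff before falling back to rollback; the step function and delays are illustrative, and the sleep is injectable so the helper can be tested without waiting:

```python
# Retry-with-backoff helper for deployment automation (illustrative).
import time

def retry_with_backoff(step, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Call `step()` up to `attempts` times, doubling the delay after
    each failure. Returns the step's result, or re-raises the last
    error so the caller can trigger rollback."""
    for attempt in range(attempts):
        try:
            return step()
        except Exception:
            if attempt == attempts - 1:
                raise  # retries exhausted: let the caller roll back
            sleep(base_delay * (2 ** attempt))

# Hypothetical step that succeeds on the third try.
calls = {"n": 0}
def flaky_deploy():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "deployed"

result = retry_with_backoff(flaky_deploy, sleep=lambda s: None)
```

The same pattern, wired to real deploy commands, is what backs "rollback and hotfix procedures with minimal downtime".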