
8 Secrets Manager Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Haryana

On-site

We are seeking a highly skilled and experienced Senior Software Engineer (Full-stack) to join our team. As the Senior Software Engineer, you will collaborate with a team of developers to build backend applications using NestJS/Node.js and front-end applications using Next.js/React.js/Contentful.

You are a strong technical contributor who enjoys striking a thoughtful, pragmatic balance between moving fast and writing high-quality code. You bring an ownership mindset to the products and systems you build, adapting quickly and smoothly as priorities change and embracing the rapid iteration of an experimental, data-driven product development lifecycle. You hold informed opinions about technology and system design and find joy in collaborative problem-solving. You believe that direct yet compassionate peer feedback and strong relationships within product teams are essential to organizational success, and you are a team player with strong written and verbal communication skills.

Required experience:
- Passion for infrastructure as code, with experience provisioning and managing cloud infrastructure using the AWS CLI, AWS CloudFormation, the AWS Cloud Development Kit (CDK), and/or Terraform.
- Strong knowledge and hands-on experience with JavaScript, TypeScript, NestJS, and Node.js.
- 4+ years of experience in full-stack development using Node.js, Python, Express, and AWS, along with 3+ years of experience with GraphQL, Apollo Client, and MySQL.
- Experience integrating monitoring and logging using tools such as AppDynamics and Splunk.
- Hands-on experience building and debugging CI/CD pipelines on AWS, plus experience with AWS CloudWatch, Secrets Manager, S3, Docker, and both relational and non-relational databases, showing a solid understanding of web services and complex software systems.
- Background in writing automated unit tests, experience with E2E testing frameworks such as Cypress or Playwright, code repositories and version control practices, and agile development methodology.

Responsibilities:
- Collaborate with the team to develop and maintain scalable, high-performance web applications using Node.js, Next.js, and React.js.
- Design and develop applications optimized for both the front end and the backend, and participate in code reviews to ensure adherence to coding standards.
- Work with AWS-hosted apps to ensure applications are scalable and secure, while upholding our core values of Transparency, Integrity, and Equality.
- Understand business, stakeholder, and technical requirements; analyze existing solutions; and create simple, modular, extensible functional designs for the product/solution in adherence to the requirements.
- Develop highly innovative UI designs and web components through collaboration and dialogue with other experts in the field, and attend technical discussions and design/development meetings to provide technical input that improves code quality and processes.

Location: This position can be based in Gurgaon. Current Guardian colleagues: please apply through the internal Jobs Hub in Workday.
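The role above relies on AWS Secrets Manager for credential handling. Below is a minimal sketch of that retrieval step, written in Python with boto3 for consistency with the other examples on this page (the role itself is Node.js/NestJS); the secret name and field names are hypothetical placeholders, not part of the listing.

```python
# Minimal sketch: read a JSON database credential from AWS Secrets Manager.
# Assumes AWS credentials/region are resolved from the environment; the
# secret name below is a hypothetical placeholder, not from the listing.
import json
import boto3
from botocore.exceptions import ClientError

def get_db_credentials(secret_name: str = "prod/app/db-credentials") -> dict:
    client = boto3.client("secretsmanager")
    try:
        response = client.get_secret_value(SecretId=secret_name)
    except ClientError as err:
        # Surface misconfiguration (missing secret, denied IAM access) early.
        raise RuntimeError(f"Could not load secret {secret_name}") from err
    return json.loads(response["SecretString"])

if __name__ == "__main__":
    creds = get_db_credentials()
    print(sorted(creds.keys()))  # e.g. ['host', 'password', 'port', 'username']
```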

Posted 18 hours ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for a proactive and skilled Software Engineer with experience in full-stack development, cloud deployments, and automation testing. The ideal candidate will have hands-on experience with C#, .NET, AWS, and DevOps practices, and will contribute to scalable software solutions, API integrations, and production support in an agile environment.

Role: Developer - Full Stack & Cloud
Level: Sr. Executive / AM
Location: Whitefield, Bengaluru
Work Mode: Work from Office only (no hybrid)

Key Result Areas:
- Develop and deploy Docker containers on AWS using Azure DevOps for CI/CD pipelines.
- Automate regression testing using Selenium and C# to enhance test coverage and reduce manual effort.
- Support authentication migration from ADFS to Azure AD, ensuring seamless cloud integration.
- Design, develop, and maintain RESTful APIs with robust unit testing and integration.
- Provide production support and incident management using OpenTelemetry, UVP, and xMatters.
- Enhance admin and workflow systems, focusing on task and process management features.
- Collaborate with cross-functional agile teams to deliver high-quality software solutions.
- Adhere to coding standards and best practices throughout the development lifecycle.

Skills Required:
- Backend: C#, .NET Core, ASP.NET, Web API, ASP.NET MVC, Entity Framework
- Additional Backend: SQL Server, Java, Python, C, Django
- Frontend: JavaScript, HTML5, CSS3, jQuery
- Cloud & DevOps: AWS (S3, Lambda, API Gateway, SQS, Secrets Manager, Load Balancer, DynamoDB)
- DevOps Tools: Azure DevOps, Docker, Kubernetes
- General: RESTful APIs, Unit Testing, Integration Testing, Agile/Scrum methodologies, Web Development, API Development, Basic Machine Learning concepts

Qualification: Bachelor's or equivalent degree in Computer Science, Information Systems, or Engineering
Experience: Minimum 1 to 3 years of experience in full-stack development and cloud-based systems
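Several of the AWS services named above (Lambda, SQS, DynamoDB) are commonly wired together as an event-driven worker. A minimal sketch of that pattern follows, in Python rather than the listing's C#/.NET stack, purely for consistency with the other examples on this page; the table name and message fields are assumptions.

```python
# Minimal sketch of an AWS Lambda handler that drains an SQS batch into
# DynamoDB, one of the service combinations named in the listing above.
# Table name and message shape are hypothetical; the role itself is C#/.NET.
import json
import os
import boto3

TABLE_NAME = os.environ.get("TASK_TABLE", "tasks")  # hypothetical table name
dynamodb = boto3.resource("dynamodb")

def handler(event, context):
    table = dynamodb.Table(TABLE_NAME)
    records = event.get("Records", [])     # SQS event source mapping shape
    for record in records:
        body = json.loads(record["body"])
        table.put_item(Item={
            "task_id": body["id"],          # assumed message fields
            "status": body.get("status", "new"),
        })
    return {"processed": len(records)}
```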

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

Haryana

On-site

Nagarro, a Digital Product Engineering company with over 17,500 experts across 39 countries, is scaling in a big way and is looking for a Data Engineer with 10+ years of total experience to join our dynamic and non-hierarchical work culture.

**Requirements:**
- Strong working experience in Data Engineering and Big Data platforms.
- Hands-on experience with Python and PySpark.
- Expertise with AWS Glue, including Crawlers and the Data Catalog.
- Experience with Snowflake and a strong understanding of AWS services such as S3, Lambda, Athena, SNS, and Secrets Manager.
- Familiarity with Infrastructure-as-Code (IaC) tools like CloudFormation and Terraform is preferred.
- Strong experience with CI/CD pipelines, preferably using GitHub Actions, is a plus.
- Working knowledge of Agile methodologies, JIRA, and GitHub version control.
- Exposure to data quality frameworks, observability, and data governance tools and practices is advantageous.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.

**Responsibilities:**
- Writing and reviewing high-quality code to meet technical requirements.
- Understanding clients' business use cases and converting them into technical designs.
- Identifying and evaluating different solutions to meet clients' requirements.
- Defining guidelines and benchmarks for Non-Functional Requirements (NFRs) during project implementation.
- Developing design documents explaining the architecture, framework, and high-level design of applications.
- Reviewing architecture and design aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs.
- Designing overall solutions for defined functional and non-functional requirements and defining technologies, patterns, and frameworks.
- Relating technology integration scenarios and applying the learnings in projects.
- Resolving issues raised during code reviews through systematic root-cause analysis.
- Conducting Proofs of Concept (POCs) to ensure suggested designs and technologies meet the requirements.

**Qualifications:**
- Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

If you are passionate about Data Engineering, experienced in working with Big Data platforms, proficient in Python and PySpark, and have a strong understanding of AWS services and Infrastructure-as-Code tools, we invite you to join Nagarro and be part of our innovative team.
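For the Athena, Glue Data Catalog, and S3 stack mentioned above, here is a minimal sketch of starting an Athena query against a catalog table and polling for its result with boto3; the database name, table, and results bucket are hypothetical placeholders.

```python
# Minimal sketch: query a Glue Data Catalog table with Athena and poll for
# completion, touching several services named in the listing (Athena, Glue
# Catalog, S3). Database, table, and output bucket are hypothetical.
import time
import boto3

athena = boto3.client("athena")

def run_query(sql: str, database: str = "analytics_db",
              output: str = "s3://example-athena-results/") -> str:
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(2)  # simple poll; production code would back off / time out

if __name__ == "__main__":
    print(run_query("SELECT COUNT(*) FROM events"))
```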

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be responsible for working with the AWS CDK (TypeScript) and CloudFormation templates to manage AWS services such as Redshift, Glue, IAM roles, KMS keys, Secrets Manager, Airflow, SFTP, AWS Lambda, S3, and EventBridge. Your tasks will include:
- Executing grants, stored procedures, and queries, and using Redshift Spectrum to query S3.
- Defining execution roles and debugging jobs.
- Creating IAM roles with fine-grained access, and integrating and deploying services.
- Managing KMS keys and configuring Secrets Manager.
- Creating Airflow DAGs.
- Writing and debugging serverless AWS Lambda functions.
- Managing S3 object storage, including lifecycle configuration, resource-based policies, and encryption.
- Setting up event triggers using Lambda and EventBridge rules.

You should have knowledge of the AWS Redshift SQL workbench for executing grants, and a strong understanding of networking concepts, security, and cloud architecture. Experience with monitoring tools like CloudWatch and familiarity with containerization tools like Docker and Kubernetes would be beneficial. Strong problem-solving skills and the ability to thrive in a fast-paced environment are essential.

Virtusa is a company that values teamwork, quality of life, and professional and personal development. With a global team of 27,000 professionals, Virtusa is committed to supporting your growth by providing exciting projects, opportunities to work with cutting-edge technologies, and a collaborative team environment that encourages the exchange of ideas and excellence. At Virtusa, you will have the chance to work with great minds and unleash your full potential in a dynamic and innovative workplace.
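Since the listing describes provisioning KMS keys, Secrets Manager secrets, and S3 lifecycle configuration through the AWS CDK, here is a minimal CDK sketch of those three resources. It is written in Python for consistency with the other examples on this page, although the listing itself uses the TypeScript CDK; construct IDs and the 90-day lifecycle rule are assumptions.

```python
# Minimal AWS CDK (v2) sketch: a KMS key with rotation enabled, a Secrets
# Manager secret encrypted with that key, and an S3 bucket with a lifecycle
# rule. All construct IDs and the retention period are hypothetical.
from aws_cdk import App, Duration, Stack, aws_kms as kms, aws_s3 as s3
from aws_cdk import aws_secretsmanager as secretsmanager
from constructs import Construct

class DataPlatformStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        key = kms.Key(self, "DataKey", enable_key_rotation=True)
        secretsmanager.Secret(self, "RedshiftCreds", encryption_key=key)
        s3.Bucket(
            self, "LandingBucket",
            encryption=s3.BucketEncryption.KMS,
            encryption_key=key,
            lifecycle_rules=[s3.LifecycleRule(expiration=Duration.days(90))],
        )

app = App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()
```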

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You should have a minimum of 5 years of experience in DevOps, SRE, or Infrastructure Engineering. Your expertise should include a strong command of Azure Cloud and Infrastructure as Code using tools such as Terraform and CloudFormation. Proficiency in Docker and Kubernetes is essential, and you should be hands-on with CI/CD tools and scripting languages like Bash, Python, or Go. Solid knowledge of Linux, networking, and security best practices is required, along with experience with monitoring and logging tools such as ELK, Prometheus, and Grafana. Familiarity with GitOps, Helm charts, and automation is an advantage.

Key responsibilities:
- Design and manage CI/CD pipelines using tools like Jenkins, GitLab CI/CD, and GitHub Actions.
- Automate infrastructure provisioning with tools like Terraform, Ansible, and Pulumi.
- Monitor and optimize cloud environments; implement containerization and orchestration with Docker and Kubernetes (EKS/GKE/AKS).
- Maintain logging, monitoring, and alerting systems (ELK, Prometheus, Grafana, Datadog).
- Ensure system security, availability, and performance tuning.
- Manage secrets and credentials using tools like Vault and Secrets Manager.
- Troubleshoot infrastructure and deployment issues, and implement blue-green and canary deployments.
- Collaborate with developers to enhance system reliability and productivity.

Preferred skills include certification as an Azure DevOps Engineer, experience with multi-cloud environments, microservices, and event-driven systems, as well as exposure to AI/ML pipelines and data engineering workflows.
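One of the duties above is managing secrets and credentials with tools like Vault and Secrets Manager. A minimal sketch of an audit script that flags AWS Secrets Manager secrets that have never been rotated, or not changed recently, is shown below; the 90-day threshold is an assumed policy, not part of the listing.

```python
# Minimal sketch for the secrets-management duty above: list Secrets Manager
# secrets and flag any without a recent rotation or change. The 90-day
# threshold is an assumption, not from the listing.
from datetime import datetime, timedelta, timezone
import boto3

client = boto3.client("secretsmanager")
stale_after = timedelta(days=90)          # assumed rotation policy
now = datetime.now(timezone.utc)

paginator = client.get_paginator("list_secrets")
for page in paginator.paginate():
    for secret in page["SecretList"]:
        last = secret.get("LastRotatedDate") or secret.get("LastChangedDate")
        if last is None or now - last > stale_after:
            print(f"NEEDS ATTENTION: {secret['Name']}")
```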

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You should possess expert-level proficiency in Python and Python frameworks or Java, along with hands-on experience in AWS development, including PySpark, Lambdas, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR.

Your experience should cover key AWS services:
- Compute: PySpark, Lambda, ECS
- Storage: S3
- Databases: DynamoDB, Snowflake
- Networking: VPC, Route 53, CloudFront, API Gateway
- DevOps/CI-CD: CloudFormation, CDK
- Security: IAM, KMS, Secrets Manager
- Monitoring: CloudWatch, X-Ray, CloudTrail

You should also be proficient with NoSQL and relational databases such as Cassandra and PostgreSQL, and have strong hands-on knowledge of using Python for integrations between systems across different data formats. Your expertise should extend to deploying and maintaining applications in AWS, with hands-on experience in Kinesis streams and auto-scaling. Designing and implementing distributed systems and microservices, and applying best practices for scalability, high availability, and fault tolerance, are also key aspects of this role.

You should have strong problem-solving and debugging skills, with the ability to lead technical discussions and mentor junior engineers. Excellent written and verbal communication skills are essential. You should be comfortable working in agile teams with modern development practices, collaborating with business and other teams to understand requirements and deliver on project commitments, participating in requirements gathering, and designing solutions based on available frameworks and code. Experience with data engineering tools or ML platforms (e.g., Pandas, Airflow, SageMaker) is expected, and an AWS certification (AWS Certified Solutions Architect or Developer) would be advantageous.

This position is based in multiple locations in India, including Indore, Mumbai, Noida, Bangalore, and Chennai. To qualify, you should hold a Bachelor's degree or a foreign equivalent from an accredited institution; alternatively, three years of progressive experience in the specialty can be considered in lieu of each year of education. A minimum of 8+ years of Information Technology experience is required for this role.
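The role above calls for hands-on experience with Kinesis streams. A minimal sketch of publishing JSON events to a Kinesis data stream with boto3 follows; the stream name and event shape are hypothetical placeholders.

```python
# Minimal sketch of the Kinesis streaming integration mentioned above:
# publish JSON events to a Kinesis data stream. The stream name and event
# fields are hypothetical placeholders, not from the listing.
import json
import uuid
import boto3

kinesis = boto3.client("kinesis")

def publish_event(event: dict, stream_name: str = "orders-stream") -> str:
    response = kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event.get("order_id", str(uuid.uuid4())),
    )
    return response["SequenceNumber"]

if __name__ == "__main__":
    print(publish_event({"order_id": "o-123", "amount": 42}))
```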

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Title: AWS Developer

About the Company/Team
Oracle FSGIU's Finergy division is a specialized team dedicated to the Banking, Financial Services, and Insurance (BFSI) industry, offering deep domain expertise and innovative solutions. With a focus on accelerated implementation, Finergy helps financial institutions rapidly deploy multi-channel platforms, ensuring an exceptional customer experience. Our team provides end-to-end banking solutions, leveraging integrated analytics and dashboards for improved efficiency. Finergy's consulting services offer strategic guidance, aligning technology investments with business objectives.

Job Summary
We are on the lookout for a skilled AWS Developer with 4-6 years of experience to design and build cutting-edge applications on the Amazon Web Services (AWS) platform. The ideal candidate will have hands-on expertise in developing serverless and containerized applications, integrating various AWS services, and ensuring the performance, security, and scalability of cloud-native solutions.

Key Responsibilities
- Design and develop scalable applications using AWS Lambda, API Gateway, and other AWS services, focusing on serverless architecture.
- Build and manage RESTful APIs, integrating with Amazon DynamoDB, RDS, and S3 for data storage and management.
- Implement Infrastructure as Code (IaC) using CloudFormation or Terraform to provision and manage AWS resources.
- Set up and maintain CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy for efficient software delivery.
- Automate workflows and background processes using Step Functions, SQS, and SNS for enhanced application functionality.
- Utilize CloudWatch, X-Ray, and CloudTrail for logging, monitoring, and troubleshooting, ensuring application health.
- Implement security measures using IAM roles, KMS, and Secrets Manager to protect sensitive data.
- Collaborate closely with DevOps, testers, and product owners in an Agile environment to deliver high-quality solutions.

Qualifications & Skills
Mandatory:
- 4-6 years of software development experience, including at least 2 years in AWS development.
- Proficiency in Node.js, Python, or Java for backend development.
- In-depth knowledge of AWS services: Lambda, API Gateway, S3, DynamoDB, RDS, IAM, SNS/SQS.
- Hands-on experience with CI/CD pipelines and version control systems like Git, GitHub, or Bitbucket.
- Understanding of containerization (Docker) and familiarity with Amazon ECS or EKS.
- Scripting skills using Bash, Python, or the AWS CLI for automation.
- Awareness of cloud security best practices, cost optimization techniques, and performance tuning.

Good-to-Have:
- AWS certification: AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate.
- Experience with microservices, serverless computing, and event-driven architecture.
- Exposure to multi-cloud or hybrid cloud environments.
- Strong communication and collaboration skills, with a problem-solving mindset.

Self-Assessment Questions:
- Describe a serverless application you developed on AWS. What services did you use, and how did you ensure scalability and security?
- Explain your experience with CI/CD pipelines on AWS. How have you utilized CodePipeline, CodeBuild, and CodeDeploy to automate the deployment process?
- Share your approach to monitoring and troubleshooting AWS-based applications. What tools do you use, and how do you identify and resolve issues?
- Discuss a scenario where you implemented security measures using AWS IAM and other security services.

Career Level: IC2
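The responsibilities above revolve around serverless APIs built from Lambda, API Gateway, and DynamoDB. A minimal sketch of a Lambda handler behind API Gateway that looks up an item in DynamoDB is shown below; the table name, path parameter, and environment variable are hypothetical placeholders.

```python
# Minimal sketch of a serverless pattern from the listing above: a Lambda
# handler behind API Gateway that looks up an item in DynamoDB. Table name,
# path parameter, and environment variable are hypothetical.
import json
import os
import boto3

table = boto3.resource("dynamodb").Table(os.environ.get("ORDERS_TABLE", "orders"))

def handler(event, context):
    order_id = (event.get("pathParameters") or {}).get("order_id")
    if not order_id:
        return {"statusCode": 400,
                "body": json.dumps({"error": "order_id required"})}
    item = table.get_item(Key={"order_id": order_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    # default=str handles DynamoDB Decimal values during JSON serialization.
    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```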

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onward, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Lead Consultant - Cloud Engineer!

In this role, you will be responsible for designing, provisioning, and securing scalable cloud infrastructure to support AI/ML and Generative AI workloads. A key focus will be ensuring high availability, cost efficiency, and performance optimization of infrastructure through best practices in architecture and automation.

Responsibilities
- Design and implement secure VPC architecture, subnets, NAT gateways, and route tables.
- Build and maintain IaC modules for repeatable infrastructure provisioning.
- Build CI/CD pipelines that support secure, auto-scalable AI deployments using GitHub Actions, AWS CodePipeline, and Lambda triggers.
- Monitor and tune infrastructure health using AWS CloudWatch, GuardDuty, and custom alerting.
- Track and optimize cloud spend using AWS Cost Explorer, Trusted Advisor, and usage dashboards.
- Deploy and manage cloud-native services including SageMaker, Lambda, ECR, and API Gateway.
- Implement IAM policies, Secrets Manager, and KMS encryption for secure deployments.
- Enable logging and monitoring using CloudWatch, and configure alerts and dashboards.
- Set up and manage CloudTrail, GuardDuty, and AWS Config for audit and security compliance.
- Assist with cost optimization strategies, including usage analysis and budget alerting.
- Support multi-cloud or hybrid integration patterns (e.g., data exchange between AWS and Azure/GCP).
- Collaborate with MLOps and Data Science teams to translate ML/GenAI requirements into production-grade, resilient AWS environments.
- Maintain multi-cloud compatibility as needed (e.g., data egress readiness, common abstraction layers).
- Engage in the design, development, and maintenance of data pipelines for various AI use cases.
- Actively contribute to key deliverables as part of an agile development team.
- Collaborate with others to source, analyse, test, and deploy data processes.

Qualifications we seek in you!
Minimum Qualifications
- Several years of hands-on AWS infrastructure experience in production environments.
- Degree/qualification in Computer Science or a related field, or equivalent work experience.
- Proficiency in Terraform, the AWS CLI, and Python or Bash scripting.
- Strong knowledge of IAM, VPC, ECS/EKS, Lambda, and serverless computing.
- Experience supporting AI/ML or GenAI pipelines in AWS (especially for compute and networking).
- Hands-on experience with AI/ML/RAG/LLM workloads and model deployment infrastructure.
- Exposure to multi-cloud architecture basics (e.g., SSO, networking, blob exchange, shared VPC setups).
- AWS Certified DevOps Engineer or Solutions Architect - Associate/Professional.
- Experience in developing, testing, and deploying data pipelines using a public cloud.
- Clear and effective communication skills to interact with team members, stakeholders, and end users.

Preferred Qualifications/Skills
- Experience deploying infrastructure in both AWS and another major cloud provider (Azure or GCP).
- Familiarity with multi-cloud tools (e.g., HashiCorp Vault, Kubernetes with cross-cloud clusters).
- Strong understanding of DevSecOps best practices and compliance requirements.
- Exposure to RAG/LLM workloads and model deployment infrastructure.
- Knowledge of governance and compliance policies, standards, and procedures.

Why join Genpact?
- Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
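For the monitoring and alerting duties above, here is a minimal sketch of creating a CloudWatch alarm on a Lambda function's error metric and routing it to an SNS topic with boto3; the function name, topic ARN, naming convention, and threshold are assumptions, not from the listing.

```python
# Minimal sketch of the alerting duty above: create a CloudWatch alarm on a
# Lambda function's error count and notify an SNS topic. The function name,
# topic ARN, and thresholds are hypothetical placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="genai-inference-errors",          # assumed naming convention
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "genai-inference"}],
    Statistic="Sum",
    Period=300,                                  # 5-minute evaluation window
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
)
print("alarm created")
```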

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies