Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
3.0 - 8.0 years
6 - 11 Lacs
Bengaluru
Work from Office
About the team - Engineering at HashiCorp, an IBM Company (HashiCorp). On the HashiCorp engineering team, we build the Infrastructure Cloud, which allows enterprises to take a unified approach to Infrastructure and Security Lifecycle Management. Infrastructure Lifecycle Management (Build / Deploy / Manage): Terraform allows you to use infrastructure as code to provision and manage any infrastructure across your organization. Packer standardizes image workflows across cloud providers, allowing teams to build, govern and manage any image for any cloud. Waypoint makes infrastructure easily accessible at scale, enabling platform teams to deliver golden patterns and workflows with an internal developer platform. Nomad brings modern application scheduling to any type of software, allowing you to manage containers, binaries and VMs efficiently in the cloud, on-premises and across edge environments. Security Lifecycle Management (Protect / Inspect / Connect): Vault provides organizations with identity-based security to automatically authenticate and authorize access to secrets and other sensitive data. Boundary standardizes secure remote access across dynamic environments, allowing organizations to connect users and manage access with identity-based security controls. Consul standardizes service networking, allowing you to discover and securely connect any service across any runtime with identity-based service networking. We deliver the Infrastructure Cloud through an enterprise-grade unified SaaS platform, HCP, as well as to enterprises through self-managed/on-premises options. Across product engineering and platform engineering teams, we are looking for great engineers to come join us in developing the Infrastructure Cloud! What you'll do (responsibilities): We're looking for Mid-Level Engineers with a deep backend focus to join our team.
In this role, you can expect to: Design, prototype and implement features and tools while ensuring stability and usability. Collaborate closely with Product Design and Product Management partners, as well as engineers on your team and others. Follow through on assigned tasks to build and ship medium-sized features, managing task expectations as needed. Engage in team discussions around diagnosis, planning, and workflow improvements based on product requirements. Apply independent judgment within team practices to determine appropriate actions and solutions. Address unforeseen challenges, making recommendations to keep tasks on track. Debug and resolve medium-level bugs in products or solutions to maintain quality. Review technical contributions for quality and consistency, collaborating with stakeholders to resolve issues and recommend technical or architectural changes. Suggest improvements to current processes and propose solutions to enhance the efficiency of architectural components and design. Participate in on-call rotations, pairing, and team planning to support product needs. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: You have at least 3 years of experience as an engineer. You have professional experience developing with modern programming languages and frameworks, and are interested in working in Golang and Ruby specifically. You have experience working with distributed systems, particularly on a cloud provider such as AWS, Azure or GCP, with a focus on scalability, resilience and security.
Experience in reviewing and refactoring code and making suggestions that improve the codebase and product. Writing tests for more complex or edge cases. Demonstrated ability to build trust and foster relationships across teams and stakeholders, with a focus on valuing diverse perspectives and proficiently managing expectations. Cloud-native mindset and solid understanding of DevOps principles in a cloud environment. Emerging experience in mentoring team members, helping to enhance their problem-solving, critical thinking, and planning skills. Proven decision-making abilities with an intentional, data-driven approach to solving complex technical challenges and delivering results. Strong customer focus and systems-thinking mindset, with a commitment to personal accountability, self-awareness, and continuous improvement in support of high-quality outcomes. Preferred technical and professional experience: You have experience using HashiCorp products (Terraform, Packer, Waypoint, Nomad, Vault, Boundary, Consul). You have prior experience working in cloud platform engineering teams.
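Since the posting emphasizes distributed systems with a focus on scalability and resilience, here is a minimal, illustrative Python sketch of one staple pattern from that space: retrying a flaky remote call with exponential backoff and jitter. All names here (`fetch_config`, the failure counter) are hypothetical and exist only for the demonstration.

```python
import random
import time

def retry_with_backoff(attempts=4, base_delay=0.5, max_delay=8.0,
                       retry_on=(ConnectionError, TimeoutError),
                       sleep=time.sleep):
    """Retry a flaky call with exponential backoff plus jitter."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except retry_on:
                    if attempt == attempts - 1:
                        raise  # out of retries: surface the error
                    delay = min(max_delay, base_delay * (2 ** attempt))
                    sleep(delay + random.uniform(0, delay / 2))  # add jitter
        return wrapper
    return decorator

# Hypothetical flaky dependency: fails twice, then succeeds.
calls = {"n": 0}

@retry_with_backoff(attempts=5, sleep=lambda _: None)  # no real sleeping in the demo
def fetch_config():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream failure")
    return {"region": "ap-south-1"}
```

In Go, the team's stated language, the same shape falls out of a plain loop with `time.Sleep`; the pattern, not the language, is the point.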
Posted 4 days ago
2.0 - 5.0 years
6 - 11 Lacs
Bengaluru
Work from Office
About the team - Engineering at HashiCorp. On the HashiCorp engineering team, we build the Infrastructure Cloud, which allows enterprises to take a unified approach to Infrastructure and Security Lifecycle Management. Infrastructure Lifecycle Management (Build / Deploy / Manage): Terraform allows you to use infrastructure as code to provision and manage any infrastructure across your organization. Packer standardizes image workflows across cloud providers, allowing teams to build, govern and manage any image for any cloud. Waypoint makes infrastructure easily accessible at scale, enabling platform teams to deliver golden patterns and workflows with an internal developer platform. Nomad brings modern application scheduling to any type of software, allowing you to manage containers, binaries and VMs efficiently in the cloud, on-premises and across edge environments. Security Lifecycle Management (Protect / Inspect / Connect): Vault provides organizations with identity-based security to automatically authenticate and authorize access to secrets and other sensitive data. Boundary standardizes secure remote access across dynamic environments, allowing organizations to connect users and manage access with identity-based security controls. Consul standardizes service networking, allowing you to discover and securely connect any service across any runtime with identity-based service networking. We deliver the Infrastructure Cloud through an enterprise-grade unified SaaS platform, HCP, as well as to enterprises through self-managed/on-premises options. Across product engineering and platform engineering teams, we are looking for great engineers to come join us in developing the Infrastructure Cloud! What you'll do (responsibilities): We're looking for Senior Engineers with a deep backend focus to join our team.
In this role, you can expect to: Design, prototype and implement features and tools while ensuring stability and usability. Collaborate closely with Product Design and Product Management partners, as well as engineers on your team and others. Act as a subject matter expert on quality development with an emphasis on Golang development. Lead and execute large-scale projects, ensuring the reliable delivery of key features from design through full implementation and troubleshooting. Drive the end-to-end project lifecycle, including architecture design, implementation, and issue resolution, with a focus on quality and efficiency. Evaluate project tradeoffs and propose solutions, proactively removing blockers and keeping stakeholders informed on progress, issues, and milestones. Collaborate with internal teams, customers, and external stakeholders to design solutions that align with requirements and customer needs. Advocate for strategic technical roadmap initiatives that enhance the system's overall effectiveness across teams and the organization.
Debug and resolve complex issues to improve the quality and stability of products or solutions. Review and assess code for quality, design patterns, and optimization opportunities, ensuring best practices are followed. Mentor and guide software engineers, sharing technical knowledge and promoting best practices in development processes. Facilitate collaborative team activities, such as code pairing and group troubleshooting, to foster a productive and cohesive team environment. Support reliable production environments, including participating in an on-call rotation. Strive for quality through maintainable code and comprehensive testing from development to deployment. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Typically, you should have at least 3 years of experience as an engineer. You have professional experience developing with modern programming languages and frameworks, and are interested in working in Golang and Ruby specifically. You have experience working with distributed systems, particularly on a cloud provider such as AWS, Azure or GCP, with a focus on scalability, resilience and security. Emerging ability to direct work and influence others, with a strategic approach to problem-solving and decision-making in a collaborative environment. Demonstrated business acumen and customer focus, with a readiness for change and adaptability in dynamic situations. Cloud-native mindset and solid understanding of DevOps principles in a cloud environment. Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging and tracing for high reliability and performance.
Intentional focus on stakeholder management and effective communication, fostering trust and relationship-building across diverse teams. Integrated skills in critical thinking and data-driven analysis, promoting a growth mindset and continuous improvement to support high-quality outcomes. Preferred technical and professional experience: You have experience using HashiCorp products (Terraform, Packer, Waypoint, Nomad, Vault, Boundary, Consul). You have prior experience working in cloud platform engineering teams.
Posted 4 days ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Extensive experience in UNIX/Linux engineering, including RHEL, Rocky, Ubuntu and SuSE. Advanced understanding of the Linux kernel. Extensive experience using management tools like ip, dnf, nmcli, apt, Ubiquity, the Debian installer, and ufw. Extensive experience writing, reading and understanding YAML and JSON files. Extensive automation/scripting skills: Perl, Bash and Python or related languages. Experience with vCenter, VMware ovftool, and the HashiCorp Vault and Packer tools. Advanced GitLab experience: creating workflows and CI/CD pipelines, setting variables. Primary Skills: Red Hat Certified System Administrator (RHCSA), Red Hat Certified Engineer (RHCE).
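Given the posting's emphasis on reading and validating YAML and JSON files with Python, here is a small, stdlib-only sketch of a JSON host-config checker. The schema keys (`hostname`, `os`, `ip`) and the allowed OS values are invented for the example.

```python
import json

REQUIRED_KEYS = {"hostname", "os", "ip"}  # assumed schema, for illustration only

def validate_host_config(text: str) -> list[str]:
    """Return a list of problems found in a JSON host definition (empty if valid)."""
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - cfg.keys())]
    if cfg.get("os") not in {"rhel", "rocky", "ubuntu", "sles"}:
        problems.append("unsupported os")
    return problems
```

The same approach extends to YAML with `yaml.safe_load` from PyYAML in place of `json.loads`.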
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Understand product vision and business needs to define product requirements and product architectural solutions. Use tools and methodologies to create representations for the functions and user interface of the desired product. Develop high-level product specifications with attention to system integration and feasibility. Define all aspects of development, from appropriate technology and workflow to coding standards. Communicate all concepts and guidelines successfully to the development team. Ensure software meets all requirements of quality, security, modifiability, extensibility, etc. Collaborate with other professionals to determine functional and non-functional requirements for new software or applications. Provide support for production escalations and problem resolution for customers. Analyse requirements; design, develop and maintain software products in alignment with the technology strategy of the organization. Participate in technical reviews of requirements, specifications, designs, code and other artifacts. Ensure commitments are agreed, reviewed and met. Learn new skills and adopt new practices readily in order to develop innovative and cutting-edge software products that maintain the Company's technical leadership position. Plan, develop and manage the infrastructure to enable strategic and effective use of tools. Lead the evaluation/evolution of tools, technologies and programs with input from internal teams and external developers. Proactively identify issues and improvement opportunities. Direct resources to diagnose and resolve complex system, application software, security and related problems that impact system availability. Required education: Bachelor's Degree. Required technical and professional expertise: 5 to 10 years of experience in DevOps, security engineering, or related fields. Strong understanding of DevOps practices and CI/CD tools (e.g., Jenkins, GitLab CI, GitHub Actions).
Hands-on experience with security tools (e.g., SonarQube, Checkmarx, Aqua, Twistlock, Snyk). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Solid scripting and automation skills (Python, Bash, Terraform, etc.). Experience with cloud platforms (AWS, Azure, GCP) and their security services. Knowledge of secure SDLC and threat modeling techniques. Preferred technical and professional experience Security certifications such as CISSP, CEH, OSCP, or AWS Security Specialty. Experience with policy-as-code tools like OPA/Gatekeeper or HashiCorp Sentinel. Understanding of modern application architectures (microservices, serverless). Familiarity with regulatory compliance frameworks.
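To make the DevSecOps angle concrete, here is a hedged sketch of a CI security gate: a pure-Python function that fails the pipeline when a scanner report contains findings at or above a severity threshold. The JSON shape is simplified and invented, not any specific tool's output format; real tools like Snyk or Trivy each have their own report schema.

```python
import json

SEVERITY_ORDER = {"LOW": 0, "MEDIUM": 1, "HIGH": 2, "CRITICAL": 3}

def gate(report_json: str, fail_at: str = "HIGH"):
    """Return (passed, offending_ids) for a simplified scanner report."""
    findings = json.loads(report_json).get("findings", [])
    threshold = SEVERITY_ORDER[fail_at]
    offending = [f["id"] for f in findings
                 if SEVERITY_ORDER.get(f.get("severity", "LOW"), 0) >= threshold]
    return (not offending, offending)

# Invented sample report for demonstration.
sample_report = json.dumps({"findings": [
    {"id": "CVE-2024-0001", "severity": "CRITICAL"},
    {"id": "CVE-2024-0002", "severity": "LOW"},
]})
```

In a pipeline, such a script would exit non-zero on failure so the CI stage (Jenkins, GitLab CI, GitHub Actions) blocks the merge.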
Posted 1 week ago
3.0 - 5.0 years
10 - 14 Lacs
Pune
Work from Office
Job Title: GCP Data Engineer, AS. Location: Pune, India. Corporate Title: Associate. Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching of less experienced engineers. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 yrs. and above. Your key responsibilities: Design, develop and maintain data pipelines using Python and SQL on GCP. Experience in Agile methodologies, ETL, ELT, data movement and data processing skills. Work with Cloud Composer to manage and process batch data jobs efficiently. Develop and optimize complex SQL queries for data analysis, extraction, and transformation. Develop and deploy Google Cloud services using Terraform. Implement CI/CD pipelines using GitHub Actions. Consume and host REST APIs using Python. Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
Ensure team collaboration using Jira, Confluence, and other tools. Ability to quickly learn new and existing technologies. Strong problem-solving skills. Write advanced SQL and Python scripts. Certification as a Professional Google Cloud Data Engineer will be an added advantage. Your skills and experience: 6+ years of IT experience as a hands-on technologist. Proficient in Python for data engineering. Proficient in SQL. Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions and Cloud Run; GKE is good to have. Hands-on experience in REST API hosting and consumption. Proficient in HashiCorp Terraform. Experienced in GitHub and GitHub Actions. Experienced in CI/CD. Experience in automating ETL testing using Python and SQL. Good to have: API knowledge. Good to have: Bitbucket. How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
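The role calls for automating ETL testing with Python. A minimal sketch of the idea: keep transforms as pure functions so they can be asserted against directly, without a live pipeline. The field names (`id`, `amount`) are invented for the example.

```python
def normalize_rows(rows):
    """Clean raw extract rows: trim the id, drop rows with no id, cast amount to float."""
    out = []
    for row in rows:
        if not row.get("id"):
            continue  # reject rows that cannot be keyed
        out.append({
            "id": row["id"].strip(),
            "amount": float(row.get("amount", 0) or 0),  # missing/empty amount -> 0.0
        })
    return out
```

In practice such functions are exercised by pytest in CI, with the same assertions also run as SQL row-count and checksum checks against the warehouse.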
Posted 1 week ago
8.0 - 13.0 years
13 - 17 Lacs
Gurugram
Work from Office
Project Role: Security Architect. Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations. Must have skills: CyberArk Privileged Access Management. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: We are looking for an experienced CyberArk PAM Specialist to design, implement, and support a CyberArk Privileged Access Management (PAM) solution. Roles and Responsibilities: 1. Define, design, and implement CyberArk Privilege Cloud (SaaS). 2. Install and configure cloud connectors. 3. Configure MFA, SAML, LDAP, and SIEM integration. 4. Troubleshoot and resolve CyberArk-related technical issues. 5. Work closely with application teams to onboard different types of systems to CyberArk. 6. Generate custom CPM/PSM plugins if required. 7. Support application onboarding, including access policies, group assignments, and role management. 8. Communicate effectively with business teams, external clients, and solution providers. 9. Document technical designs, solutions, and implementation plans. 10. Work independently and take ownership of technical deliverables. Professional & Technical Skills: 1. Strong experience in CyberArk P-Cloud, Conjur Secrets Management, and CyberArk PAM (Vault, CPM, PVWA, AAM). 2. Solid understanding of security standards and protocols including SSO, MFA, SAML, OAuth, OIDC, LDAP, RADIUS, and Kerberos. 3. Proficient in CyberArk and related technologies; experience in system administration, scripting (UNIX/Linux), REST APIs, LDAP directories, and Active Directory. 4. Experience in providing guidance in CyberArk strategy; must have PAM deep-dive experience. 5. Strong understanding of PAM architecture, deployment methodologies, and best practices. 6. Effective at presenting information to different audiences at the correct level of detail (e.g., from engineering teams to executive management). 7. A product and domain expert in the PAM domain, experienced in conducting environment assessments and health checks in line with best practices. 8. Strong troubleshooting and problem-solving skills. 9. Experience in EPM is desirable but not mandatory. 10. Excellent verbal and written communication skills. 11. Ability to work independently on technical tasks and client engagements. 12. Candidate must be an independent self-starter able to perform all deployment activities with oversight and as a member of a project team. 13. Candidate must have Sentry certification; CyberArk CDE is nice to have. 14. Good to have skills: Thycotic (Delinea), BeyondTrust, HashiCorp Vault. Additional Information: 1. 7+ years of experience related to designing, deploying, and configuring PAM solutions, or 6+ years of direct PAM consulting experience. 2. Candidate must have completed 16 years of full-time education. 3. This position is open to all Accenture locations. Qualification: 15 years full time education.
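Much of the CyberArk work described above happens over PVWA's REST API, starting with a logon call that returns a session token. As a hedged illustration, this helper only builds the request envelope; the endpoint path follows CyberArk's documented API, but the version segment and payload fields should be verified against your PVWA release before relying on them.

```python
def pvwa_logon_request(base_url: str, username: str) -> dict:
    """Build the request envelope for PVWA's CyberArk-authentication logon endpoint.

    Only constructs the request description; sending it (and supplying the
    password securely) is left to the caller's HTTP client of choice.
    """
    return {
        "method": "POST",
        "url": f"{base_url.rstrip('/')}/PasswordVault/API/Auth/CyberArk/Logon",
        "json": {"username": username, "concurrentSession": True},
    }
```

The returned token is then passed in the `Authorization` header on subsequent calls such as account onboarding, which is how the scripted integrations listed above are typically driven.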
Posted 2 weeks ago
9.0 - 14.0 years
13 - 17 Lacs
Gurugram
Work from Office
Project Role: Security Architect. Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations. Must have skills: CyberArk Privileged Access Management. Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: We are looking for an experienced CyberArk PAM Specialist to design, implement, and support a CyberArk Privileged Access Management (PAM) solution. Roles and Responsibilities: 1. Define, design, and implement CyberArk Privilege Cloud (SaaS). 2. Install and configure cloud connectors. 3. Configure MFA, SAML, LDAP, and SIEM integration. 4. Troubleshoot and resolve CyberArk-related technical issues. 5. Work closely with application teams to onboard different types of systems to CyberArk. 6. Generate custom CPM/PSM plugins if required. 7. Support application onboarding, including access policies, group assignments, and role management. 8. Communicate effectively with business teams, external clients, and solution providers. 9. Document technical designs, solutions, and implementation plans. 10. Work independently and take ownership of technical deliverables. Professional & Technical Skills: 1. Strong experience in CyberArk P-Cloud, Conjur Secrets Management, and CyberArk PAM (Vault, CPM, PVWA, AAM). 2. Solid understanding of security standards and protocols including SSO, MFA, SAML, OAuth, OIDC, LDAP, RADIUS, and Kerberos. 3. Proficient in CyberArk and related technologies; experience in system administration, scripting (UNIX/Linux), REST APIs, LDAP directories, and Active Directory. 4. Experience in providing guidance in CyberArk strategy; must have PAM deep-dive experience. 5. Strong understanding of PAM architecture, deployment methodologies, and best practices. 6. Effective at presenting information to different audiences at the correct level of detail (e.g., from engineering teams to executive management). 7. A product and domain expert in the PAM domain, experienced in conducting environment assessments and health checks in line with best practices. 8. Strong troubleshooting and problem-solving skills. 9. Experience in EPM is desirable but not mandatory. 10. Excellent verbal and written communication skills. 11. Ability to work independently on technical tasks and client engagements. 12. Candidate must be an independent self-starter able to perform all deployment activities with oversight and as a member of a project team. 13. Candidate must have Sentry certification; CyberArk CDE is nice to have. 14. Good to have skills: Thycotic (Delinea), BeyondTrust, HashiCorp Vault. Additional Information: 1. 9+ years of experience related to designing, deploying, and configuring PAM solutions, or 6+ years of direct PAM consulting experience. 2. Candidate must have completed 16 years of full-time education. 3. This position is open to all Accenture locations. Qualification: 15 years full time education.
Posted 2 weeks ago
7.0 - 12.0 years
13 - 17 Lacs
Gurugram
Work from Office
Project Role: Security Architect. Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations. Must have skills: CyberArk Privileged Access Management. Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: We are looking for an experienced HashiCorp Vault Specialist to design and implement HashiCorp Vault. As a Security Lead, you will be responsible for defining security frameworks and architectures, aligning security controls with business requirements, and supporting the transition to HashiCorp Vault. Roles and Responsibilities: 1. Define, design, and implement HashiCorp Vault's infrastructure architecture: scaling, reliability, DR, Raft storage, Consul storage, etc. 2. Implement HashiCorp Vault's logical architecture: policies, secret engines, namespaces, auth methods, governance, etc. 3. Comfortable with basic Linux commands. 4. Troubleshoot and resolve complex Vault-related technical issues. 5. Experience in scripting/coding. 6. Familiarity with basic DevOps tools like GitLab, Artifactory, Splunk, etc. 7. Experience and knowledge of HashiCorp Terraform and its use to deploy HashiCorp Vault resources. 8. Understanding of how licensing and client counts work in Vault. 9. Communicate effectively with business teams, external clients, and solution providers. 10. Document technical designs, solutions, and implementation plans. 11. Work independently and take ownership of technical deliverables. Professional & Technical Skills: 1. Strong experience working with the HashiCorp Vault secrets management solution. 2. HashiCorp Vault's infrastructure architecture: scaling, reliability, DR, Raft storage, Consul storage, etc. 3. HashiCorp Vault's logical architecture: policies, secret engines, namespaces, auth methods, governance, etc. 4. Good understanding of the different types of secret engines and auth methods. 5. Hands-on knowledge of using the Vault API, CLI and UI. 6. Audit and monitoring for HashiCorp Vault and its integration with tools like Splunk. 7. Integrations with different tools/platforms such as HSM, GitLab, microservices, Kubernetes, databases, cloud, Vault Agent, etc. 8. Strong troubleshooting and problem-solving skills. 9. Experience working with the Enterprise version of HashiCorp Vault. 10. Excellent verbal and written communication skills. 11. Ability to work independently on technical tasks and client engagements. Additional Information: 1. Minimum of 7 years of relevant experience with HashiCorp products, with a focus on secrets management. 2. Candidate must have completed 16 years of full-time education. 3. This position is open to all Accenture locations. Qualification: 15 years full time education.
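One detail the Vault API/CLI skills above hide: KV version 2 read responses nest the secret under `data.data`, with the outer `data` also carrying version metadata, which trips up many first integrations. A small sketch; the sample payload mirrors the documented response shape of `GET /v1/secret/data/<path>` on a KV v2 mount, and the secret values are invented.

```python
def kv2_secret(payload: dict) -> dict:
    """Extract the secret key/value pairs from a KV v2 read response.

    KV v2 wraps the secret in data.data; the outer data dict also holds
    version metadata, so indexing only one level deep returns the wrong thing.
    """
    return payload["data"]["data"]

# Shape returned by a KV v2 read (values invented for illustration).
sample = {
    "data": {
        "data": {"db_password": "s3cr3t"},
        "metadata": {"version": 4},
    }
}
```

With the `hvac` Python client, the same payload comes back from `client.secrets.kv.v2.read_secret_version(path=...)`, and the same double-`data` indexing applies.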
Posted 2 weeks ago
5 - 10 years
25 - 35 Lacs
Hyderabad
Work from Office
As a Database Architect specializing in NoSQL and MongoDB, you will play a pivotal role in designing and optimizing data architectures that power our systems and applications. Your focus will be on creating scalable, resilient, and high-performance database solutions to handle large volumes of data while ensuring data integrity and security. You will work closely with cross-functional teams to architect and build innovative solutions that solve complex operational problems and drive business value. In this role, you will leverage your deep expertise in NoSQL databases, particularly MongoDB, to develop and optimize data infrastructure, ensuring that our systems run efficiently and effectively. Much of your work will center around building and maintaining databases that can scale to meet growing demands, optimizing performance, and automating processes to reduce operational overhead. As part of a team of highly skilled problem-solvers, you will have the opportunity to take risks, experiment with new approaches, and implement cutting-edge technologies. Your contributions will directly impact the success of our production applications and systems, allowing us to deliver high-quality, scalable solutions in a fast-paced environment. Responsibilities: Assist and work with Cloud Solution and Infra Architects. Database administration, design, and deployment of MongoDB (Atlas) clusters and databases. Design and implement sharding and indexing strategies for MongoDB (Atlas). Advise on MongoDB (Atlas) HA strategies, including replica sets and sharding. Maintain MongoDB (Atlas) databases/DB projects. Design, implement and manage the security of MongoDB (Atlas) databases. Responsible for Mongo backups and restores. Implement and maintain MongoDB Ops Manager. Solve difficult technical challenges. Administer MongoDB to achieve 100% availability. Collaborate with other teams to solve technical issues. Maintain detailed documentation of database design/architecture and setup.
Experience and Skills: 3+ years of experience working in MongoDB database administration. Very good experience in the Linux environment in a database administrator role. 1+ years of shell scripting. Hands-on experience with solving MongoDB performance issues. Hands-on experience with building and maintaining MongoDB replica sets. Hands-on experience with building and maintaining a MongoDB sharded environment. Proactively work on monitoring, identifying, and fixing database-related issues. Ability to learn quickly and adapt to handle ambiguous situations. Ability to work under pressure and to deadlines. Ability to work in a collaborative, team-oriented environment. Team player with good interpersonal and communication skills. Experience in automating database administration tasks. Experience in Privileged Access Management and authentication mechanisms like LDAP, Kerberos, HashiCorp Vault, and Active Directory. Qualifications: Bachelor's degree in any stream. Benefits: Best in industry salary. Health insurance. Flexible working hours.
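To illustrate the sharding work described above: MongoDB's `shardCollection` admin command takes a namespace and a shard-key document, where `"hashed"` spreads monotonically increasing keys (like an ObjectId) evenly across shards, while `1` preserves range locality for targeted queries. A hedged Python sketch of a builder for that command; the database, collection, and field names are hypothetical.

```python
def shard_collection_cmd(db: str, coll: str, key: dict) -> dict:
    """Build the admin command document for sharding a collection.

    Validates that each shard-key field uses a supported index type
    (1 for ranged, "hashed" for hashed sharding).
    """
    allowed = {1, "hashed"}
    for field, kind in key.items():
        if kind not in allowed:
            raise ValueError(f"unsupported shard key type for {field!r}: {kind!r}")
    return {"shardCollection": f"{db}.{coll}", "key": dict(key)}
```

With `pymongo`, the resulting document would be passed to `client.admin.command(...)` against a `mongos` router (sketch only; enabling sharding on the database must come first).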
Posted 1 month ago
3 - 7 years
10 - 14 Lacs
Pune
Work from Office
About The Role: Job Title: GCP Data Engineer, AS. Location: Pune, India. Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching of less experienced engineers. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 yrs. and above. Your key responsibilities: Design, develop and maintain data pipelines using Python and SQL on GCP. Experience in Agile methodologies, ETL, ELT, data movement and data processing skills. Work with Cloud Composer to manage and process batch data jobs efficiently. Develop and optimize complex SQL queries for data analysis, extraction, and transformation. Develop and deploy Google Cloud services using Terraform. Implement CI/CD pipelines using GitHub Actions. Consume and host REST APIs using Python. Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
Ensure team collaboration using Jira, Confluence, and other tools. Ability to quickly learn new and existing technologies. Strong problem-solving skills. Write advanced SQL and Python scripts. Certification as a Professional Google Cloud Data Engineer will be an added advantage. Your skills and experience: 6+ years of IT experience as a hands-on technologist. Proficient in Python for data engineering. Proficient in SQL. Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions and Cloud Run; GKE is good to have. Hands-on experience in REST API hosting and consumption. Proficient in HashiCorp Terraform. Experienced in GitHub and GitHub Actions. Experienced in CI/CD. Experience in automating ETL testing using Python and SQL. Good to have: APIGEE. Good to have: Bitbucket. How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
Posted 1 month ago
3 - 7 years
10 - 14 Lacs
Pune
Work from Office
About The Role: Job Title: GCP Data Engineer, AS. Location: Pune, India. Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching of less experienced engineers. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 yrs. and above. Your key responsibilities: Design, develop and maintain data pipelines using Python and SQL on GCP. Experience in Agile methodologies, ETL, ELT, data movement and data processing skills. Work with Cloud Composer to manage and process batch data jobs efficiently. Develop and optimize complex SQL queries for data analysis, extraction, and transformation. Develop and deploy Google Cloud services using Terraform. Implement CI/CD pipelines using GitHub Actions. Consume and host REST APIs using Python. Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence, and other tools.
- Ability to quickly learn new and existing technologies.
- Strong problem-solving skills.
- Write advanced SQL and Python scripts.
- Google Cloud Professional Data Engineer certification will be an added advantage.

Your skills and experience
- 6+ years of IT experience as a hands-on technologist.
- Proficient in Python for data engineering.
- Proficient in SQL.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; GKE is good to have.
- Hands-on experience in hosting and consuming REST APIs.
- Proficient in HashiCorp Terraform.
- Experienced with GitHub and GitHub Actions.
- Experienced in CI/CD.
- Experience in automating ETL testing using Python and SQL.
- Good to have: Apigee.
- Good to have: Bitbucket.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
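The Python + SQL pipeline work this role describes can be sketched with a small, self-contained example. This is an illustrative aggregation step only: the table name, columns, and query are hypothetical, and an in-memory SQLite database stands in for BigQuery.

```python
import sqlite3

def load_and_aggregate(rows):
    """Load raw (user, amount) rows and return total spend per user.
    SQLite stands in for BigQuery; the schema is illustrative."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (user TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    cur = conn.execute(
        "SELECT user, SUM(amount) FROM orders GROUP BY user ORDER BY user"
    )
    result = cur.fetchall()
    conn.close()
    return result

totals = load_and_aggregate([("a", 10.0), ("b", 5.0), ("a", 2.5)])
print(totals)  # [('a', 12.5), ('b', 5.0)]
```

In a production pipeline, a step like this would typically run as a task inside a Cloud Composer DAG, with the query issued against BigQuery instead of SQLite.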
Posted 2 months ago
2 - 6 years
3 - 6 Lacs
Bengaluru
Work from Office
Description
Looking for candidates with 5 to 8 years of experience. Proficient in Python coding; must have experience with DevOps tools such as Jenkins and Terraform. Shell scripting is good to have.

Additional Details
Global Grade: C
Named Job Posting (needs SCSC approval if yes): No
Remote work possibility: No
Local Skills: CI/CD; Python; HashiCorp Terraform
Languages Required: English
Posted 2 months ago
5 - 8 years
7 - 11 Lacs
Bengaluru
Work from Office
Description
Primary Skill: Automation (Chef, Puppet)
Hiring Manager: shivayogeppasalotagicapgeminicom

Job Overview
This is a comprehensive role for an experienced Infrastructure Engineer requiring expertise in managing diverse platforms and cloud services, with a strong focus on automation, scripting, and CI/CD pipeline management. The candidate is expected to work with both Windows and Linux platforms, leverage cloud infrastructure on AWS and Azure, and use automation tools such as HashiCorp Packer and Vault.

Responsibilities
- Oversee and maintain both Windows and Linux systems, ensuring stability and performance.
- Architect, deploy, and manage scalable infrastructure solutions on AWS and Azure along with on-prem environments.
- Develop and manage golden (hardened) images using HashiCorp Packer.
- Use PowerShell and Bash shell scripting to automate infrastructure tasks.
- Build and manage CI/CD pipelines using GitLab and Docker containers.
- Implement strategies for source control and automation using Git and GitLab.
- Hands-on experience securing sensitive data using HashiCorp Vault.

Primary Skills
- HashiCorp Packer
- PowerShell and Bash shell scripting
- Git/GitLab

Secondary Skills
- Docker
- HashiCorp Vault

Qualifications
- Proven experience working with Windows and Linux platforms.
- Expertise in automation with Packer and scripting.
- Strong understanding of CI/CD processes and source control strategies.

Additional Details
Global Grade: B
Named Job Posting (needs SCSC approval if yes): No
Remote work possibility: No
Local Skills: Automation - Chef, Puppet
Languages Required: English
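The golden-image workflow named above can be sketched by generating a Packer template programmatically. Packer's legacy JSON template format uses `builders` and `provisioners` sections; all concrete values below (region, instance type, base AMI, hardening command) are illustrative assumptions, not a definitive build.

```python
import json

def golden_image_template(ami_name, base_ami):
    """Build a minimal legacy-JSON Packer template for a hardened AWS image.
    Region, instance type, and provisioner commands are illustrative."""
    return {
        "builders": [{
            "type": "amazon-ebs",
            "region": "us-east-1",
            "source_ami": base_ami,
            "instance_type": "t3.micro",
            "ssh_username": "ec2-user",
            "ami_name": ami_name,
        }],
        "provisioners": [{
            "type": "shell",
            # Real hardening steps (CIS benchmarks, agent installs) go here.
            "inline": ["sudo yum -y update"],
        }],
    }

template = golden_image_template("golden-base-{{timestamp}}", "ami-1234567890abcdef0")
print(json.dumps(template, indent=2))
```

Current Packer versions favor HCL2 templates over JSON, but the builder/provisioner split is the same idea in either syntax.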
Posted 2 months ago
5 - 10 years
7 - 11 Lacs
Bengaluru
Work from Office
Description
Must have: Terraform, Terragrunt, Kubernetes, DevOps, Azure
Good to have: Azure monitoring, cloud operations, cloud solution design
1. In-depth understanding of Azure DevOps engineering.
2. Significant experience with scripting (Python/Bash/Shell).
3. Experience with Terraform as IaC.
4. Must have hands-on experience with Docker and Kubernetes using Helm.
5. Experience in CI/CD with GitHub Actions.
6. Experience with PostgreSQL and its functionality.
7. Experience in networking and network security using NSGs.

Additional Details
Global Grade: C
Level: Level 3 - Senior (6-9 years experience)
Named Job Posting (needs SCSC approval if yes): No
Remote work possibility: No
Local Skills: Kubernetes; Microsoft Azure DevOps; HashiCorp Terraform
Languages Required: English
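The "Terraform as IaC" requirement above can be illustrated without HCL: Terraform also accepts JSON-syntax configuration files (`.tf.json`), which makes them easy to generate from Python. This sketch emits a minimal Azure resource group; the resource group name and region are illustrative.

```python
import json

def resource_group_tf_json(name, location):
    """Emit Terraform JSON syntax (.tf.json) for an Azure resource group.
    Uses the azurerm provider's azurerm_resource_group resource type;
    values are illustrative."""
    return {
        "resource": {
            "azurerm_resource_group": {
                "main": {"name": name, "location": location}
            }
        }
    }

config = resource_group_tf_json("rg-demo", "eastus")
# Writing this to main.tf.json would let `terraform plan` consume it directly.
print(json.dumps(config, indent=2))
```

Generating `.tf.json` is a niche technique; most teams write HCL directly, but the JSON form is handy when configuration has to be produced by another program.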
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Karnataka
Work from Office
Description
Senior DevOps Engineer
- Candidate should have good platform experience on Azure with Terraform.
- The DevOps engineer needs to help developers and create the pipelines and K8s deployment manifests.
- Good to have: experience with migration and upgrade activities (DBs, OS, AKS, etc.).
- Manage and automate infrastructure using Terraform. Jenkins is the key CI/CD tool we use, and it will be used to run these Terraform configurations.
- Provision and manage VMs on Azure Cloud.
- Good hands-on experience with networking in the cloud is required.
- Ability to set up databases on VMs as well as managed DBs; cloud-hosted microservices must be set up properly to communicate with the DB services.
- Kubernetes, Storage, Key Vault, networking (load balancing and routing), and VMs are the essential areas of infrastructure expertise.
- Administer Kubernetes clusters end to end (application deployment, managing namespaces, load balancing, policy setup, using blue-green/canary deployment models, etc.).
- Python experience is optional; however, PowerShell is mandatory.
- Know-how on the use of GitHub.
- Good knowledge of setting up and configuring Kafka is needed.
- Administration of Azure Kubernetes Service.

Additional Details
Global Grade: C
Named Job Posting (needs SCSC approval if yes): No
Remote work possibility: No
Local Skills: Microsoft Azure DevOps; HashiCorp Terraform
Languages Required: English
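The "K8s deployment manifests" the role mentions can be sketched by building one programmatically. kubectl accepts JSON as well as YAML, so a plain dict serialized with the standard library is enough for a sketch; the app name, image, and labels below are illustrative.

```python
import json

def deployment_manifest(name, image, replicas=2):
    """Build a minimal Kubernetes apps/v1 Deployment manifest as a dict.
    Names, labels, and the image are illustrative."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # The selector must match the pod template's labels.
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

manifest = deployment_manifest("web", "nginx:1.27", replicas=3)
print(json.dumps(manifest, indent=2))
```

In practice teams template manifests with Helm or Kustomize rather than raw Python, but the required structure (selector matching template labels, containers under the pod spec) is the same.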
Posted 2 months ago
7 - 10 years
9 - 12 Lacs
Bengaluru
Work from Office
About the Role
- Minimum 5 years of technical experience in DevOps.
- Minimum of 3 years of experience in information security and/or computer science.
- Bachelor's degree in computer science, engineering, applied information technology, or a related subject.
- Experience working in a challenging environment and a track record of meeting project timelines.
- Experience leading a DevOps transformation across the pillars of people, process, and technology.

Primary Skills
- Familiar with secrets management using tools like HashiCorp Vault and AWS Secrets Manager.
- Strong experience designing and implementing CI/CD pipelines across multiple tools, such as Terraform, Packer, GitLab, and Jenkins.
- Familiar with automating application security controls using techniques such as SCA, SAST, DAST, and supply chain management.
- Strong experience scaling observability in an organization.
- Strong grasp of at least one programming language (JavaScript or Python preferred).
Posted 3 months ago
4 - 9 years
10 - 15 Lacs
Pune
Hybrid
Dear Consultant,

Greetings from SRS Infoway. We have openings with a top MNC.

Skill: HashiCorp Vault Administrator
Experience: 4+ years
Job location: Mysore / Pune
This is a contract-to-hire opportunity. Duration: 1+ year (extendable).

Detailed JD:
We are seeking a skilled and experienced Web Systems and HashiCorp Vault Administrator. The ideal candidate will have a strong background in managing web systems, with specific expertise in HashiCorp Vault, JavaScript application delivery, Linux administration, and outstanding customer communication skills. This role will involve providing Tier 0 production support and interfacing directly with customers to address their needs, concerns, and new feature requirements.
* Manage and administer web systems using PM2, ExpressJS, and NextJS, ensuring high availability, performance, and security.
* Deploy, configure, and maintain HashiCorp Vault for secrets management and encryption key storage.
* Handle Tier 0 production support, troubleshooting issues and ensuring rapid resolution.
* Communicate effectively with customers, understanding their requirements and providing appropriate solutions and support.
* Implement and maintain monitoring solutions to ensure system health and performance.
* Perform necessary change management in our technology delivery cycle using ServiceNow.
* Work closely with development teams to deploy applications and provide infrastructure support.
* Automate routine tasks and workflows to improve efficiency and reduce manual effort.
* Perform regular security audits and implement necessary measures to ensure compliance and data protection.
* Document system configurations, processes, and procedures for knowledge sharing and future reference.

If interested, kindly share your updated resume with shairabanu@srsinfoway.com. We are looking for an immediate to 30-day notice period.

Regards,
Shaira Banu
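For the Vault administration work above, one detail worth knowing is how Vault's KV version 2 secrets engine shapes its HTTP API: reads go through a `data` segment inserted between the mount point and the key path, and write bodies wrap the key/value pairs under a `data` field. A small sketch of those conventions (the mount name and secret path are illustrative):

```python
def kv2_read_path(mount, secret_path):
    """Return the HTTP API path for reading a KV v2 secret.
    KV v2 inserts 'data' between the mount point and the key path."""
    return f"/v1/{mount}/data/{secret_path.strip('/')}"

def kv2_write_payload(secret_data):
    """KV v2 write bodies wrap key/value pairs under a 'data' field."""
    return {"data": secret_data}

print(kv2_read_path("secret", "myapp/db"))  # /v1/secret/data/myapp/db
print(kv2_write_payload({"username": "app", "password": "s3cret"}))
```

An actual client would send these against a running Vault server with an `X-Vault-Token` header (or use the `hvac` Python library); the path and payload shapes are the part that trips people up when moving from KV v1 to v2.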
Posted 3 months ago
7 - 11 years
7 - 12 Lacs
Pune
Work from Office
Role & Responsibilities
Job Description: L3 Cloud DevOps & CI/CD Pipeline Operations Engineer (IT Operations & Infrastructure)
Employment Type: On-roll
Location: Pune/Mumbai

Role Purpose
Cloud DevOps & CI/CD Pipeline Operations Engineer (L3)

Key Responsibilities / Accountabilities
We are seeking an experienced L3 Operations Engineer specializing in DevOps and CI/CD pipelines within a Cloud Service Provider (CSP) environment. The role requires deep expertise in automation, CI/CD pipeline design, cloud-native development, and DevOps best practices across AWS, Azure, Google Cloud Platform (GCP), and Oracle Cloud Infrastructure (OCI). As an L3 engineer, you will serve as the final escalation point for complex CI/CD and DevOps issues, drive automation strategies, implement security best practices, and optimize multi-cloud DevOps workflows. You will collaborate with architects, developers, and security teams to ensure seamless, reliable, and secure application deployments.

Major Duties & Responsibilities

CI/CD Pipeline Architecture & Optimization:
- Design, implement, and manage complex CI/CD pipelines for cloud-native and hybrid environments.
- Optimize GitOps workflows and automated deployment strategies for efficiency and reliability.
- Troubleshoot CI/CD failures, deployment issues, and performance bottlenecks at an advanced level.
- Integrate DevSecOps principles into CI/CD pipelines, enforcing security best practices.

DevOps Automation & Infrastructure as Code (IaC):
- Lead the development and automation of cloud infrastructure using Terraform, CloudFormation, ARM, or Pulumi.
- Implement configuration management solutions using Ansible, Chef, or Puppet.
- Support multi-cloud container orchestration using Kubernetes (EKS, AKS, GKE, OCI Kubernetes Engine).
- Automate serverless and microservices-based application deployments.
- Implement policy-as-code solutions using Open Policy Agent (OPA) and HashiCorp Sentinel.

Cloud Infrastructure Operations & Optimization:
- Manage and optimize scalability, performance, and cost efficiency of cloud-based DevOps environments.
- Support hybrid and multi-cloud deployments, ensuring high availability and failover strategies.
- Troubleshoot networking, storage, and compute resource performance issues in cloud environments.
- Optimize serverless computing platforms, container orchestration, and API gateway configurations.

Security, Compliance & Governance:
- Enforce DevSecOps best practices by integrating security scanning, vulnerability assessment, and compliance checks into CI/CD pipelines.
- Implement secrets management, access control, and IAM best practices across cloud platforms.
- Ensure compliance with ISO 27001, NIST, CIS benchmarks, SOC 2, GDPR, and HIPAA.
- Automate security auditing and monitoring processes for DevOps environments.

Monitoring, Logging & Incident Management:
- Implement and manage monitoring/logging tools (Prometheus, Grafana, ELK, CloudWatch, Azure Monitor, GCP Operations Suite).
- Act as the final escalation point for DevOps and CI/CD incidents, ensuring root cause analysis (RCA) and permanent fixes.
- Work within ITIL frameworks for Incident, Problem, and Change Management.

Collaboration, Technical Leadership & Mentorship:
- Work closely with developers, architects, and security teams to improve DevOps processes.
- Mentor L1 and L2 engineers, conducting training sessions and knowledge-sharing workshops.
- Lead CI/CD and DevOps architecture improvements, ensuring alignment with business goals.
- Document SOPs, best practices, and runbooks for CI/CD and DevOps workflows.

Qualifications and Experience
- Bachelor's or Master's degree in Computer Science, IT, or a related field (or equivalent experience).
- 7+ years of experience in DevOps, CI/CD pipeline management, and cloud automation.
- Expert-level experience with CI/CD tools such as Jenkins, GitHub Actions, GitLab CI/CD, Azure DevOps, or CircleCI.
- Proficiency in Infrastructure as Code (IaC) using Terraform, CloudFormation, Pulumi, or ARM templates.
- Advanced scripting skills in Python, Bash, or PowerShell.
- Strong knowledge of Kubernetes (AKS, EKS, GKE, OCI Kubernetes Engine) and container orchestration.
- Experience with serverless technologies (AWS Lambda, Azure Functions, Google Cloud Functions).
- Strong understanding of cloud security, IAM policies, and DevSecOps best practices.
- Exposure to ITIL best practices for Incident and Change Management.

Certifications
- AWS Certified DevOps Engineer - Professional
- Microsoft Certified: DevOps Engineer Expert
- Google Cloud Certified: Professional DevOps Engineer
- Oracle Cloud Infrastructure (OCI) DevOps Professional
- Certified Kubernetes Administrator (CKA)
- HashiCorp Certified: Terraform Associate
- ITIL v4 Foundation Certification (preferred)

Location: Pune
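The policy-as-code responsibility this posting names (OPA, HashiCorp Sentinel) boils down to evaluating resources against declarative rules before they deploy. A toy sketch of the idea in plain Python, with entirely illustrative rules (real deployments would express these in Rego or Sentinel and enforce them in the pipeline):

```python
def check_deployment(resource):
    """Toy policy-as-code check: return a list of violations for a
    deployment-like resource dict. The rules are illustrative."""
    violations = []
    if resource.get("replicas", 0) < 2:
        violations.append("replicas must be >= 2 for high availability")
    for c in resource.get("containers", []):
        if c.get("image", "").endswith(":latest"):
            violations.append(f"container {c.get('name')} must pin an image tag")
    return violations

bad = {"replicas": 1, "containers": [{"name": "app", "image": "nginx:latest"}]}
for v in check_deployment(bad):
    print("DENY:", v)
```

A CI/CD gate would run checks like these on every plan or manifest and fail the stage on any non-empty violation list, which is exactly how OPA's admission control and Sentinel's Terraform policies are wired in.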
Posted 3 months ago