
638 EKS Jobs - Page 26

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

8 - 13 years

25 - 30 Lacs

Bengaluru

Work from Office

About The Role
At Kotak Mahindra Bank, customer experience is at the forefront of everything we do on the Digital Platform. To help us build and run the platform for digital applications, we are looking for an experienced Sr. DevOps Engineer, who will be responsible for deploying product updates, identifying production issues and implementing integrations that meet our customers' needs. If you have a solid background in software engineering and are familiar with AWS EKS, Istio/service mesh/Tetrate, Terraform, Helm charts, Kong API Gateway, Azure DevOps, Spring Boot, Ansible, and Kafka/MongoDB, we'd love to speak with you.

Objectives of this Role
- Building and setting up new development tools and infrastructure
- Understanding the needs of stakeholders and conveying this to developers
- Working on ways to automate and improve development and release processes
- Investigating and resolving technical issues
- Developing scripts to automate visualization
- Designing procedures for system troubleshooting and maintenance

Skills and Qualifications
- BSc in Computer Science, Engineering or a relevant field
- Minimum 5 years' experience as a DevOps Engineer or in a similar software engineering role
- Proficient with Git and Git workflows
- Good knowledge of Kubernetes (EKS), Terraform, CI/CD and AWS
- Problem-solving attitude and collaborative team spirit
- Testing and examining code written by others and analyzing results
- Identifying technical problems and developing software updates and fixes
- Working with software developers and engineers to ensure that development follows established processes and works as intended
- Monitoring the systems and setting up the required tools

Daily and Monthly Responsibilities
- Deploy updates and fixes
- Provide Level 3 technical support
- Build tools to reduce occurrences of errors and improve customer experience
- Develop software to integrate with internal back-end systems
- Perform root cause analysis for production errors
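The scripting and root-cause-analysis duties in this posting lend themselves to small automation scripts. Below is a minimal sketch of the kind of log triage such a role involves; the log format, service names, and counts are invented for illustration:

```python
import re
from collections import Counter

# Hypothetical log format: "2024-01-01T10:00:00 ERROR payment-service timeout calling kafka"
LOG_LINE = re.compile(r"^\S+\s+(?P<level>\w+)\s+(?P<service>[\w-]+)\s+(?P<message>.*)$")

def error_counts_by_service(lines):
    """Count ERROR lines per service to spot the noisiest component."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("service")] += 1
    return counts

logs = [
    "2024-01-01T10:00:00 ERROR payment-service timeout calling kafka",
    "2024-01-01T10:00:01 INFO auth-service token issued",
    "2024-01-01T10:00:02 ERROR payment-service db connection refused",
]
print(error_counts_by_service(logs).most_common(1))  # [('payment-service', 2)]
```

In practice a script like this would read from a log aggregator rather than an in-memory list, but the triage step is the same.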

Posted 2 months ago

Apply

6 - 10 years

12 - 17 Lacs

Bengaluru

Work from Office

At F5, we strive to bring a better digital world to life. Our teams empower organizations across the globe to create, secure, and run applications that enhance how we experience our evolving digital world. We are passionate about cybersecurity, from protecting consumers from fraud to enabling companies to focus on innovation. Everything we do centers around people. That means we obsess over how to make the lives of our customers, and their customers, better. And it means we prioritize a diverse F5 community where each individual can thrive.

About The Role
Position Summary: The Senior Product Manager plays a pivotal role in product development for F5 Distributed Cloud App Delivery strategies. This position requires an in-depth understanding of market dynamics in Kubernetes platforms, Multicloud Networking, public cloud and SaaS platforms, as well as strong leadership, partnering and analytical abilities, to help build a shared vision and execute to establish a market-leading position.

Primary Responsibilities
Product Delivery:
- Drive product management activities for F5 Network Connect and F5 Distributed Apps
- Build compelling technical marketing content to drive product awareness, including reference architectures and customer case studies
- Deliver web content, whitepapers, and demonstrations to drive customer adoption, and ensure technical marketing alignment with key partners
- Ensure accountability for product success and present data-backed findings during business reviews and QBRs
Customer Engagement & Feedback:
- Engage with customers to understand their business goals, constraints, and requirements
- Prioritize feature enhancements based on customer feedback and business value
- Utilize the Digital Adoption Platform to identify areas of improvement, increase revenue and reduce churn
Market Analysis:
- Position F5 Network Connect and Distributed Apps with a competitive edge in the market
- Validate market demand based on customer usage
- Conduct in-depth research to stay abreast of developments in Multicloud Networking and the Kubernetes (CaaS/PaaS) ecosystem
Team Collaboration:
- Collaborate with stakeholders to make informed decisions on product backlog prioritization
- Foster strong relationships with engineering, sales, marketing, and customer support teams
- Work with technical teams to ensure seamless product rollouts
- Work with key decision makers in marketing and sales to ensure smooth product delivery to customers

Knowledge, Skills, and Abilities
Technical Skills:
- Proficient with core networking technologies such as BGP, VPNs and tunneling, routing, NAT, etc.
- Proficient with core Kubernetes technologies and ecosystem, such as CNIs, Ingress controllers, etc.
- Proficient with core public cloud networking services, especially AWS, Azure and GCP
- Proficient with PaaS services such as OpenShift, EKS (AWS), GKE (GCP), AKS (Azure)
- Well versed in L4/L7 load balancing and proxy technologies and protocols
Stakeholder Management:
- Demonstrate strong leadership, negotiation, and persuasion capabilities
- Effectively manage and navigate expectations from diverse stakeholder groups
- Uphold a data-driven approach amidst a fast-paced, changing environment
Analytical Skills:
- Ability to generate data-driven reports and transform complex data into actionable insights
- Proven skills in data analytics and making data-backed decisions
- Strong awareness of technology trends and their potential influence on F5's business

Qualifications
- BA/BS degree in a relevant field
- 4+ years in technical product management or a related domain
- 2+ years of product management in Multicloud Networking, PaaS or an adjacent area (e.g. SSE/SD-WAN)
- Experience developing relationships with suppliers and co-marketing partners highly desirable

This description is intended to be a general representation of the responsibilities and requirements of the job. However, it may not be all-inclusive, and responsibilities and requirements are subject to change.
Please note that F5 only contacts candidates through F5 email addresses (ending with @f5.com) or auto email notifications from Workday (ending with f5.com or @myworkday.com).

Equal Employment Opportunity
It is the policy of F5 to provide equal employment opportunities to all employees and employment applicants without regard to unlawful considerations of race, religion, color, national origin, sex, sexual orientation, gender identity or expression, age, sensory, physical, or mental disability, marital status, veteran or military status, genetic information, or any other classification protected by applicable local, state, or federal laws. This policy applies to all aspects of employment, including, but not limited to, hiring, job assignment, compensation, promotion, benefits, training, discipline, and termination. F5 offers a variety of reasonable accommodations for candidates. Requesting an accommodation is completely voluntary. F5 will assess the need for accommodations in the application process separately from those that may be needed to perform the job. Request by contacting accommodations@f5.com.
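The posting above asks for fluency in L4/L7 load balancing; the distinction can be sketched in a few lines. This is a toy illustration only, not the behavior of any F5 product; the backend and pool names are invented:

```python
# An L4 balancer picks a backend from connection information alone,
# while an L7 proxy can inspect application data such as the Host header.

def l4_pick(client_ip: str, client_port: int, backends: list) -> str:
    """L4: hash connection info to pick a backend; the payload is never inspected."""
    return backends[hash((client_ip, client_port)) % len(backends)]

def l7_pick(headers: dict, routes: dict, default: str) -> str:
    """L7: route on the Host header, which only exists at the application layer."""
    return routes.get(headers.get("Host", ""), default)

routes = {"api.example.com": "api-pool", "www.example.com": "web-pool"}
print(l7_pick({"Host": "api.example.com"}, routes, "web-pool"))  # api-pool
```

Real proxies add health checks, connection pooling and TLS handling, but the layering distinction is exactly this: what information is available when the routing decision is made.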

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Bengaluru

Work from Office

- 12+ years of overall IT experience
- 5+ years of cloud implementation experience (AWS - S3), Terraform, Docker, Kubernetes
- Expert in troubleshooting cloud implementation projects
- Expert in cloud native technologies
- Good working knowledge of Terraform and Quarkus

Must Have Skills
- Cloud: AWS knowledge (S3, load balancers, VPC/VPC peering/private-public subnets, EKS, SQS, Lambda, Docker/container services, Terraform or other IaC technologies for normal deployment)
- Quarkus, PostgreSQL, Flyway, Kubernetes, OpenID flow, OpenSearch/Elasticsearch, OpenAPI/Swagger, Java
Optional: Kafka, Python

#LI-INPAS
Job Segment: Developer, Java, Technology
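Since Flyway appears in the must-have stack, here is a small sketch of its versioned-migration naming convention (V<version>__<description>.sql), which determines the order migrations are applied in; the filenames below are invented:

```python
import re

# Flyway applies versioned migrations in version order; filenames follow
# the V<version>__<description>.sql convention. A small sanity check:
MIGRATION = re.compile(r"^V(?P<version>\d+(?:\.\d+)*)__(?P<desc>\w+)\.sql$")

def sort_migrations(filenames):
    """Return valid migration files ordered as Flyway would apply them."""
    parsed = []
    for name in filenames:
        m = MIGRATION.match(name)
        if m:
            version = tuple(int(p) for p in m.group("version").split("."))
            parsed.append((version, name))
    return [name for _, name in sorted(parsed)]

files = ["V2__add_orders.sql", "V1__init.sql", "V1.1__add_index.sql", "notes.txt"]
print(sort_migrations(files))  # ['V1__init.sql', 'V1.1__add_index.sql', 'V2__add_orders.sql']
```

Comparing versions as integer tuples rather than strings is what makes "V1.10" sort after "V1.2", matching Flyway's own ordering.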

Posted 2 months ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title - Lead Data Architect (Streaming)

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent and Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise with technical interface design
- Use of Docker

Key Responsibilities
- Architect end-to-end data solutions using AWS services (Lambda, SNS, S3, and EKS), Kafka and Confluent, all within a larger and overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud and AWS
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent and Kafka
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Developer, Computer Science, Consulting, Technology
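The Lambda-plus-SNS pattern named in this posting's skills list has a well-known event shape: records arrive under event["Records"][i]["Sns"]["Message"]. A minimal sketch of an SNS-triggered handler; the payload fields (order_id, amount) are invented for illustration:

```python
import json

# An SNS-triggered Lambda receives a batch of records; each record's
# message body is a JSON string under record["Sns"]["Message"].

def handler(event, context=None):
    """Parse each SNS record and return the extracted payloads."""
    payloads = []
    for record in event.get("Records", []):
        message = json.loads(record["Sns"]["Message"])
        payloads.append(message)
    return payloads

event = {"Records": [{"Sns": {"Message": json.dumps({"order_id": 7, "amount": 120})}}]}
print(handler(event))  # [{'order_id': 7, 'amount': 120}]
```

A production handler would validate the payload and forward it (for example, to Kafka or S3) rather than return it, but the event-unwrapping step is the same.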

Posted 2 months ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title - Lead Data Architect (Warehousing)

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Python
- Solid understanding of data warehousing architectures and best practices
- Strong Snowflake and data warehouse skills
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Experience with data cataloguing
- Knowledge of Apache Airflow for data orchestration
- Experience modelling, transforming and testing data in DBT
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- Familiarity with Atlan for data catalog and metadata management
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements
- An understanding of Lakehouses
- An understanding of Apache Iceberg tables
- SnowPro Core certification

Key Responsibilities
- Architect end-to-end data solutions using AWS services (Lambda, SNS, S3, and EKS) as well as Snowflake, DBT and Apache Airflow, all within a larger and overarching programme ecosystem
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda and Snowflake
- Architect data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Ensure data security and implement best practices using tools like Snyk
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Developer, Solution Architect, Data Warehouse, Computer Science, Database, Technology
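DBT and Airflow, both listed in this posting, ultimately run transformations in dependency order. A toy sketch of that ordering using the standard library's topological sorter; the model names are invented and not tied to any real project:

```python
from graphlib import TopologicalSorter

# Each model maps to the set of upstream models it depends on,
# the way a dbt project's ref() calls or an Airflow DAG's edges do.
models = {
    "stg_orders": set(),                            # staging model, no upstream deps
    "stg_customers": set(),
    "fct_orders": {"stg_orders", "stg_customers"},  # fact table joins both stagings
    "rpt_revenue": {"fct_orders"},                  # report reads the fact table
}

run_order = list(TopologicalSorter(models).static_order())
print(run_order.index("fct_orders") > run_order.index("stg_orders"))  # True
```

Orchestrators add parallelism, retries and scheduling on top, but the core contract is this: no model runs before everything it depends on has run.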

Posted 2 months ago

Apply

4 - 9 years

16 - 20 Lacs

Bengaluru

Work from Office

Req ID: 301930

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title - Data Solution Architect

Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders

Preferred Qualifications
- Experience with Kafka Connect and Confluent Schema Registry
- Familiarity with Atlan for data catalog and metadata management
- Knowledge of Apache Flink for stream processing
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements

Key Responsibilities
- Design and implement scalable data architectures using AWS services and Kafka
- Develop data ingestion, processing, and storage solutions using Python and AWS Lambda
- Ensure data security and implement best practices using tools like Snyk
- Optimize data pipelines for performance and cost-efficiency
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Implement data governance policies and procedures
- Provide technical guidance and mentorship to junior team members
- Evaluate and recommend new technologies to improve data architecture
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement data streaming pipelines using Kafka/Confluent Kafka
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Provide technical leadership and mentorship to development teams
- Stay current with emerging technologies and industry trends

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team.

Job Segment: Solution Architect, Consulting, Database, Computer Science, Technology
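A toy sketch of the windowed aggregation at the heart of the streaming pipelines this role designs, the kind of roll-up a Kafka consumer or Flink job performs. The event fields and the one-minute window are example choices, not anything the posting prescribes:

```python
from collections import defaultdict

# Bucket events into fixed one-minute windows and count per key.
# Timestamps are epoch seconds; keys are invented event names.
WINDOW_SECONDS = 60

def windowed_counts(events):
    """Count events per (key, window_start) bucket."""
    buckets = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % WINDOW_SECONDS)
        buckets[(key, window_start)] += 1
    return dict(buckets)

events = [(0, "click"), (30, "click"), (61, "click"), (62, "view")]
print(windowed_counts(events))
# {('click', 0): 2, ('click', 60): 1, ('view', 60): 1}
```

Real stream processors add late-event handling (watermarks) and state checkpointing, but the bucketing arithmetic is the same tumbling-window logic shown here.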

Posted 2 months ago

Apply

10 - 15 years

17 - 22 Lacs

Mumbai, Hyderabad, Bengaluru

Work from Office

Job Roles and Responsibilities:
The AWS DevOps Engineer is responsible for automating, optimizing, and managing CI/CD pipelines, cloud infrastructure, and deployment processes on AWS. This role ensures smooth software delivery while maintaining high availability, security, and scalability.
- Design and implement scalable and secure cloud infrastructure on AWS, utilizing services such as EC2, EKS, ECS, S3, RDS, and VPC
- Automate the provisioning and management of AWS resources using Infrastructure as Code tools (Terraform / CloudFormation / Ansible) and YAML
- Implement and maintain continuous integration and continuous deployment (CI/CD) pipelines using tools like Jenkins, GitLab, or AWS CodePipeline
- Advocate for a No-Ops model, striving for console-less experiences and self-healing systems
- Experience with containerization technologies: Docker and Kubernetes

Mandatory Skills:
- Overall experience of 5 - 8 years in AWS DevOps specialization (AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy, AWS CodeCommit)
- Work experience with AWS DevOps and IAM
- Expertise with IaC tools: Terraform, Ansible or CloudFormation, and YAML
- Strong on deployment work: CI/CD pipelining
- Manage containerized workloads using Docker, Kubernetes (EKS) or AWS ECS, and Helm charts
- Experience with database migration
- Proficiency in scripting languages (Python AND (Bash OR PowerShell))
- Develop and maintain CI/CD pipelines using (AWS CodePipeline OR Jenkins OR GitHub Actions OR GitLab CI/CD)
- Experience with monitoring and logging tools (CloudWatch OR ELK Stack OR Prometheus OR Grafana)

Career Level - IC4
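The CI/CD pipelining this role centers on reduces to stage gating: each stage runs only if everything before it passed. A tool-agnostic sketch; the stage names are invented and not tied to CodePipeline or Jenkins:

```python
# A pipeline is an ordered list of stages; a stage runs only when all
# prior stages have passed. Stage names here are examples.
STAGES = ["build", "test", "deploy_staging", "deploy_prod"]

def next_stage(results):
    """Given {stage: passed?} for completed stages, return the next stage
    to run, or None if a stage failed or the pipeline is finished."""
    for stage in STAGES:
        if stage not in results:
            return stage
        if not results[stage]:
            return None  # fail fast: a red stage halts the pipeline
    return None

print(next_stage({"build": True, "test": True}))   # deploy_staging
print(next_stage({"build": True, "test": False}))  # None
```

Real pipeline tools layer on triggers, approvals and parallel fan-out, but they all enforce this same ordering invariant.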

Posted 2 months ago

Apply

6 - 11 years

20 - 30 Lacs

Bengaluru

Work from Office

Responsibilities
- Owns all technical aspects of software development for assigned applications
- Participates in the design and development of systems and application programs
- Functions as a senior member of an agile team and helps drive consistent development practices: tools, common components, and documentation
- Works with product owners to prioritize features for ongoing sprints and manage a list of technical requirements based on industry trends, new technologies, known defects, and issues

Qualifications
- In-depth experience configuring and administering EKS clusters in AWS
- In-depth experience configuring DataDog in AWS environments, especially in EKS
- In-depth understanding of OpenTelemetry and configuration of OpenTelemetry Collectors
- In-depth knowledge of observability concepts and strong troubleshooting experience
- Experience implementing comprehensive monitoring and logging solutions in AWS using CloudWatch
- Experience with Terraform and Infrastructure as Code
- Experience with Helm
- Strong scripting skills in Shell and/or Python
- Experience with large-scale distributed systems and architecture knowledge (Linux/UNIX and Windows operating systems, networking, storage) in a cloud computing or traditional IT infrastructure environment
- Good understanding of cloud concepts (storage/compute/network)
- Experience collaborating with several cross-functional teams to architect observability pipelines for various AWS services like EKS, SQS, etc.
- Experience with Git and GitHub
- Proficient in developing and maintaining technical documentation, ADRs, and runbooks
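The observability work this posting describes revolves around a few derived metrics. A minimal sketch computing error rate and nearest-rank p95 latency from raw request samples, the kind of roll-up DataDog or CloudWatch performs; the sample shape is invented:

```python
# Each sample is a dict with an HTTP status and a latency in milliseconds.

def error_rate(samples):
    """Fraction of samples with a 5xx status."""
    return sum(1 for s in samples if s["status"] >= 500) / len(samples)

def p95_latency(samples):
    """Nearest-rank 95th percentile of latency_ms."""
    latencies = sorted(s["latency_ms"] for s in samples)
    rank = max(0, int(round(0.95 * len(latencies))) - 1)
    return latencies[rank]

samples = [{"status": 200, "latency_ms": 10 * i} for i in range(1, 20)]
samples.append({"status": 503, "latency_ms": 900})
print(error_rate(samples))   # 0.05
print(p95_latency(samples))  # 190
```

The p95 example is the reason percentiles, not averages, dominate observability dashboards: one 900 ms outlier barely moves p95 but would drag the mean noticeably.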

Posted 2 months ago

Apply

5 - 10 years

0 - 0 Lacs

Hyderabad

Work from Office

Job Description: DevOps Engineer

Qualifications:
- Bachelor's or Master's degree in Computer Science or Computer Engineering
- 4 to 8 years of experience in DevOps

Key Skills and Responsibilities:
- Passionate about continuous build, integration, testing, and delivery of systems
- Strong understanding of distributed systems, APIs, microservices, and cloud computing
- Experience in implementing applications on private and public cloud infrastructure
- Proficient in container technologies such as Kubernetes, including experience with public clouds like AWS, GCP, and other platforms through migrations, scaling, and day-to-day operations
- Hands-on experience with AWS services (VPC, EC2, EKS, S3, IAM, etc.) and Elastic Beanstalk
- Knowledge of source control management (Git, GitHub, GitLab)
- Hands-on experience with Kafka for data streaming and handling microservices communication
- Experience in managing Jenkins for CI/CD pipelines
- Familiar with logging tools and monitoring solutions
- Experience working with network load balancers (Nginx, NetScaler)
- Proficient with Kong API gateways, Kubernetes, PostgreSQL, NoSQL databases, and Kafka
- Experience with AWS S3 buckets, including policy management, storage, and backup using S3 and Glacier
- Ability to respond to production incidents and take on-call responsibilities
- Experience with multiple cloud providers and designing applications accordingly
- Skilled in owning and operating mission-critical, large-scale product operations (provisioning, deployment, upgrades, patching, and incidents) on the cloud
- Strong commitment to ensuring high availability and scalability of production systems
- Continuously raising the standard of engineering excellence by implementing best DevOps practices
- Quick learner with a balance between listening and taking charge

Responsibilities:
- Develop and implement tools to automate and streamline operations
- Develop and maintain CI/CD pipeline systems for application development teams using Jenkins
- Prioritize production-related issues alongside operational team members
- Conduct root cause analysis, resolve issues, and implement long-term fixes
- Expand the capacity and improve the performance of current operational systems

Regards,
Mohammed Umar Farooq
HR Recruitment Team, Revest Solutions
9949051730
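The S3-and-Glacier backup bullet above encodes a simple age-based decision that lifecycle rules normally declare on the bucket rather than in code. A sketch of that decision; the 90-day threshold is an example value, not a recommendation:

```python
from datetime import date, timedelta

# Example threshold: objects older than 90 days transition to Glacier.
GLACIER_AFTER_DAYS = 90

def storage_class(last_modified: date, today: date) -> str:
    """Pick a storage class the way an S3 lifecycle transition rule would."""
    age = (today - last_modified).days
    return "GLACIER" if age >= GLACIER_AFTER_DAYS else "STANDARD"

today = date(2024, 6, 1)
print(storage_class(today - timedelta(days=120), today))  # GLACIER
print(storage_class(today - timedelta(days=10), today))   # STANDARD
```

In production you would set this as a lifecycle configuration on the bucket and let S3 apply it; the code above only illustrates the rule's logic.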

Posted 2 months ago

Apply

2 - 4 years

12 - 14 Lacs

Navi Mumbai

Work from Office

Overview

GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our client's success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy and financial performance. That's just some of what we do in our quest to build a beautiful company, enjoy the journey and make a difference. GEP is a place where individuality is prized and talent respected. We're focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievements. We're a learning organization, actively looking for people to help shape, grow and continually improve us. Are you one of us?

GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristics protected by law. We are committed to hiring and valuing a globally diverse work team. For more information please visit us on GEP.com or check us out on LinkedIn.com.

Responsibilities

The candidate will be responsible for creating infrastructure designs and guiding the development and implementation of infrastructure, applications, systems and processes. This position will work directly with infrastructure, application development and QA teams to build and deploy highly available and scalable systems in private or public cloud environments, along with release management.
• Candidate must have experience on the Azure or GCP cloud platform
• Building highly scalable, highly available private or public infrastructure
• Owning, maintaining, and enhancing the infrastructure and related tools
• Helping build out an entire CI ecosystem, including automated and auto-scaling testing systems
• Designing and implementing monitoring and alerting for production systems used by DevOps staff
• Working closely with developers and other staff to solve DevOps issues with customer-facing services, tools, and apps

Qualifications
REQUIREMENTS
• 2+ years of experience working in a DevOps role in a continuous integration environment, especially with Microsoft technologies
• Strong knowledge of configuration management software such as PowerShell and Ansible, and continuous integration tools such as Octopus, Azure DevOps, and Jenkins
• Developing complete solutions considering sizing, infrastructure, data protection, disaster recovery, security, and application requirements on enterprise cloud systems
• Experience working in an Agile development environment with iterative sprint cycles
• Familiarity with database deployment and CI/CD pipelines
• Hands-on experience with CI/CD tools such as VSTS, Azure DevOps, or Jenkins (at least one of these)
• Worked on Docker, containers, Kubernetes, AWS EKS, API Gateway, Application Load Balancer, WAF, and CloudFront
• Experience with Git or GitHub and the gitflow model, including administration and user management; must have worked on the AWS platform for a minimum of 2 years
• Strong understanding of Linux; strong experience with various continuous integration and continuous deployment tools
• Automating builds using MSBuild scripts
• Any scripting language (Ruby, Python, YAML, Terraform) or other application development experience (.NET, Java, Golang, etc.)
• Ability to write in multiple languages including Python, Java, Ruby, and Bash
• Experience setting up SLAs and monitoring infrastructure and applications using tools like Nagios, New Relic, Pingdom, and VictorOps/PagerDuty
• Experience with network configurations (switches, routers, firewalls) and a good understanding of routing and switching, firewalls, and VPN tunnels
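The SLA requirement above has a concrete arithmetic core: an availability target translates directly into a downtime budget per period, which is what the monitoring thresholds are set against. A minimal sketch in Python (the 30-day period is an illustrative assumption; real SLAs define their own measurement window):

```python
def downtime_budget_minutes(sla_percent, period_days=30):
    """Allowed downtime per period for a given availability SLA."""
    total_minutes = period_days * 24 * 60  # 43,200 minutes in a 30-day month
    return total_minutes * (1 - sla_percent / 100)

print(round(downtime_budget_minutes(99.9), 1))   # → 43.2 (minutes per month)
print(round(downtime_budget_minutes(99.99), 1))  # → 4.3 (minutes per month)
```

The jump from "three nines" to "four nines" cuts the budget tenfold, which is why alerting latency matters so much at higher tiers.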

Posted 2 months ago

Apply

9 - 14 years

30 - 40 Lacs

Navi Mumbai

Work from Office

Overview
GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our clients' success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy and financial performance. That's just some of what we do in our quest to build a beautiful company, enjoy the journey and make a difference.

GEP is a place where individuality is prized and talent respected. We're focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievement. We're a learning organization, actively looking for people to help shape, grow and continually improve us. Are you one of us?

GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristic protected by law. We are committed to hiring and valuing a globally diverse work team. For more information, please visit us at GEP.com or check us out on LinkedIn.com.

Responsibilities
The candidate will be responsible for creating infrastructure designs and guiding the development and implementation of infrastructure, applications, systems and processes. This position will work directly with infrastructure, application development and QA teams to build and deploy highly available and scalable systems in private or public cloud environments, along with release management.
• Candidate must have experience on the Azure or GCP cloud platform
• Building highly scalable, highly available private or public infrastructure
• Owning, maintaining, and enhancing the infrastructure and related tools
• Helping build out an entire CI ecosystem, including automated and auto-scaling testing systems
• Designing and implementing monitoring and alerting for production systems used by DevOps staff
• Working closely with developers and other staff to solve DevOps issues with customer-facing services, tools, and apps

Qualifications
• 9+ years of experience working in a DevOps role in a continuous integration environment, especially with Microsoft technologies
• Strong knowledge of configuration management software such as PowerShell and Ansible, and continuous integration tools such as Octopus, Azure DevOps, and Jenkins
• Developing complete solutions considering sizing, infrastructure, data protection, disaster recovery, security, and application requirements on enterprise cloud systems
• Experience working in an Agile development environment with iterative sprint cycles
• Familiarity with database deployment and CI/CD pipelines
• Hands-on experience with CI/CD tools such as VSTS, Azure DevOps, or Jenkins (at least one of these)
• Worked on Docker, containers, Kubernetes, AWS EKS, API Gateway, Application Load Balancer, WAF, and CloudFront
• Experience with Git or GitHub and the gitflow model, including administration and user management; must have worked on the AWS platform for a minimum of 2 years
• Strong understanding of Linux; strong experience with various continuous integration and continuous deployment tools
• Automating builds using MSBuild scripts
• Any scripting language (Ruby, Python, YAML, Terraform) or other application development experience (.NET, Java, Golang, etc.)
• Ability to write in multiple languages including Python, Java, Ruby, and Bash
• Experience setting up SLAs and monitoring infrastructure and applications using tools like Nagios, New Relic, Pingdom, and VictorOps/PagerDuty
• Experience with network configurations (switches, routers, firewalls) and a good understanding of routing and switching, firewalls, and VPN tunnels
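The Kubernetes and AWS EKS requirements above involve working daily with Kubernetes resource quantities, whose notation ("500m" CPU, "256Mi" memory) trips up newcomers. A minimal parsing sketch in Python, assuming only the common milli-CPU and binary memory suffixes (the full Kubernetes quantity grammar also allows decimal suffixes like M and G, which this sketch omits):

```python
def parse_cpu(q):
    """Parse a Kubernetes CPU quantity into cores ('500m' -> 0.5)."""
    return int(q[:-1]) / 1000 if q.endswith("m") else float(q)

# Binary suffixes per Kubernetes quantity notation: Ki/Mi/Gi are powers of 1024.
_MEM = {"Ki": 1024, "Mi": 1024**2, "Gi": 1024**3}

def parse_memory(q):
    """Parse a Kubernetes memory quantity into bytes ('256Mi' -> 268435456)."""
    for suffix, factor in _MEM.items():
        if q.endswith(suffix):
            return int(q[: -len(suffix)]) * factor
    return int(q)

print(parse_cpu("500m"))      # → 0.5
print(parse_memory("256Mi"))  # → 268435456
```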

Posted 2 months ago

Apply

3 - 6 years

20 - 25 Lacs

Hyderabad

Work from Office

Overview
Job Title: Senior DevOps Engineer
Location: Bangalore / Hyderabad / Chennai / Coimbatore
Position: Full-time
Department: Annalect Engineering

Position Overview
Annalect is currently seeking a Senior DevOps Engineer to join our technology team remotely. We are passionate about building distributed back-end systems in a modular and reusable way. We're looking for people who share our passion for data and want to build cool, maintainable, high-quality applications that use this data. In this role you will help shape our technical architecture, participate in the design and development of software products, collaborate with back-end developers from other tracks, and research and evaluate new technical solutions.

Responsibilities
Key Responsibilities:
Build and maintain cloud infrastructure through Terraform IaC.
Cloud networking and orchestration with AWS (EKS, ECS, VPC, S3, ALB, NLB).
Improve and automate processes and procedures.
Construct CI/CD pipelines.
Monitor and handle incident response for the infrastructure, platforms, and core engineering services.
Troubleshoot infrastructure, network, and application issues.
Help identify and troubleshoot problems within the environment.

Qualifications
Required Skills
5+ years of DevOps experience.
5+ years of hands-on experience administering cloud technologies on AWS, especially IAM, VPC, Lambda, EKS, EC2, S3, ECS, CloudFront, ALB, API Gateway, RDS, CodeBuild, SSM, Secrets Manager, etc.
Experience with microservices, containers (Docker), container orchestration (Kubernetes), serverless computing (AWS Lambda), and distributed/scalable systems.
Demonstrable experience using Terraform to provision and configure infrastructure.
Scripting ability: PowerShell, Python, Bash, etc.
Comfortable working with Linux/Unix-based operating systems (Ubuntu preferred).
Familiarity with software development, CI/CD, and DevOps tools (Bitbucket, Jenkins, GitLab, CodeBuild, CodePipeline).
Knowledge of writing Infrastructure as Code (IaC) using Terraform.
A problem-solving attitude; creative, self-motivated, a quick study, and willing to develop new skills.

Additional Skills
Familiarity with data and databases (SQL, MySQL, PostgreSQL, Amazon Aurora, Redis, Amazon Redshift, Google BigQuery).
Knowledge of database administration.
Experience with continuous deployment/continuous delivery (Jenkins, Bamboo).
AWS/GCP/Azure certification is a plus.
Experience in Python coding is welcome.
Passion for data-driven software; all of our tools are built on top of data and require working with data.
Knowledge of IaaS/PaaS architecture with a good understanding of infrastructure and web application security.
Experience with logging/monitoring (CloudWatch, Datadog, Loggly, ELK).
Passion for writing good documentation and creating architecture diagrams.
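The monitoring and incident-response responsibilities above usually involve probing service health with retries, and the standard pattern is exponential backoff so a flapping endpoint isn't hammered. A minimal sketch in Python; the probe function here is a simulated stand-in, not a real endpoint check:

```python
import time

def check_with_backoff(probe, attempts=4, base_delay=0.01):
    """Call probe() until it returns True, doubling the delay between tries."""
    for attempt in range(attempts):
        if probe():
            return True
        time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...
    return False

# Simulated endpoint that recovers on the third probe.
calls = {"n": 0}
def flaky_probe():
    calls["n"] += 1
    return calls["n"] >= 3

print(check_with_backoff(flaky_probe))  # → True
```

Production checks typically add jitter to the delay and a cap on the maximum backoff; those are omitted here for brevity.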

Posted 2 months ago

Apply

5 - 10 years

13 - 23 Lacs

Mangalore, Bengaluru

Work from Office

Position Overview
We are looking for a Technical Lead with hands-on experience in React, Node.js, and cloud platforms like AWS or Azure. You'll drive the development of scalable, high-performance systems using modern architectures, collaborate on migration strategies, and build robust APIs. Strong knowledge of cloud services, containerization, and IoT technologies is essential.

Job Role: Technical Lead
Job Type: Full Time
Experience: Minimum 5+ years
Job Location: Bangalore / Mangalore
Technical Skills: AWS Cloud, Azure Cloud, TypeScript, Node, React

About Us: We are a multi-award-winning creative engineering company. Since 2011, we have worked with our customers as a design and technology enablement partner, helping them on their digital transformation journey.

Roles and Responsibilities:
Evaluate existing systems and propose enhancements to improve efficiency, security, and scalability.
Create technical documentation and architectural guidelines for the development team.
Develop software platforms using event-driven architecture.
Develop high-performance, high-throughput systems.
Define, track, and deliver items to schedule.
Collaborate with cross-functional teams to define migration strategies, timelines, and milestones.

Technical Skills:
Hands-on experience in React and Node.
Hands-on experience with at least one cloud provider such as AWS, GCP, or Azure.
Proficiency with multiple databases, including SQL and NoSQL.
Highly skilled at facilitating and documenting requirements.
Experience developing REST APIs with JSON and XML for data transfer.
Ability to develop both internal-facing and external-facing APIs using JWT and OAuth 2.0.
Good understanding of cloud technologies such as Docker, Kubernetes, MQTT, EKS, Lambda, IoT Core, and Kafka.
Good understanding of messaging systems like SQS and Pub/Sub.
Ability to establish priorities and proceed with objectives without supervision.
Familiarity with HA/DR, scalability, performance, and code optimization.
Good organizational skills and the ability to work on more than one project at a time.
Exceptional attention to detail and good communication skills.
Experience with Amazon Web Services, JIRA, Confluence, Git, Bitbucket.

Other Skills:
Experience working with Go and Python.
Good understanding of IoT systems.
Exposure to or knowledge of the energy industry.

What we offer:
A competitive salary and comprehensive benefits package.
The opportunity to work on international projects and cutting-edge technology.
A dynamic work environment that promotes professional growth, continuous learning, and mentorship.

If you are passionate about working in a collaborative and challenging environment, apply now!
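The JWT requirement in the listing above boils down to signing a payload and verifying the signature on every request. A minimal stdlib-only sketch in Python of the HS256-style pattern (the secret and payload fields are illustrative; a real JWT also carries a header segment and expiry claims, which this sketch omits):

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # illustrative only; real services load this from a secret store

def _b64(data: bytes) -> str:
    """URL-safe base64 without padding, as used in compact JWT segments."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict) -> str:
    """Produce a compact HMAC-SHA256 signed token (JWT-style)."""
    body = _b64(json.dumps(payload, sort_keys=True).encode())
    sig = _b64(hmac.new(SECRET, body.encode(), hashlib.sha256).digest())
    return f"{body}.{sig}"

def verify(token: str) -> bool:
    """Recompute the signature and compare in constant time."""
    body, sig = token.split(".")
    expected = _b64(hmac.new(SECRET, body.encode(), hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign({"sub": "user-42", "scope": "read"})
print(verify(token))        # → True
print(verify(token + "x"))  # → False (tampered signature)
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels when comparing signatures.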

Posted 3 months ago

Apply