35 AWS Stack Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings so they are easy to find in one place, but you apply directly on the original job portal.

8.0 - 10.0 years

12 - 14 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE
At Amgen, we believe that innovation can and should happen across the entire company. Part of the Artificial Intelligence & Data function of the Amgen Technology and Medical Organizations (ATMOS), the AI & Data Innovation Lab (the Lab) is a center for exploration and innovation, focused on integrating and accelerating new technologies and methods that deliver measurable value and competitive advantage. We've built algorithms that predict bone fractures in patients who haven't even been diagnosed with osteoporosis yet. We've built software to help us select clinical trial sites so we can get medicines to patients faster. We've built AI capabilities to standardize and accelerate the authoring of regulatory documents so we can shorten the drug approval cycle. And that's just part of the beginning. Join us!

We are seeking a Senior DevOps Software Engineer to join the Lab's software engineering practice. This role is integral to developing top-tier talent, setting engineering best practices, and evangelizing full-stack development capabilities across the organization. The Senior DevOps Software Engineer will design and implement deployment strategies for AI systems using the AWS stack, ensuring high availability, performance, and scalability of applications.

Roles & Responsibilities:
• Design and implement deployment strategies using the AWS stack, including EKS, ECS, Lambda, SageMaker, and DynamoDB.
• Configure and manage CI/CD pipelines in GitLab to streamline the deployment process.
• Develop, deploy, and manage scalable applications on AWS, ensuring they meet high standards for availability and performance.
• Implement infrastructure-as-code (IaC) to provision and manage cloud resources consistently and reproducibly.
• Collaborate with AI product design and development teams to ensure seamless integration of AI models into the infrastructure.
• Monitor and optimize the performance of deployed AI systems, addressing any issues related to scaling, availability, and performance.
• Lead and develop standards, processes, and best practices for the team across the AI system deployment lifecycle.
• Stay updated on emerging technologies and best practices in AI infrastructure and AWS services to continuously improve deployment strategies.
• Familiarity with AI concepts such as traditional AI, generative AI, and agentic AI, with the ability to learn and adopt new skills quickly.

Functional Skills:
• Deep expertise in designing and maintaining CI/CD pipelines and enabling software engineering best practices across the overall software product development lifecycle.
• Ability to implement automated testing, build, deployment, and rollback strategies.
• Advanced proficiency in managing and deploying infrastructure on the AWS cloud platform, including cost planning, tracking, and optimization.
• Proficiency with backend languages and frameworks (Python, FastAPI, Flask preferred).
• Experience with databases (Postgres/DynamoDB).
• Experience with microservices architecture and containerization (Docker, Kubernetes).

Good-to-Have Skills:
• Familiarity with enterprise software systems in life sciences or healthcare domains.
• Familiarity with big data platforms and experience in data pipeline development (Databricks, Spark).
• Knowledge of data security, privacy regulations, and scalable software solutions.

Soft Skills:
• Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
• Ability to foster a collaborative and innovative work environment.
• Strong problem-solving abilities and attention to detail.
• High degree of initiative and self-motivation.

Basic Qualifications:
• Bachelor's degree in Computer Science, AI, Software Engineering, or a related field.
• 8+ years of experience in full-stack software engineering.
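
For illustration only: the posting above centers on AWS Lambda and DynamoDB deployments, so the sketch below shows a minimal Python Lambda handler that persists an inference record with boto3. The table name, event fields, and environment variable are assumptions made for this example, not details from the role.

```python
# Illustrative only: a minimal AWS Lambda handler persisting AI inference
# results to DynamoDB with boto3. The table name, event shape, and field
# names are hypothetical.
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("RESULTS_TABLE", "ai-inference-results"))


def lambda_handler(event, context):
    """Store one inference record and return a confirmation."""
    record = {
        "request_id": event["request_id"],
        "model_version": event.get("model_version", "unknown"),
        "prediction": json.dumps(event.get("prediction", {})),
    }
    table.put_item(Item=record)
    return {"statusCode": 200, "body": json.dumps({"stored": record["request_id"]})}
```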

Posted 1 month ago

Apply

6.0 - 10.0 years

3 - 6 Lacs

Pune

Work from Office

Job Information
Job Opening ID: ZR_2099_JOB
Date Opened: 22/01/2024
Industry: Technology
Job Title: Golang Developer
Work Experience: 6-10 years
City: Pune City
Province: Maharashtra
Country: India
Postal Code: 411001
Number of Positions: 4
Locations: Bangalore & Pune
Work Mode: Hybrid

Responsibilities:
• Work effectively as a member of a self-organized agile team that builds, owns, and runs the service.
• Contribute to all aspects of service development, including back-end and quality.
• Assist in the operation of the service, e.g. monitoring, alerting, metrics, logging, and troubleshooting.
• Work closely with architects and product management to understand requirements and translate them into elegant implementations.
• Use current system behaviour to identify opportunities for continuous improvement of the scalability, reliability, usability, and security of the system.
• Excellent troubleshooting skills; able to debug complex technical issues involving multiple system components.

Minimum Qualifications:
• BS or MS in Computer Science or a related technical field
• 6 - 10 years of experience building web applications
• Experience with Golang / Web API / RESTful API design
• Experience building cloud-native web services with high performance and high availability at web scale
• Good understanding of software design and architectural patterns
• Committed to quality, including security and performance
• Experience with agile methodologies (Scrum or Kanban)
• Strong verbal and written communication skills
• Experience with relational data stores such as MSSQL / MySQL

Preferred Qualifications:
• Strong design and coding skills, with the ability to pick up new languages, tools, and design patterns as needed.
• Experience with the AWS stack is a plus.

Posted 1 month ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Gurugram

Work from Office

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you’ll do
We are looking for experienced Knowledge Graph developers with the following technical skillsets and experience.
• Undertake complete ownership of activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
• Apply appropriate development methodologies (e.g. agile, waterfall) and best practices (e.g. mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
• Collaborate with other team members to leverage expertise and ensure seamless transitions; exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
• Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
• Bring transparency in driving assigned tasks to completion and report accurate status.
• Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate with other teams.
• Assist senior team members and delivery leads in project management responsibilities.
• Build complex solutions using programming languages, ETL service platforms, etc.

What you’ll bring
• Bachelor’s or master’s degree in Computer Science, Engineering, or a related field.
• 4+ years of professional experience in Knowledge Graph development with Neo4j, AWS Neptune, or the Anzo knowledge graph database.
• 3+ years of experience in RDF ontologies, data modelling and ontology development.
• Strong expertise in Python, PySpark and SQL.
• Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
• Project management and task planning experience, ensuring smooth execution of deliverables and timelines.
• Strong communication and interpersonal skills to collaborate with both technical and non-technical teams.
• Experience with automation testing.
• Performance Optimization: knowledge of techniques to optimize knowledge graph operations such as data inserts.
• Data Modeling: proficiency in designing effective data models within the Knowledge Graph, including relationships between tables and optimizing data for reporting.
• Motivation and willingness to learn new tools and technologies as per the team’s requirements.

Additional Skills:
• Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations.
• Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
• Experience with manufacturing data is a plus.
• Capability to simplify complex concepts into easily understandable frameworks and presentations.
• Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects.
• Travel to other offices as required to collaborate with clients and internal project teams.

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.

NO AGENCY CALLS, PLEASE.

Find Out More At www.zs.com
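
For illustration only: since the role above centers on RDF ontologies and SPARQL, here is a minimal Python sketch using rdflib to build a tiny graph and query it. The namespace, ontology terms, and data are invented for the example, not taken from the posting.

```python
# Illustrative only: a tiny RDF graph queried with SPARQL via rdflib.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/pharma#")  # hypothetical namespace

g = Graph()
g.add((EX.ProductA, RDF.type, EX.Drug))
g.add((EX.ProductA, EX.treats, EX.Osteoporosis))
g.add((EX.ProductA, EX.approvedIn, Literal(2021)))

query = """
PREFIX ex: <http://example.org/pharma#>
SELECT ?drug ?year
WHERE {
    ?drug a ex:Drug ;
          ex:treats ex:Osteoporosis ;
          ex:approvedIn ?year .
}
"""

# Print every drug in the graph that treats the example condition.
for drug, year in g.query(query):
    print(drug, year)
```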

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must-have skills: Python (Programming Language)
Good-to-have skills: AWS Administration
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS and be responsible for creating scalable data pipelines and algorithms that process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services.
2. Architect and implement cloud-native solutions using AWS services.
3. Mentor and guide the Python development team, promoting best practices and code quality.
4. Collaborate with data scientists and analysts to implement data processing pipelines.
5. Participate in architecture discussions and contribute to technical decision-making.
6. Ensure the scalability, reliability, and performance of Python applications on AWS.
7. Stay current with Python ecosystem developments, AWS services, and industry best practices.

Professional & Technical Skills:
1. Python programming.
2. Web framework expertise (Django, Flask, or FastAPI).
3. Data processing and analysis.
4. Database technologies (SQL and NoSQL).
5. API development.
6. Significant experience working with AWS Lambda.
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus.
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform).
9. Test-Driven Development (TDD).
10. DevOps practices.
11. Agile methodologies.
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena).
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM).

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; a master's degree is preferred).

Qualification: 15 years full-time education
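
For illustration only: the role above combines Python web frameworks with AWS services, so the sketch below shows a minimal FastAPI endpoint that reads a pre-computed vehicle summary from S3 via boto3. The bucket, key layout, and payload shape are assumptions for the example, not details from the posting.

```python
# Illustrative only: a minimal FastAPI endpoint backed by S3.
import json

import boto3
from fastapi import FastAPI, HTTPException

app = FastAPI()
s3 = boto3.client("s3")

BUCKET = "vehicle-insights-example"  # hypothetical bucket


@app.get("/vehicles/{vehicle_id}/summary")
def vehicle_summary(vehicle_id: str):
    """Return a pre-computed summary object stored as JSON in S3."""
    try:
        obj = s3.get_object(Bucket=BUCKET, Key=f"summaries/{vehicle_id}.json")
    except s3.exceptions.NoSuchKey:
        raise HTTPException(status_code=404, detail="summary not found")
    return json.loads(obj["Body"].read())
```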

Posted 2 months ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS and be responsible for creating scalable data pipelines and algorithms that process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services.
2. Architect and implement cloud-native solutions using AWS services.
3. Mentor and guide the Python development team, promoting best practices and code quality.
4. Collaborate with data scientists and analysts to implement data processing pipelines.
5. Participate in architecture discussions and contribute to technical decision-making.
6. Ensure the scalability, reliability, and performance of Python applications on AWS.
7. Stay current with Python ecosystem developments, AWS services, and industry best practices.

Professional & Technical Skills:
1. Python programming.
2. Web framework expertise (Django, Flask, or FastAPI).
3. Data processing and analysis.
4. Database technologies (SQL and NoSQL).
5. API development.
6. Significant experience working with AWS Lambda.
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus.
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform).
9. Test-Driven Development (TDD).
10. DevOps practices.
11. Agile methodologies.
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena).
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM).

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; a master's degree is preferred).

Qualification: 15 years full-time education
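
For illustration only: this posting lists Test-Driven Development among its skills, so below is a small pytest-style sketch with tests written against a hypothetical unit-conversion helper. The function and its behaviour are invented for the example.

```python
# Illustrative only: a test-first sketch for a small data-processing helper.
import pytest


def normalize_speed_kmh(raw_mph: float) -> float:
    """Convert a speed reading from mph to km/h, rejecting negatives."""
    if raw_mph < 0:
        raise ValueError("speed cannot be negative")
    return round(raw_mph * 1.60934, 2)


def test_normalize_speed_kmh_converts_units():
    assert normalize_speed_kmh(10) == 16.09


def test_normalize_speed_kmh_rejects_negative_values():
    with pytest.raises(ValueError):
        normalize_speed_kmh(-1)
```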

Posted 2 months ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS and be responsible for creating scalable data pipelines and algorithms that process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services.
2. Architect and implement cloud-native solutions using AWS services.
3. Mentor and guide the Python development team, promoting best practices and code quality.
4. Collaborate with data scientists and analysts to implement data processing pipelines.
5. Participate in architecture discussions and contribute to technical decision-making.
6. Ensure the scalability, reliability, and performance of Python applications on AWS.
7. Stay current with Python ecosystem developments, AWS services, and industry best practices.

Professional & Technical Skills:
1. Python programming.
2. Web framework expertise (Django, Flask, or FastAPI).
3. Data processing and analysis.
4. Database technologies (SQL and NoSQL).
5. API development.
6. Significant experience working with AWS Lambda.
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus.
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform).
9. Test-Driven Development (TDD).
10. DevOps practices.
11. Agile methodologies.
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena).
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM).

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; a master's degree is preferred).

Qualification: 15 years full-time education
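
For illustration only: the posting above mentions data warehousing on AWS (e.g., Athena), so the sketch below starts an Athena query with boto3 and polls until it finishes. The database, table, and results bucket are placeholders, not details from the role.

```python
# Illustrative only: running an Athena query with boto3 and polling its state.
import time

import boto3

athena = boto3.client("athena")

response = athena.start_query_execution(
    QueryString="SELECT vehicle_id, COUNT(*) AS events FROM telemetry GROUP BY vehicle_id",
    QueryExecutionContext={"Database": "analytics_example"},   # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query reaches a terminal state, then report it.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

print(state)
```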

Posted 2 months ago

Apply

3 - 5 years

6 - 10 Lacs

Gurugram

Work from Office

Position Summary: A Data Engineer designs and maintains scalable data pipelines and storage systems, with a focus on integrating and processing knowledge graph data for semantic insights. They enable efficient data flow, ensure data quality, and support analytics and machine learning by leveraging advanced graph-based technologies.

How You’ll Make an Impact (responsibilities of the role)
• Build and optimize ETL/ELT pipelines for knowledge graphs and other data sources.
• Design and manage graph databases (e.g., Neo4j, AWS Neptune, ArangoDB).
• Develop semantic data models using RDF, OWL, and SPARQL.
• Integrate structured, semi-structured, and unstructured data into knowledge graphs.
• Ensure data quality, security, and compliance with governance standards.
• Collaborate with data scientists and architects to support graph-based analytics.

What You Bring (required qualifications and skills)
• Bachelor’s/master’s degree in Computer Science, Data Science, or related fields.
• Experience: 3+ years of experience in data engineering, with knowledge graph expertise.
• Proficiency in Python, SQL, and graph query languages (SPARQL, Cypher).
• Experience with graph databases and frameworks (Neo4j, GraphQL, RDF).
• Knowledge of cloud platforms (AWS, Azure).
• Strong problem-solving and data modeling skills.
• Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders.
• The ability to work collaboratively in a dynamic team environment across the globe.
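
For illustration only: given the graph database and Cypher skills listed above, here is a minimal sketch using the official neo4j Python driver (5.x API assumed) to merge and query a tiny property graph. The connection details and data model are placeholders, not details from the posting.

```python
# Illustrative only: loading and querying a small property graph with the
# neo4j Python driver. URI, credentials, and labels are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))


def link_drug_to_condition(tx, drug: str, condition: str):
    # MERGE keeps the load idempotent: nodes and the edge are created once.
    tx.run(
        "MERGE (d:Drug {name: $drug}) "
        "MERGE (c:Condition {name: $condition}) "
        "MERGE (d)-[:TREATS]->(c)",
        drug=drug,
        condition=condition,
    )


def conditions_for(tx, drug: str):
    result = tx.run(
        "MATCH (d:Drug {name: $drug})-[:TREATS]->(c:Condition) RETURN c.name AS name",
        drug=drug,
    )
    return [record["name"] for record in result]


with driver.session() as session:
    session.execute_write(link_drug_to_condition, "ProductA", "Osteoporosis")
    print(session.execute_read(conditions_for, "ProductA"))

driver.close()
```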

Posted 2 months ago

Apply

5 - 8 years

14 - 19 Lacs

Noida

Work from Office

• Minimum of 5-6 years of experience developing, testing, and deploying Python-based applications on Azure/AWS platforms
• Basic knowledge of Generative AI / LLM / GPT concepts
• Deep understanding of architecture and work experience with web technologies
• Hands-on experience with Python and SQL
• Expertise in a popular Python web framework, e.g. Flask or Django
• Familiarity with frontend technologies like HTML, JavaScript, and React
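
For illustration only: a minimal Flask JSON endpoint of the kind this role describes. The route, payload shape, and summarize_text helper are invented for the sketch; a production service would call an actual generative model instead of the stub.

```python
# Illustrative only: a small Flask JSON API with a placeholder "summarize" step.
from flask import Flask, jsonify, request

app = Flask(__name__)


def summarize_text(text: str) -> str:
    """Placeholder for a call to a generative model."""
    return text[:100] + ("..." if len(text) > 100 else "")


@app.route("/summarize", methods=["POST"])
def summarize():
    payload = request.get_json(force=True)
    if not payload or "text" not in payload:
        return jsonify({"error": "missing 'text' field"}), 400
    return jsonify({"summary": summarize_text(payload["text"])})


if __name__ == "__main__":
    app.run(debug=True)
```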

Posted 2 months ago

Apply

2 - 4 years

12 - 14 Lacs

Navi Mumbai

Work from Office

Overview
GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our clients’ success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy and financial performance. That’s just some of what we do in our quest to build a beautiful company, enjoy the journey and make a difference. GEP is a place where individuality is prized and talent respected. We’re focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievements. We’re a learning organization, actively looking for people to help shape, grow and continually improve us. Are you one of us?

GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristics protected by law. We are committed to hiring and valuing a globally diverse work team. For more information, please visit us on GEP.com or check us out on LinkedIn.com.

Responsibilities
The candidate will be responsible for creating infrastructure designs and guiding the development and implementation of infrastructure, applications, systems and processes. This position works directly with infrastructure, application development and QA teams to build and deploy highly available and scalable systems in private or public cloud environments, along with release management.
• Candidate must have experience on the Azure or GCP cloud platform
• Building highly scalable, highly available, private or public infrastructure
• Owning, maintaining and enhancing the infrastructure and the related tools
• Help build out an entire CI ecosystem, including automated and auto-scaling testing systems
• Design and implement monitoring and alerting for production systems used by DevOps staff
• Work closely with developers and other staff to solve DevOps issues with customer-facing services, tools and apps

Qualifications / Requirements
• 2+ years of experience working in a DevOps role in a continuous integration environment, especially with Microsoft technologies
• Strong knowledge of configuration management software such as PowerShell and Ansible, and continuous integration tools such as Octopus, Azure DevOps and Jenkins
• Developing complete solutions considering sizing, infrastructure, data protection, disaster recovery, security and application requirements on cloud enterprise systems
• Experience adhering to an Agile development environment and iterative sprint cycle
• Familiarity with database deployment and CI/CD pipelines
• Hands-on experience with CI/CD tools like VSTS, Azure DevOps or Jenkins (experience with at least one of these tools)
• Worked on Docker, containers, Kubernetes, AWS EKS, API Gateway, Application Load Balancer, WAF, CloudFront
• Experience with Git or GitHub and the gitflow model, administration and user management; must have worked on the AWS platform for a minimum of 2 years
• Strong understanding of Linux; strong experience with various Continuous Integration and Continuous Deployment tools
• Automating builds using MSBuild scripts
• Any scripting language (Ruby, Python, YAML, Terraform) or other application development experience (.NET, Java, Golang, etc.)
• Ability to write in multiple languages, including Python, Java, Ruby and Bash scripting
• Experience setting up SLAs and monitoring infrastructure and applications using tools like Nagios, New Relic, Pingdom, and VictorOps/PagerDuty
• Experience with network configurations (switches, routers, firewalls) and a good understanding of routing and switching, firewalls, and VPN tunnels.
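
For illustration only: the role above includes designing monitoring and alerting for production systems, so the sketch below is a small Python availability check that could feed such alerting. The endpoints, timeout, and exit-code convention are placeholders.

```python
# Illustrative only: a tiny HTTP availability check for monitoring pipelines.
import sys

import requests

ENDPOINTS = [
    "https://example.internal/healthz",      # hypothetical service endpoints
    "https://example.internal/api/status",
]
TIMEOUT_SECONDS = 5

failures = []
for url in ENDPOINTS:
    try:
        response = requests.get(url, timeout=TIMEOUT_SECONDS)
        if response.status_code != 200:
            failures.append(f"{url} returned {response.status_code}")
    except requests.RequestException as exc:
        failures.append(f"{url} unreachable: {exc}")

if failures:
    # A real setup would page via PagerDuty/VictorOps instead of printing.
    print("\n".join(failures))
    sys.exit(1)

print("all endpoints healthy")
```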

Posted 2 months ago

Apply

9 - 14 years

30 - 40 Lacs

Navi Mumbai

Work from Office

Overview
GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our clients’ success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy and financial performance. That’s just some of what we do in our quest to build a beautiful company, enjoy the journey and make a difference. GEP is a place where individuality is prized and talent respected. We’re focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievements. We’re a learning organization, actively looking for people to help shape, grow and continually improve us. Are you one of us?

GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristics protected by law. We are committed to hiring and valuing a globally diverse work team. For more information, please visit us on GEP.com or check us out on LinkedIn.com.

Responsibilities
The candidate will be responsible for creating infrastructure designs and guiding the development and implementation of infrastructure, applications, systems and processes. This position works directly with infrastructure, application development and QA teams to build and deploy highly available and scalable systems in private or public cloud environments, along with release management.
• Candidate must have experience on the Azure or GCP cloud platform
• Building highly scalable, highly available, private or public infrastructure
• Owning, maintaining and enhancing the infrastructure and the related tools
• Help build out an entire CI ecosystem, including automated and auto-scaling testing systems
• Design and implement monitoring and alerting for production systems used by DevOps staff
• Work closely with developers and other staff to solve DevOps issues with customer-facing services, tools and apps

Qualifications
• 9+ years of experience working in a DevOps role in a continuous integration environment, especially with Microsoft technologies
• Strong knowledge of configuration management software such as PowerShell and Ansible, and continuous integration tools such as Octopus, Azure DevOps and Jenkins
• Developing complete solutions considering sizing, infrastructure, data protection, disaster recovery, security and application requirements on cloud enterprise systems
• Experience adhering to an Agile development environment and iterative sprint cycle
• Familiarity with database deployment and CI/CD pipelines
• Hands-on experience with CI/CD tools like VSTS, Azure DevOps or Jenkins (experience with at least one of these tools)
• Worked on Docker, containers, Kubernetes, AWS EKS, API Gateway, Application Load Balancer, WAF, CloudFront
• Experience with Git or GitHub and the gitflow model, administration and user management; must have worked on the AWS platform for a minimum of 2 years
• Strong understanding of Linux; strong experience with various Continuous Integration and Continuous Deployment tools
• Automating builds using MSBuild scripts
• Any scripting language (Ruby, Python, YAML, Terraform) or other application development experience (.NET, Java, Golang, etc.)
• Ability to write in multiple languages, including Python, Java, Ruby and Bash scripting
• Experience setting up SLAs and monitoring infrastructure and applications using tools like Nagios, New Relic, Pingdom, and VictorOps/PagerDuty
• Experience with network configurations (switches, routers, firewalls) and a good understanding of routing and switching, firewalls, and VPN tunnels.

Posted 2 months ago

Apply
Page 2 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies