
66 AWS Stack Jobs - Page 3

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

8.0 - 10.0 years

12 - 14 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE
At Amgen, we believe that innovation can and should be happening across the entire company. Part of the Artificial Intelligence & Data function of the Amgen Technology and Medical Organizations (ATMOS), the AI & Data Innovation Lab (the Lab) is a center for exploration and innovation, focused on integrating and accelerating new technologies and methods that deliver measurable value and competitive advantage. We've built algorithms that predict bone fractures in patients who haven't even been diagnosed with osteoporosis yet. We've built software to help us select clinical trial sites so we can get medicines to patients faster. We've built AI capabilities to standardize and accelerate the authoring of regulatory documents so we can shorten the drug approval cycle. And that's just part of the beginning. Join us!

We are seeking a Senior DevOps Software Engineer to join the Lab's software engineering practice. This role is integral to developing top-tier talent, setting engineering best practices, and evangelizing full-stack development capabilities across the organization. The Senior DevOps Software Engineer will design and implement deployment strategies for AI systems using the AWS stack, ensuring high availability, performance, and scalability of applications.

Roles & Responsibilities:
• Design and implement deployment strategies using the AWS stack, including EKS, ECS, Lambda, SageMaker, and DynamoDB.
• Configure and manage CI/CD pipelines in GitLab to streamline the deployment process.
• Develop, deploy, and manage scalable applications on AWS, ensuring they meet high standards for availability and performance.
• Implement infrastructure-as-code (IaC) to provision and manage cloud resources consistently and reproducibly (see the sketch after this listing).
• Collaborate with AI product design and development teams to ensure seamless integration of AI models into the infrastructure.
• Monitor and optimize the performance of deployed AI systems, addressing any issues related to scaling, availability, and performance.
• Lead and develop standards, processes, and best practices for the team across the AI system deployment lifecycle.
• Stay updated on emerging technologies and best practices in AI infrastructure and AWS services to continuously improve deployment strategies.
• Familiarity with AI concepts such as traditional AI, generative AI, and agentic AI, with the ability to learn and adopt new skills quickly.

Functional Skills:
• Deep expertise in designing and maintaining CI/CD pipelines, enabling software engineering best practices, and managing the overall software product development lifecycle.
• Ability to implement automated testing, build, deployment, and rollback strategies.
• Advanced proficiency in managing and deploying infrastructure on the AWS cloud platform, including cost planning, tracking, and optimization.
• Proficiency with backend languages and frameworks (Python, FastAPI, Flask preferred).
• Experience with databases (Postgres/DynamoDB).
• Experience with microservices architecture and containerization (Docker, Kubernetes).

Good-to-Have Skills:
• Familiarity with enterprise software systems in life sciences or healthcare domains.
• Familiarity with big data platforms and experience in data pipeline development (Databricks, Spark).
• Knowledge of data security, privacy regulations, and scalable software solutions.

Soft Skills:
• Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
• Ability to foster a collaborative and innovative work environment.
• Strong problem-solving abilities and attention to detail.
• High degree of initiative and self-motivation.

Basic Qualifications:
• Bachelor's degree in Computer Science, AI, Software Engineering, or a related field.
• 8+ years of experience in full-stack software engineering.
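As a point of reference for the infrastructure-as-code responsibility above, here is a minimal, hedged sketch of how a Lambda function and a DynamoDB table might be provisioned with the AWS CDK in Python. The stack name, construct IDs, and asset path are illustrative assumptions, not part of the posting; running `cdk deploy` from a GitLab CI job would be one way to tie this into the pipeline duties described above.

```python
# Minimal AWS CDK (v2, Python) sketch: provision a Lambda function and a
# DynamoDB table as code. Names, IDs, and the "lambda/" asset path are
# hypothetical; install with `pip install aws-cdk-lib constructs`.
from aws_cdk import App, Stack, RemovalPolicy
from aws_cdk import aws_dynamodb as dynamodb
from aws_cdk import aws_lambda as _lambda
from constructs import Construct


class AiServiceStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # DynamoDB table keyed by a string partition key.
        table = dynamodb.Table(
            self,
            "PredictionsTable",
            partition_key=dynamodb.Attribute(
                name="prediction_id", type=dynamodb.AttributeType.STRING
            ),
            removal_policy=RemovalPolicy.DESTROY,  # dev-only convenience
        )

        # Lambda function whose handler code lives in ./lambda/handler.py.
        fn = _lambda.Function(
            self,
            "InferenceApiFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="handler.main",
            code=_lambda.Code.from_asset("lambda"),
            environment={"TABLE_NAME": table.table_name},
        )

        # Grant the function read/write access to the table.
        table.grant_read_write_data(fn)


app = App()
AiServiceStack(app, "AiServiceStack")
app.synth()
```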

Posted 3 months ago

Apply

6.0 - 10.0 years

3 - 6 Lacs

Pune

Work from Office

Job Information: Job Opening ID: ZR_2099_JOB | Date Opened: 22/01/2024 | Industry: Technology | Job Type: | Work Experience: 6-10 years | Job Title: Golang Developer | City: Pune City | Province: Maharashtra | Country: India | Postal Code: 411001 | Number of Positions: 4 | Locations: Bangalore & Pune | Work Mode: Hybrid

Work effectively as a member of a self-organized agile team that builds, owns, and runs the service. Contribute to all aspects of service development, including back-end and quality. Assist in the operation of the service, e.g. monitoring, alerting, metrics, logging, and troubleshooting. Work closely with architects and product management to understand requirements and translate them into elegant implementations. Use current system behaviour to identify opportunities for continuous improvement of the scalability, reliability, usability, and security of the system. Excellent troubleshooting skills; able to debug complex technical issues involving multiple system components.

Minimum Qualifications:
• BS or MS in Computer Science or a related technical field
• 6-10 years of experience building web applications
• Experience with Golang / Web API / RESTful API design
• Experience building cloud-native web services with high performance and high availability at web scale
• Good understanding of software design and architectural patterns
• Committed to quality, including security and performance
• Experience with agile methodologies (Scrum or Kanban)
• Strong verbal and written communication skills
• Experience with relational data stores such as MSSQL / MySQL

Preferred Qualifications:
• Strong design and coding skills with the ability to pick up new languages, tools, and design patterns as needed
• Experience with the AWS stack is a plus

Posted 3 months ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Gurugram

Work from Office

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences, and belief systems: the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you'll do:
We are looking for experienced Knowledge Graph developers who have the following technical skillsets and experience.
• Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
• Apply appropriate development methodologies (e.g. agile, waterfall) and best practices (e.g. mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
• Collaborate with other team members to leverage expertise and ensure seamless transitions; exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
• Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
• Bring transparency in driving assigned tasks to completion and report accurate status.
• Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
• Assist senior team members and delivery leads in project management responsibilities.
• Build complex solutions using programming languages, ETL service platforms, etc.

What you'll bring:
• Bachelor's or master's degree in Computer Science, Engineering, or a related field.
• 4+ years of professional experience in Knowledge Graph development in Neo4j, AWS Neptune, or Anzo knowledge graph databases.
• 3+ years of experience in RDF ontologies, data modelling, and ontology development.
• Strong expertise in Python, PySpark, and SQL.
• Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
• Project management and task planning experience, ensuring smooth execution of deliverables and timelines.
• Strong communication and interpersonal skills to collaborate with both technical and non-technical teams.
• Experience with automation testing.
• Performance optimization: knowledge of techniques to optimize knowledge graph operations such as data inserts (see the sketch after this listing).
• Data modeling: proficiency in designing effective data models within a knowledge graph, including relationships between tables and optimizing data for reporting.
• Motivation and willingness to learn new tools and technologies as per the team's requirements.

Additional Skills:
• Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations.
• Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
• Experience in manufacturing data is a plus.
• Capability to simplify complex concepts into easily understandable frameworks and presentations.
• Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects.
• Travel to other offices as required to collaborate with clients and internal project teams.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and a global team member.

We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find out more at www.zs.com
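To ground the knowledge graph skills listed above, here is a minimal, hedged sketch of a batched node insert with the official Neo4j Python driver. The connection URI, credentials, and the `Product` label are illustrative assumptions; batching with `UNWIND` is one common way to keep bulk inserts efficient, in line with the performance-optimization point above.

```python
# Minimal sketch: batched inserts into Neo4j with the official Python driver
# (`pip install neo4j`). Connection details and the Product label are
# hypothetical examples, not taken from the posting.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # assumed local instance
AUTH = ("neo4j", "password")    # assumed credentials

records = [
    {"id": "P-001", "name": "Drug A"},
    {"id": "P-002", "name": "Drug B"},
]

# UNWIND lets one query merge many nodes, which is usually far cheaper
# than issuing one CREATE/MERGE per record.
INSERT_QUERY = """
UNWIND $rows AS row
MERGE (p:Product {id: row.id})
SET p.name = row.name
"""

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        # Write all rows in a single managed transaction.
        session.execute_write(
            lambda tx: tx.run(INSERT_QUERY, rows=records).consume()
        )
        # Quick sanity check: count the Product nodes we just merged.
        count = session.execute_read(
            lambda tx: tx.run("MATCH (p:Product) RETURN count(p) AS c").single()["c"]
        )
        print(f"Product nodes in graph: {count}")
```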

Posted 3 months ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must-have skills: Python (Programming Language)
Good-to-have skills: AWS Administration
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS. You will be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda (a minimal handler sketch follows this listing)
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; a master's degree is preferred).

Qualification: 15 years of full-time education
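As a companion to the AWS Lambda requirement above, here is a minimal, hedged sketch of a Python Lambda handler that writes a vehicle telemetry record to DynamoDB with boto3. The table name, environment variable, and event field names are illustrative assumptions, not taken from the posting.

```python
# Minimal AWS Lambda handler sketch (Python + boto3). The "VehicleTelemetry"
# table and the event fields are hypothetical; boto3 ships with the Lambda
# Python runtime by default.
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
TABLE_NAME = os.environ.get("TABLE_NAME", "VehicleTelemetry")  # assumed env var


def lambda_handler(event, context):
    """Persist one telemetry record and return a simple API-style response."""
    table = dynamodb.Table(TABLE_NAME)
    record = {
        "vehicle_id": event["vehicle_id"],        # assumed event fields
        "timestamp": event["timestamp"],
        "speed_kmh": int(event.get("speed_kmh", 0)),
    }
    table.put_item(Item=record)
    return {
        "statusCode": 200,
        "body": json.dumps({"stored": record["vehicle_id"]}),
    }
```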

Posted 3 months ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS. You will be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI); a minimal FastAPI sketch follows this listing
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; a master's degree is preferred).

Qualification: 15 years of full-time education
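The web framework requirement above (Django, Flask, or FastAPI) can be illustrated with a minimal, hedged FastAPI sketch exposing one vehicle-data endpoint. The route, model fields, and in-memory store are illustrative assumptions standing in for a real AWS-backed database.

```python
# Minimal FastAPI sketch (`pip install fastapi uvicorn`). The /vehicles route
# and the in-memory store are hypothetical examples, not from the posting.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Vehicle Insights API")

# Stand-in for a real database such as RDS or DynamoDB.
_vehicles: dict[str, dict] = {}


class Vehicle(BaseModel):
    vehicle_id: str
    name: str
    mileage_km: int = 0


@app.post("/vehicles")
def create_vehicle(vehicle: Vehicle) -> dict:
    # Store the validated payload keyed by vehicle_id.
    _vehicles[vehicle.vehicle_id] = vehicle.model_dump()
    return {"created": vehicle.vehicle_id}


@app.get("/vehicles/{vehicle_id}")
def get_vehicle(vehicle_id: str) -> dict:
    if vehicle_id not in _vehicles:
        raise HTTPException(status_code=404, detail="vehicle not found")
    return _vehicles[vehicle_id]

# Run locally with: uvicorn main:app --reload
```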

Posted 3 months ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS. You will be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; a master's degree is preferred).

Qualification: 15 years of full-time education

Posted 3 months ago

Apply

3 - 5 years

6 - 10 Lacs

Gurugram

Work from Office

Position Summary: A Data Engineer designs and maintains scalable data pipelines and storage systems, with a focus on integrating and processing knowledge graph data for semantic insights. They enable efficient data flow, ensure data quality, and support analytics and machine learning by leveraging advanced graph-based technologies.

How You'll Make an Impact (responsibilities of role):
• Build and optimize ETL/ELT pipelines for knowledge graphs and other data sources.
• Design and manage graph databases (e.g., Neo4j, AWS Neptune, ArangoDB).
• Develop semantic data models using RDF, OWL, and SPARQL (see the sketch after this listing).
• Integrate structured, semi-structured, and unstructured data into knowledge graphs.
• Ensure data quality, security, and compliance with governance standards.
• Collaborate with data scientists and architects to support graph-based analytics.

What You Bring (required qualifications and skills):
• Bachelor's/Master's in Computer Science, Data Science, or related fields.
• Experience: 3+ years of experience in data engineering, with knowledge graph expertise.
• Proficiency in Python, SQL, and graph query languages (SPARQL, Cypher).
• Experience with graph databases and frameworks (Neo4j, GraphQL, RDF).
• Knowledge of cloud platforms (AWS, Azure).
• Strong problem-solving and data modeling skills.
• Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders.
• The ability to work collaboratively in a dynamic team environment across the globe.
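For the RDF/SPARQL responsibility above, here is a minimal, hedged sketch using rdflib in Python: it builds a tiny in-memory graph and runs a SPARQL query over it. The example namespace and triples are illustrative assumptions, purely to show the shape of semantic modelling and querying.

```python
# Minimal RDF + SPARQL sketch with rdflib (`pip install rdflib`). The EX
# namespace and the triples are hypothetical, for illustration only.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")

g = Graph()
g.bind("ex", EX)

# A couple of toy triples: two sensors attached to one asset.
asset = EX["asset/42"]
g.add((asset, RDF.type, EX.Asset))
g.add((asset, EX.hasSensor, EX["sensor/1"]))
g.add((asset, EX.hasSensor, EX["sensor/2"]))
g.add((EX["sensor/1"], EX.label, Literal("temperature")))
g.add((EX["sensor/2"], EX.label, Literal("vibration")))

# SPARQL: list sensors and labels for every asset.
query = """
PREFIX ex: <http://example.org/>
SELECT ?asset ?sensor ?label WHERE {
    ?asset a ex:Asset ;
           ex:hasSensor ?sensor .
    ?sensor ex:label ?label .
}
"""

for row in g.query(query):
    print(f"{row.asset} -> {row.sensor} ({row.label})")
```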

Posted 4 months ago

Apply

5 - 8 years

14 - 19 Lacs

Noida

Work from Office

• Minimum of 5-6 years of experience developing, testing, and deploying Python-based applications on Azure/AWS platforms
• Must have basic knowledge of Generative AI / LLM / GPT concepts
• Deep understanding of architecture and work experience with web technologies
• Hands-on experience with Python and SQL
• Expertise in any popular Python web framework, e.g. Flask, Django (see the sketch after this list)
• Familiarity with frontend technologies like HTML, JavaScript, React
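As a small illustration of the Python web-framework plus GenAI combination above, here is a hedged Flask sketch with a stubbed text-generation function standing in for a real LLM call. The route name and the stub are assumptions, not part of the posting.

```python
# Minimal Flask sketch (`pip install flask`). The /generate route and the
# stubbed generate_text() are hypothetical; a real service would call an
# LLM/GPT provider here instead.
from flask import Flask, jsonify, request

app = Flask(__name__)


def generate_text(prompt: str) -> str:
    """Stand-in for an LLM/GPT call; replace with a real client."""
    return f"(stub) summary of: {prompt[:80]}"


@app.post("/generate")
def generate():
    payload = request.get_json(silent=True) or {}
    prompt = payload.get("prompt", "")
    if not prompt:
        return jsonify({"error": "prompt is required"}), 400
    return jsonify({"completion": generate_text(prompt)})


if __name__ == "__main__":
    app.run(debug=True)
```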

Posted 4 months ago

Apply

2 - 4 years

12 - 14 Lacs

Navi Mumbai

Work from Office

Overview: GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our client's success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy, and financial performance. That's just some of what we do in our quest to build a beautiful company, enjoy the journey, and make a difference. GEP is a place where individuality is prized and talent respected. We're focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievements. We're a learning organization, actively looking for people to help shape, grow, and continually improve us. Are you one of us? GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristics protected by law. We are committed to hiring and valuing a globally diverse work team. For more information, please visit us on GEP.com or check us out on LinkedIn.com.

Responsibilities: The candidate will be responsible for creating infrastructure designs and guiding the development and implementation of infrastructure, applications, systems, and processes. This position will work directly with infrastructure, application development, and QA teams to build and deploy highly available and scalable systems in private or public cloud environments, along with release management.
• Candidate must have experience with the Azure or GCP cloud platform
• Building highly scalable, highly available, private or public infrastructure
• Owning, maintaining, and enhancing the infrastructure and related tools
• Help build out an entire CI ecosystem, including automated and auto-scaling testing systems
• Design and implement monitoring and alerting for production systems used by DevOps staff
• Work closely with developers and other staff to solve DevOps issues with customer-facing services, tools, and apps

Qualifications / Requirements:
• 2+ years of experience working in a DevOps role in a continuous integration environment, especially with Microsoft technologies
• Strong knowledge of configuration management software such as PowerShell and Ansible, and continuous integration tools such as Octopus, Azure DevOps, and Jenkins
• Developing complete solutions considering sizing, infrastructure, data protection, disaster recovery, security, and application requirements on cloud enterprise systems
• Experience adhering to an Agile development environment and iterative sprint cycles
• Familiarity with database deployment and CI/CD pipelines
• Hands-on experience with CI/CD tools like VSTS, Azure DevOps, or Jenkins (experience with at least one of these tools)
• Worked on Docker, containers, Kubernetes, AWS EKS, API Gateway, Application Load Balancer, WAF, CloudFront
• Experience with Git or GitHub and the gitflow model, administration, and user management; must have worked on the AWS platform for a minimum of 2 years
• Strong understanding of Linux; strong experience with various tools related to Continuous Integration and Continuous Deployment
• Automating builds using MSBuild scripts
• Any scripting language (Ruby, Python, YAML, Terraform) or other application development experience (.NET, Java, Golang, etc.)
• Ability to write in multiple languages, including Python, Java, Ruby, and Bash scripting
• Experience with setting up SLAs and monitoring infrastructure and applications using tools like Nagios, New Relic, Pingdom, or VictorOps/PagerDuty
• Experience with network configurations (switches, routers, firewalls) and a good understanding of routing and switching, firewalls, and VPN tunnels

Posted 4 months ago

Apply

9 - 14 years

30 - 40 Lacs

Navi Mumbai

Work from Office

Overview: GEP is a diverse, creative team of people passionate about procurement. We invest ourselves entirely in our client's success, creating strong collaborative relationships that deliver extraordinary value year after year. Our clients include global market leaders with far-flung international operations, Fortune 500 and Global 2000 enterprises, and leading government and public institutions. We deliver practical, effective services and software that enable procurement leaders to maximise their impact on business operations, strategy, and financial performance. That's just some of what we do in our quest to build a beautiful company, enjoy the journey, and make a difference. GEP is a place where individuality is prized and talent respected. We're focused on what is real and effective. GEP is where good ideas and great people are recognized, results matter, and ability and hard work drive achievements. We're a learning organization, actively looking for people to help shape, grow, and continually improve us. Are you one of us? GEP is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, ethnicity, color, national origin, religion, sex, disability status, or any other characteristics protected by law. We are committed to hiring and valuing a globally diverse work team. For more information, please visit us on GEP.com or check us out on LinkedIn.com.

Responsibilities: The candidate will be responsible for creating infrastructure designs and guiding the development and implementation of infrastructure, applications, systems, and processes. This position will work directly with infrastructure, application development, and QA teams to build and deploy highly available and scalable systems in private or public cloud environments, along with release management.
• Candidate must have experience with the Azure or GCP cloud platform
• Building highly scalable, highly available, private or public infrastructure
• Owning, maintaining, and enhancing the infrastructure and related tools
• Help build out an entire CI ecosystem, including automated and auto-scaling testing systems
• Design and implement monitoring and alerting for production systems used by DevOps staff
• Work closely with developers and other staff to solve DevOps issues with customer-facing services, tools, and apps

Qualifications:
• 9+ years of experience working in a DevOps role in a continuous integration environment, especially with Microsoft technologies
• Strong knowledge of configuration management software such as PowerShell and Ansible, and continuous integration tools such as Octopus, Azure DevOps, and Jenkins
• Developing complete solutions considering sizing, infrastructure, data protection, disaster recovery, security, and application requirements on cloud enterprise systems
• Experience adhering to an Agile development environment and iterative sprint cycles
• Familiarity with database deployment and CI/CD pipelines
• Hands-on experience with CI/CD tools like VSTS, Azure DevOps, or Jenkins (experience with at least one of these tools)
• Worked on Docker, containers, Kubernetes, AWS EKS, API Gateway, Application Load Balancer, WAF, CloudFront
• Experience with Git or GitHub and the gitflow model, administration, and user management; must have worked on the AWS platform for a minimum of 2 years
• Strong understanding of Linux; strong experience with various tools related to Continuous Integration and Continuous Deployment
• Automating builds using MSBuild scripts
• Any scripting language (Ruby, Python, YAML, Terraform) or other application development experience (.NET, Java, Golang, etc.)
• Ability to write in multiple languages, including Python, Java, Ruby, and Bash scripting
• Experience with setting up SLAs and monitoring infrastructure and applications using tools like Nagios, New Relic, Pingdom, or VictorOps/PagerDuty
• Experience with network configurations (switches, routers, firewalls) and a good understanding of routing and switching, firewalls, and VPN tunnels

Posted 4 months ago

Apply

4.0 - 8.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Bengaluru, India | Murex | Others | BCM Industry | 29/04/2025

Project description: We've been engaged by a large Australian financial institution to provide resources to manage production support activities along with their existing team in Sydney and India.

Responsibilities:
• Carry out enhancements to maintenance/housekeeping scripts as required and monitor DB growth periodically.
• Handle cloud environment preparation, refresh, rebuild, upkeep, maintenance, and upgrade activities. Ensure cloud cost optimisation.
• Troubleshoot Murex environment-specific issues, including infrastructure-related issues, and update pipelines for a permanent fix.
• Handle EOD execution and troubleshooting of related issues. Participate in analysis, solutioning, and deployment of solutions for production issues during EOD.
• Participate in release activity and coordinate with QA/Release teams.
• Participate in AWS stack deployment, AWS AMI patching, and stack configuration to ensure optimal performance and cost-efficiency.
• Address requests like warehouse rebuilds and maintenance; perform health/sanity checks; create XVA engine; perform environment restores and backups in AWS as per project need (a small health-check sketch follows this listing).
• Perform weekend maintenance and health checks in the production environment during the weekend.
• Support working in shifts (maximum end time of 12.30 AM IST) and be available for weekend and on-call support. Work out of the client location on a need basis. Flexible to work in a hybrid model.

Skills
Must have:
• 4 to 8 years of experience in Murex production support
• Murex End of Day support
• Troubleshooting batch-related issues, including date moves and processing adjustments
• Murex environment management and troubleshooting
• Experienced in SQL, Unix shell scripting, monitoring tools, and web development
• Experienced in the release and CI/CD process
• Linux/Unix server and Oracle RDS knowledge
• Working experience with automation/job scheduling tools such as Autosys and GitHub Actions
• Working experience with monitoring tools like Grafana, Splunk, Obstack, and PagerDuty
• Good communication and organization skills, working within a DevOps team supporting a wider IT delivery team

Nice to have:
• PL/SQL, scripting languages (Python)
• Advanced troubleshooting experience with shell scripting and Python
• Experience with CI/CD tools like Git, flows, Ansible, and AWS including CDK
• Exposure to the AWS Cloud environment
• Willingness to learn and obtain AWS certification

Other Languages: English (C1 Advanced)
Seniority: Regular
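For the health/sanity-check and AWS-stack duties above, here is a minimal, hedged boto3 sketch that reports the status of an RDS instance and exits non-zero on failure so a scheduler can flag it. The instance identifier and region are illustrative assumptions, not taken from the posting.

```python
# Minimal environment sanity-check sketch with boto3 (`pip install boto3`).
# The instance identifier and region are hypothetical placeholders.
import sys

import boto3


def rds_instance_status(instance_id: str, region: str = "ap-south-1") -> str:
    """Return the current status string of one RDS instance."""
    rds = boto3.client("rds", region_name=region)
    response = rds.describe_db_instances(DBInstanceIdentifier=instance_id)
    return response["DBInstances"][0]["DBInstanceStatus"]


if __name__ == "__main__":
    status = rds_instance_status("murex-prod-db")  # assumed identifier
    print(f"RDS status: {status}")
    # Non-zero exit lets a scheduler (e.g. Autosys, GitHub Actions) flag failure.
    sys.exit(0 if status == "available" else 1)
```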

Posted Date not available

Apply

6.0 - 10.0 years

3 - 6 Lacs

Pune, Bengaluru

Hybrid

Work effectively as a member of a self-organized agile team that builds, owns, and runs the service. Contribute to all aspects of service development, including back-end and quality. Assist in the operation of the service, e.g. monitoring, alerting, metrics, logging, and troubleshooting. Work closely with architects and product management to understand requirements and translate them into elegant implementations. Use current system behaviour to identify opportunities for continuous improvement of the scalability, reliability, usability, and security of the system. Excellent troubleshooting skills; able to debug complex technical issues involving multiple system components.

Minimum Qualifications:
• BS or MS in Computer Science or a related technical field
• 6-10 years of experience building web applications
• Experience with Golang / Web API / RESTful API design
• Experience building cloud-native web services with high performance and high availability at web scale
• Good understanding of software design and architectural patterns
• Committed to quality, including security and performance
• Experience with agile methodologies (Scrum or Kanban)
• Strong verbal and written communication skills
• Experience with relational data stores such as MSSQL / MySQL

Preferred Qualifications:
• Strong design and coding skills with the ability to pick up new languages, tools, and design patterns as needed
• Experience with the AWS stack is a plus

Posted Date not available

Apply

6.0 - 10.0 years

3 - 6 Lacs

Bengaluru

Work from Office

"Develop and maintain backend services using Golang, Web APIs, and RESTful API design. Build scalable, high-performance, cloud-native web services. Assist in operations, including monitoring, alerting, logging, and troubleshooting. Collaborate with architects and product managers to implement solutions. Ensure system security, reliability, and performance improvements. Work with relational databases like MSSQL/MySQL and follow Agile methodologies. Experience with AWS stack is a plus."

Posted Date not available

Apply

9.0 - 14.0 years

17 - 22 Lacs

Kochi

Work from Office

This is a role for a Senior Product Architect for IBM Concert, responsible for leading the product engineering team in designing scalable, secure, and resilient solutions, and for driving architectural strategy, technical excellence, and innovation across the Concert product organisation.

Architectural Leadership:
• Work collaboratively within a team responsible for shaping the architecture and technical trajectory of our software networking and edge computing portfolio.
• Write product requirements for new integrations, enhance existing integrations, and identify ecosystem products to interact with.
• Serve as an end-to-end SME: demonstrate skills and expertise across all functional and non-functional aspects. Gain insights into customer adoption of product capabilities and rise above the silos of client-side, server-side, or module-specific knowledge.
• Be the agent of change and innovation: challenge the status quo and provide thought leadership and operational drive.

Real-world Product Building:
• Leverage practical experience in building products to contribute valuable insights to the software engineering practices within the team.
• Contribute to core development/delivery: be thorough with the technology domain and associated programming languages, frameworks, patterns, and tools. Own coding/testing of some of the key/complex modules and components.

Domain Expertise:
• Possess a deep understanding of software networking and edge computing, utilizing this knowledge to develop informed opinions that play a pivotal role in shaping product direction and strategy. Conduct market research and analysis to identify market trends.
• Provide expert inputs/insights on product architecture, design decisions, and technology choices. Collaborate effectively with stakeholders such as Product/Offering Management, Sales and Marketing, and Support.

Collaboration and Adaptability:
• Collaborate seamlessly with diverse teams and contribute to other IBM product initiatives. A crucial aspect is the ability to extend expertise beyond your domain to promote a holistic understanding of interconnected domains.
• Enable tech sales and support teams by building specialized demos, assisting with RFIs, and fielding scanner-specific questions. Engage prospects, customers, and internal stakeholders for discovery sessions and roadmap reviews.

Flexible Mindset:
• Demonstrate flexibility in mindset to navigate and resolve points of intersection with other system components. Adaptability is key to finding mutually beneficial solutions.

Effective Communication:
• Utilize strong communication skills to articulate complex technical concepts, fostering collaboration and understanding across interdisciplinary teams. Provide regular updates to the leadership team on product development progress and market signals.
• Drive eminence and respect: be an active evangelist and thought leader inside and outside the organization.

Required education: Bachelor's Degree
Preferred education: Bachelor's Degree

Required technical and professional expertise:
• 15+ years of experience in building and architecting enterprise-grade software products, preferably in the automation, observability, or risk & resilience domain.
• Deep domain expertise in application risk and resilience.
• Expertise in Go and Python application development, with exposure to additional languages and development/test environments.
• Solid understanding of modern application architecture, including microservices, containers (Docker), orchestration (Kubernetes), and RESTful APIs.
• Proficiency in cloud-native development, with mastery of the IBM Cloud and AWS platforms.
• Strong command of software architecture patterns, design principles, and best practices.
• Development experience with PostgreSQL and cloud-native databases.
• Proven technical leadership experience in driving design and architectural decisions within cross-functional agile teams.
• Hands-on experience with DevOps/DevSecOps practices, CI/CD pipelines, and GitOps workflows.
• Excellent problem-solving and debugging skills to resolve complex technical issues effectively.

Preferred technical and professional experience:
• Brings a solid background in automation and cloud technologies.
• Demonstrates a growth mindset, is open to feedback, and continuously seeks to learn and evolve.
• Approaches challenges with ownership and resilience, focusing on solutions rather than assigning blame.
• Can effectively balance people, product, and process goals, with a collaborative and open-minded leadership style that is critical for success in this role.

Posted Date not available

Apply

4.0 - 8.0 years

9 - 13 Lacs

Gurugram

Work from Office

As an Associate Software Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• 8+ years of AWS experience; knowledge of ECS, SQS, Step Functions, Aurora DB, Lambda functions, EFS, Transfer Family, API Gateway, security concepts, and AWS Cognito. Knowledge of Japanese is required.
• Creative problem-solving skills and superb communication skills.
• Container-based solutions.
• Strong experience with Node.js and the AWS stack.
• Proficiency in AWS services like API Gateway, Lambda, SQS, SNS, S3, IAM, and Cognito.
• Experience with infrastructure as code using AWS CDK. Expertise in encryption and decryption techniques for securing APIs, and in API authentication and authorization.

Preferred technical and professional experience:
• Experience in distributed/scalable systems
• Knowledge of standard tools for optimizing and testing code
• Knowledge/experience of the development/build/deploy/test life cycle

Posted Date not available

Apply

2.0 - 5.0 years

12 - 16 Lacs

Pune, Gurugram, Bengaluru

Work from Office

What you'll do:
We are looking for experienced Knowledge Graph developers who have the following technical skillsets and experience.
• Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
• Apply appropriate development methodologies (e.g. agile, waterfall) and best practices (e.g. mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
• Collaborate with other team members to leverage expertise and ensure seamless transitions; exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
• Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
• Bring transparency in driving assigned tasks to completion and report accurate status.
• Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
• Assist senior team members and delivery leads in project management responsibilities.
• Build complex solutions using programming languages, ETL service platforms, etc.

What you'll bring:
• Bachelor's or master's degree in Computer Science, Engineering, or a related field.
• 4+ years of professional experience in Knowledge Graph development in Neo4j, AWS Neptune, or Anzo knowledge graph databases.
• 3+ years of experience in RDF ontologies, data modelling, and ontology development.
• Strong expertise in Python, PySpark, and SQL.
• Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
• Project management and task planning experience, ensuring smooth execution of deliverables and timelines.
• Strong communication and interpersonal skills to collaborate with both technical and non-technical teams.
• Experience with automation testing.
• Performance optimization: knowledge of techniques to optimize knowledge graph operations such as data inserts.
• Data modeling: proficiency in designing effective data models within a knowledge graph, including relationships between tables and optimizing data for reporting.
• Motivation and willingness to learn new tools and technologies as per the team's requirements.

Additional Skills:
• Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations.
• Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
• Experience in manufacturing data is a plus.
• Capability to simplify complex concepts into easily understandable frameworks and presentations.
• Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects.
• Travel to other offices as required to collaborate with clients and internal project teams.

Posted Date not available

Apply
Page 3 of 3

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies