Home
Jobs

272 S3 Jobs - Page 11

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3 - 8 years

8 - 12 Lacs

Greater Noida

Work from Office

Naukri logo

Sound experience in developing Python applications using FastAPI or Flask (FastAPI is preferable). Proficient in OOP, design patterns, and functional programming. Hands-on experience with MySQL or MongoDB, including managing complex queries. Good experience with the Git version-control tool. Should have worked with serverless architectures and RESTful systems. Experience in API development in Python. Hands-on experience with AWS services: Lambda, SQS, S3, ECS, etc. Experience using Python classes with inheritance, overloading, and polymorphism. Experience building serverless applications in AWS using API Gateway and Lambda. Experience on insurance projects is preferable. Note: we are not looking for candidates from the ML (Machine Learning) or Data Science domains; this opening is only for web/API development in Python and its frameworks.
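For illustration, a minimal sketch of the inheritance and polymorphism this posting asks about, in an insurance-flavoured setting (the `Policy`/`MotorPolicy` names are hypothetical, not from any specific codebase):

```python
from abc import ABC, abstractmethod


class Policy(ABC):
    """Hypothetical base class for an insurance-style domain object."""

    def __init__(self, holder: str, premium: float):
        self.holder = holder
        self.premium = premium

    @abstractmethod
    def annual_cost(self) -> float:
        """Each subclass overrides this (polymorphism)."""


class MotorPolicy(Policy):
    def annual_cost(self) -> float:
        return self.premium * 12


class LifePolicy(Policy):
    def __init__(self, holder: str, premium: float, rider: float = 0.0):
        super().__init__(holder, premium)  # inheritance: reuse base initializer
        self.rider = rider

    def annual_cost(self) -> float:  # override base behaviour
        return self.premium * 12 + self.rider


def total_cost(policies) -> float:
    # Polymorphic dispatch: each policy computes its own cost.
    return sum(p.annual_cost() for p in policies)


print(total_cost([MotorPolicy("a", 100.0), LifePolicy("b", 100.0, rider=50.0)]))  # 2450.0
```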

Posted 1 month ago

Apply

3 - 5 years

15 - 19 Lacs

Bengaluru

Work from Office


Immediate joiners.

Job Summary: We are seeking an experienced DevOps Engineer to join our team and help us build and maintain scalable, secure, and efficient infrastructure on Amazon Web Services (AWS). The ideal candidate will have a strong background in DevOps practices, AWS services, and scripting languages.

Key Responsibilities:
- Design and implement infrastructure: design and implement scalable, secure, and efficient infrastructure on AWS using services such as EC2, S3, RDS, and VPC.
- Automate deployment processes: automate deployments using tools such as AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy.
- Implement continuous integration and continuous deployment (CI/CD): implement CI/CD pipelines using tools such as Jenkins, GitLab CI/CD, and CircleCI.
- Monitor and troubleshoot infrastructure: monitor and troubleshoot infrastructure issues using tools such as Amazon CloudWatch, AWS X-Ray, and AWS CloudTrail.
- Collaborate with development teams to ensure smooth deployment of applications and infrastructure.
- Implement security best practices and ensure compliance with organizational security policies.
- Optimize infrastructure for cost and performance using tools such as AWS Cost Explorer and AWS Trusted Advisor.

Requirements:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: minimum 3 years of experience in DevOps engineering, with a focus on AWS services.
- Technical skills: AWS services such as EC2, S3, RDS, VPC, and Lambda; scripting languages such as Python, Ruby, or PowerShell; CI/CD tools such as Jenkins, GitLab CI/CD, and CircleCI; monitoring and troubleshooting tools such as Amazon CloudWatch, AWS X-Ray, and AWS CloudTrail.
- Soft skills: excellent communication and interpersonal skills; strong problem-solving and analytical skills; ability to work in a team environment and collaborate with development teams.

Nice to Have:
- Certifications: AWS certifications such as AWS Certified DevOps Engineer or AWS Certified Solutions Architect.
- Experience with containerization using Docker or Kubernetes.
- Experience with serverless computing using AWS Lambda or Azure Functions.

Posted 1 month ago

Apply

1 - 4 years

6 - 10 Lacs

Bengaluru

Work from Office


What You'll Own:
- Full-stack systems: architect and build end-to-end applications using Flask, FastAPI, Node.js, React (or Next.js), and Tailwind.
- AI integrations: build and optimize pipelines involving LLMs (OpenAI, Groq, LLaMA), Whisper, TTS, embeddings, RAG, LangChain, LangGraph, and vector DBs like Pinecone/Milvus.
- Cloud infrastructure: deploy, monitor, and scale systems on AWS/GCP using EC2, S3, IAM, Lambda, Kafka, and ClickHouse.
- Real-time systems: design asynchronous workflows (Kafka, Celery, WebSockets) for voice-based agents, event tracking, or search indexing.
- System orchestration: set up scalable infra with autoscaling groups, Docker, and Kubernetes (PoC-ready, if not full prod).
- Growth-ready features: implement in-app nudges, tracking with Amplitude, A/B testing, and funnel optimization.

Tech Stack You'll Work With:
- Backend and infrastructure: Python (Flask, FastAPI), Node.js; databases: PostgreSQL, Redis, ClickHouse; infra: Kafka, Docker, Kubernetes, GitHub Actions, Cloudflare; cloud: AWS (EC2, S3, RDS), GCP.
- Frontend: React / Next.js, TailwindCSS, Zustand, Shadcn/UI; WebGL and Three.js for 3D rendering.
- AI/ML and computer vision: LangChain, LangGraph, Hugging Face, OpenAI, Groq; Whisper (ASR), ElevenLabs (TTS); diffusion models, StyleGAN, Stable Diffusion, GANs, MediaPipe, ARKit/ARCore; computer vision: face tracking, real-time try-on, pose estimation; virtual try-on: face/body detection, cloth/hairstyle try-ons.
- APIs: Stripe, VAPI, Algolia, OpenAI, Amplitude.
- Vector DB and search: Pinecone, Milvus (Zilliz), custom vector-search pipelines.
- Other: vibe-coding culture, prompt engineering, system-level optimization.

Must-Haves:
- 1+ years of experience building production-grade full-stack systems.
- Fluency in Python and JS/TS (Node.js, React), shipping independently without handholding.
- Deep understanding of LLM pipelines, embeddings, vector search, and retrieval-augmented generation (RAG).
- Experience with AR frameworks (ARKit, ARCore), 3D rendering (Three.js), and real-time computer vision (MediaPipe).
- Strong grasp of modern AI model architectures: diffusion models, GANs, AI agents.
- Hands-on with system debugging, performance profiling, and infra cost optimization.
- Comfort with ambiguity: fast iteration, shipping prototypes, breaking things to learn faster.
- Bonus if you've built agentic apps, AI workflows, or virtual try-ons.
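As a toy illustration of the vector-search piece of such a pipeline, a minimal cosine-similarity retrieval over hand-written 3-dimensional "embeddings" (a real system would use model-generated vectors and a vector DB such as Pinecone or Milvus for nearest-neighbour search; the document ids are invented):

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)


def top_k(query, index, k=2):
    """Return the k document ids whose embeddings score highest against the query."""
    scored = sorted(index.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]


# Toy corpus; in a RAG pipeline these vectors come from an embedding model.
index = {
    "returns-policy": [0.9, 0.1, 0.0],
    "shipping-faq":   [0.1, 0.9, 0.1],
    "sizing-guide":   [0.0, 0.2, 0.9],
}
print(top_k([0.8, 0.2, 0.0], index, k=1))  # ['returns-policy']
```

The retrieved documents would then be stuffed into the LLM prompt, which is the "retrieval-augmented" part of RAG.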

Posted 1 month ago

Apply

4 - 7 years

0 - 3 Lacs

Hyderabad, Pune, Chennai

Hybrid


Job Posting Title: Data Engineer (Snowflake & AWS)
Location: Chennai or Hyderabad
Experience: 4 to 6 years

Role Summary: This role focuses on building and optimizing secure data pipelines, integrating AWS services and Snowflake to support de-identified data consumption by analytical tools and users.

Key Responsibilities:
- Integrate de-identified data from Amazon S3 into Snowflake for downstream analytics.
- Build robust ETL pipelines using Glue for data cleansing, transformation, and schema alignment.
- Automate ingestion of structured/unstructured data from various AWS services into Snowflake.
- Apply masking, redaction, or pseudonymization techniques to sensitive datasets pre-ingestion.
- Implement lifecycle and access policies for data stored in Snowflake and AWS S3.
- Collaborate with analytics teams to optimize warehouse performance and data modeling.

Required Skills:
- 4-6 years of experience in data engineering roles.
- Strong hands-on experience with Snowflake (warehouse sizing, query optimization, data sharing).
- Familiarity with AWS Glue, S3, and IAM.
- Understanding of PHI/PII protection techniques and HIPAA controls.
- Experience transforming datasets for BI/reporting tools.
- Skilled in SQL, Python, and Snowflake stored procedures.
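A minimal sketch of the pre-ingestion pseudonymization and masking step described above, assuming a keyed-hash (HMAC) approach; the field names and key handling are illustrative, not a compliance-reviewed design:

```python
import hashlib
import hmac

SECRET = b"rotate-me"  # illustrative; in practice, fetch from a secrets manager


def pseudonymize(value: str) -> str:
    """Replace an identifier with a stable keyed hash (HMAC-SHA256).

    Stable: the same input always maps to the same token, so joins across
    tables still work after de-identification; without the key, the original
    value cannot be recovered from the token.
    """
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]


def mask_email(email: str) -> str:
    """Keep only the first character of the local part."""
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain


row = {"patient_id": "P-1001", "email": "jane.doe@example.com", "visits": 3}
clean = {
    "patient_id": pseudonymize(row["patient_id"]),
    "email": mask_email(row["email"]),
    "visits": row["visits"],  # non-identifying fields pass through unchanged
}
print(clean["email"])  # j***@example.com
```

In the pipeline above, a step like this would run in Glue before the data lands in S3/Snowflake.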

Posted 1 month ago

Apply

12 - 15 years

35 - 45 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid


Strong frontend development experience with ReactJS and JavaScript or TypeScript. Proficiency in HTML5, CSS3, and responsive-design best practices. Hands-on experience with AWS cloud services, specifically designing systems with SNS, SQS, EC2, Lambda, and S3. Required candidate profile: expert-level experience in backend development using .NET Core, C#, and EF Core. Strong expertise in PostgreSQL and efficient database design. Proficient in building and maintaining RESTful APIs at scale.

Posted 1 month ago

Apply

6 - 11 years

18 - 30 Lacs

Gurugram

Work from Office


Application-layer technologies including Tomcat/Node.js, Netty, Spring Boot, Hibernate, Elasticsearch, Kafka, and Apache Flink. Frontend technologies including ReactJS, Angular, and Android/iOS. Data storage technologies like Oracle, S3, Postgres, and MongoDB.

Posted 1 month ago

Apply

8 - 13 years

12 - 22 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office


Greetings of the day! We have an urgent on-rolls opening for the position of "Snowflake Architect" at one of our reputed clients, working from home.
Name of the company - Confidential
Rolls - On-rolls
Mode of employment - FTE / Sub-Con / Contract
Job location - Remote
Job work timings - Night shift, 06.00 pm to 03.00 am IST
Nature of work - Work from home
Working days - 5 days weekly
Educational qualification - Bachelor's degree in computer science, BCA, engineering, or a related field.
Salary - Maximum CTC would be 23 LPA (salary and benefits package will be commensurate with experience and qualifications; PF and medical insurance cover available).
Languages known - English, Hindi, and the local language.
Experience - 9+ years of relevant experience in the same domain.

Job Summary: We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and implementation of scalable, secure, and high-performance data warehousing solutions on the Snowflake platform. The ideal candidate will possess deep expertise in data modelling, cloud architecture, and modern ELT frameworks. You will be responsible for architecting robust data pipelines, optimizing query performance, and ensuring enterprise-grade data governance and security. In this role, you will collaborate with data engineers, analysts, and business stakeholders to deliver efficient data solutions that drive informed decision-making across the organization.

Key Responsibilities:
- Manage and maintain the Snowflake platform to ensure optimal performance and reliability.
- Collaborate with data engineers and analysts to design and implement data pipelines.
- Develop and optimize SQL queries for efficient data retrieval and manipulation.
- Create custom scripts and functions using JavaScript and Python to automate platform tasks.
- Troubleshoot platform issues and provide timely resolutions.
- Implement security best practices to protect data within the Snowflake platform.
- Stay updated on the latest Snowflake features and best practices to continuously improve platform performance.

Required Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- Minimum of nine years of experience managing any database platform.
- Proficiency in SQL for data querying and manipulation.
- Strong programming skills in JavaScript and Python.
- Experience in optimizing and tuning Snowflake for performance.

Preferred Skills: technical expertise, cloud and integration, performance and optimization, security and governance, soft skills.

The candidate should be willing to join within 07-10 days, or be an immediate joiner. Interested candidates, please share your updated resume with us at executivehr@monalisammllp.com, or call/WhatsApp us at 9029895581, along with:
- Current/last net in hand (salary will be offered based on the interview/technical evaluation process)
- Notice period and LWD (was/will be)
- Reason for changing the job
- Total years of experience in the specific field
- The location you are from
- Whether you hold an offer from any other organisation

Regards,
Monalisa Group of Services, HR Department
9029895581 - Call / WhatsApp
executivehr@monalisammllp.com

Posted 1 month ago

Apply

10 - 15 years

17 - 22 Lacs

Mumbai, Hyderabad, Bengaluru

Work from Office


Job roles and responsibilities: The AWS DevOps Engineer is responsible for automating, optimizing, and managing CI/CD pipelines, cloud infrastructure, and deployment processes on AWS. This role ensures smooth software delivery while maintaining high availability, security, and scalability.

- Design and implement scalable and secure cloud infrastructure on AWS, utilizing services such as EC2, EKS, ECS, S3, RDS, and VPC.
- Automate the provisioning and management of AWS resources using Infrastructure-as-Code tools (Terraform / CloudFormation / Ansible) and YAML.
- Implement and maintain continuous integration and continuous deployment (CI/CD) pipelines using tools like Jenkins, GitLab, or AWS CodePipeline.
- Advocate for a No-Ops model, striving for console-less experiences and self-healing systems.
- Experience with containerization technologies: Docker and Kubernetes.

Mandatory Skills:
- Overall experience of 5-8 years with an AWS DevOps specialization (AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy, AWS CodeCommit).
- Work experience with AWS DevOps and IAM.
- Expertise with coding tools: Terraform, Ansible, or CloudFormation, plus YAML.
- Strong deployment experience: CI/CD pipelining.
- Manage containerized workloads using Docker, Kubernetes (EKS), or AWS ECS, plus Helm charts.
- Experience with database migration.
- Proficiency in scripting languages (Python, and Bash or PowerShell).
- Develop and maintain CI/CD pipelines using AWS CodePipeline, Jenkins, GitHub Actions, or GitLab CI/CD.
- Experience with monitoring and logging tools (CloudWatch, ELK Stack, Prometheus, or Grafana).

Career Level - IC4
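As one small illustration of the Infrastructure-as-Code theme above, a sketch that generates a minimal CloudFormation-style template from Python (the bucket and logical-resource names are illustrative; real stacks would typically be authored directly in Terraform or CloudFormation):

```python
import json


def s3_bucket_template(bucket_name: str, versioned: bool = True) -> str:
    """Render a minimal CloudFormation-style template for one S3 bucket.

    Generating templates from code like this is one common way to keep
    many similar environments consistent, though it is only a sketch.
    """
    resource = {
        "Type": "AWS::S3::Bucket",
        "Properties": {"BucketName": bucket_name},
    }
    if versioned:
        resource["Properties"]["VersioningConfiguration"] = {"Status": "Enabled"}
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {"AppBucket": resource},
    }
    return json.dumps(template, indent=2)


print(s3_bucket_template("my-app-logs"))
```

The emitted JSON could then be deployed through a CI/CD stage (e.g. a CodePipeline deploy action), tying the IaC and pipeline halves of the role together.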

Posted 1 month ago

Apply

3 - 5 years

10 - 14 Lacs

Bengaluru

Work from Office


An experienced consulting professional who has an understanding of solutions, industry best practices, and multiple business processes or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices.

Over 3 to 5+ years of relevant IT experience, with 5+ years in Oracle VBCS, OIC, PL/SQL, and PCS based implementations as a technical lead and senior developer. This is an individual-contributor role; being hands-on is a critical requirement.

Must have:
- Solution-design experience for customer engagements in the UI and integration (OIC) space.
- At least 5 projects' experience developing SaaS extensions using VBCS, OIC, and ORDS.
- Understanding of the inherent tools and technologies of SaaS applications (FBDI, BIP, ADFDI, Applications Composer, Page Integration, etc.).
- Expertise in Oracle Visual Builder Studio; good experience with build and release, systems integration, Agile, and estimation/planning.
- Experience configuring SSO for PaaS extensions with Fusion SaaS.
- Ability to drive detailed design from customer requirements.
- Good understanding and usage of OCI architecture, serverless functions, API Gateway, and object storage.
- Conduct design reviews to provide guidance and quality assurance around standard methodologies and frameworks.
- Experience in PCS is an added advantage; good to have SOA/OSB/ODI/BPM skills.
- Experience building at least one project from scratch, and rolling out three big projects (multiple phased releases or country rollouts) to production.

#LI-DNI Career Level - IC2

Responsibilities: Standard assignments are accomplished without assistance by exercising independent judgment, within defined policies and processes, to deliver functional and technical solutions on moderately complex customer engagements.

Posted 1 month ago

Apply

10 - 15 years

13 - 18 Lacs

Hyderabad

Work from Office


As a member of the Support organization, your focus is to deliver SaaS support and solutions to Oracle customers in Customer Success Services while serving as an advocate for customer needs. This involves resolving SaaS applications' technical and non-technical customer incidents via the incident-management system and electronic means, as well as technical questions regarding the use of, and troubleshooting for, our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. You would be expected to be a hands-on lead with Oracle Fusion Supply Chain Management functional/technical skills. Career Level - IC4

Responsibilities: As a Supply Chain Management Lead, you will offer strategic functional/technical support to assure the highest level of customer satisfaction. A primary focus is to create and utilize automated technology and instrumentation to diagnose, document, and resolve or avoid customer issues. You are expected to be an expert member of the technical problem-solving/problem-avoidance team, routinely sought after to address extremely complex, critical customer issues. Services may frequently be provided by on-site customer visits.

Posted 1 month ago

Apply

6 - 10 years

6 - 11 Lacs

Hyderabad

Work from Office


MS or BS in Computer Science or equivalent; 6-10+ years of relevant experience.

Strong Software Engineering Fundamentals and API Development:
- Proficiency in data structures and algorithms: critical for designing efficient systems that handle large-scale data.
- System design: ability to design scalable, fault-tolerant systems and services.
- Coding skills: expertise in languages like Java, Python, or Scala, especially for backend systems development.
- RESTful services: proven experience in designing and building robust APIs for data access and integration.
- GraphQL: knowledge of modern data-access technologies for flexible querying.
- Microservices architecture: experience creating microservices that handle different aspects of data management.

Data Architecture and Design Patterns:
- Database design: strong knowledge of both relational (SQL) and non-relational (NoSQL) databases like Oracle, MongoDB, Cassandra, or DynamoDB.
- Data modeling: ability to design and manage data models for efficient storage and retrieval.
- Data warehousing: experience with data warehouses and ETL processes.
- Data access controls: experience with role-based access control (RBAC) and encryption techniques to secure sensitive data.
- Metadata management: familiarity with tools and processes for tracking data lineage and metadata catalogs (e.g., Apache Atlas, DataHub).
- Ability to handle complex data-related challenges, from dealing with incomplete or inconsistent data to optimizing performance, with strong analytical thinking to derive insights from data and build solutions that improve platform performance.

Data Pipeline and Orchestration:
- ETL/ELT tools: experience with data-pipeline orchestration tools like Airflow, Prefect, or Dagster.
- Automation and CI/CD: familiarity with setting up CI/CD pipelines for data infrastructure.
- Distributed systems: experience with technologies like Hadoop, Spark, Kafka, and Flink for managing large-scale data processing pipelines.
- Data streaming: familiarity with real-time data processing frameworks (e.g., Apache Kafka, Pulsar, or RabbitMQ).
- Cloud platforms: hands-on experience with cloud-based data services (AWS, GCP, Azure), including data storage (S3, GCS) and data analytics (EMR, Dataproc).

Leadership and Collaboration:
- Team leadership: ability to mentor junior engineers and guide them in building robust, scalable systems.
- Cross-functional collaboration: experience working closely with data scientists, analysts, and other engineers to deliver on the platform's objectives.
- Stakeholder management: strong communication skills for presenting technical decisions to non-technical stakeholders.

Career Level - IC4
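A minimal sketch of the role-based access control (RBAC) idea mentioned above, assuming simple column-level filtering; the role names and columns are illustrative, not from any real schema:

```python
# Which columns each role may read; de-identified fields only for analysts.
ROLE_COLUMNS = {
    "analyst": {"region", "visits", "spend"},
    "engineer": {"region", "visits", "spend", "user_id"},
}


def filter_row(role: str, row: dict) -> dict:
    """Return only the columns the given role is allowed to read.

    Unknown roles get an empty set of columns (deny by default).
    """
    allowed = ROLE_COLUMNS.get(role, set())
    return {k: v for k, v in row.items() if k in allowed}


row = {"user_id": "u-42", "region": "IN", "visits": 7, "spend": 129.5}
print(filter_row("analyst", row))  # user_id is stripped for analysts
```

Production systems typically push this enforcement into the database or warehouse (e.g. column-level grants or masking policies) rather than application code, but the deny-by-default shape is the same.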

Posted 1 month ago

Apply

8 - 13 years

10 - 20 Lacs

Bengaluru

Work from Office


Hi, greetings from Sun Technology Integrators! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com ASAP, with the following details: C.CTC, E.CTC, notice period, current location, whether you are serving notice/immediate, experience in Snowflake, and experience in Matillion. Shift timings are 2:00 PM-11:00 PM (free cab drop facility, plus food). Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving-notice/immediate candidates can apply.

Interview process: 1 round (virtual) + final round (F2F).
Please note: WFO - work from office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehouse, Python, Matillion, AWS S3, EC2.
Preferred skills: SSIR, SSIS, Informatica, shell scripting.

Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 1 month ago

Apply

3 - 8 years

10 - 20 Lacs

Bengaluru

Work from Office


Hi, greetings from Sun Technology Integrators! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com ASAP, with the following details: C.CTC, E.CTC, notice period, current location, whether you are serving notice/immediate, experience in Snowflake, and experience in Matillion. Shift timings are 2:00 PM-11:00 PM (free cab drop facility, plus food). Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving-notice/immediate candidates can apply.

Interview process: 2 rounds (virtual) + final round (F2F).
Please note: WFO - work from office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehouse, Python, Matillion, AWS S3, EC2.
Preferred skills: SSIR, SSIS, Informatica, shell scripting.

Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 1 month ago

Apply

3 - 6 years

5 - 8 Lacs

Pune

Work from Office


Software Engineering at HMH is focused on building fantastic software to meet the challenges facing teachers and learners, enabling and supporting a wide range of next-generation learning experiences. We design and build custom applications and services used by millions. We are creating teams full of innovative, eager software professionals to build the products that will transform our industry. We are staffing small, self-contained development teams with people who love solving problems and building high-quality products and services. We use a wide range of technologies and are building a next-generation microservices platform that can make our learning tools and content available to all our customers. If you want to make a difference in the lives of students and teachers and understand what it takes to deliver high-quality software, we would love to talk to you about this opportunity.

Technology Stack: You'll work with technologies such as Java, Spring Boot, Kafka, Aurora, Mesos, and Jenkins. This will be a hands-on coding role working as part of a cross-functional team alongside other developers, designers, and quality engineers, within an agile development environment. We're working on the development of our next-generation learning platform and solutions utilizing the latest in server and web technologies.

Responsibilities:
- Build high-quality, clean, scalable, and reusable code by enforcing best practices around software engineering architecture and processes (code reviews, unit testing, etc.) on the team.
- Work with the product owners to understand detailed requirements, and own your code from design, implementation, and test automation through delivery of a high-quality product to our users.
- Identify ways to improve data reliability, efficiency, and quality.
- Perform development tasks from design specifications; construct and verify (unit test) software components to meet design specifications.
- Perform quality-assurance functions by collaborating with cross-team members to identify and resolve software defects.
- Participate in production support and the on-call rotation for the services owned by the team.
- Adhere to standards, such as security patterns, logging patterns, etc.
- Collaborate with cross-functional team members and vendors in different geographical locations to ensure successful delivery of product features.
- Take ownership of the things you build; help shape the product and technical vision, direction, and how we iterate.
- Work closely with your teammates for improved stability, reliability, and quality.
- Run numerous experiments in a fast-paced, analytical culture so you can quickly learn and adapt your work.
- Build and maintain CI/CD pipelines for services owned by the team, following secure development practices.
- Perform other duties as assigned to ensure the success of the team and the entire organization.

Skills & Experience:
- 3 to 6 years' experience in a relevant software development role.
- Excellent object-oriented design and programming skills, including the application of design patterns and avoidance of anti-patterns.
- Strong cloud-platform skills: AWS Lambda, Terraform, SNS, SQS, RDS, Kinesis, DynamoDB, etc.
- Experience building large-scale, enterprise applications with ReactJS/AngularJS.
- Proficiency with front-end technologies such as HTML, CSS, and JavaScript preferred.
- Experience working in a collaborative team of application developers with source-code repositories.
- Deep knowledge of more than one programming language, such as Node.js and Java.
- Demonstrable knowledge of AWS and data-platform experience: Lambda, DynamoDB, RDS, S3, Kinesis, Snowflake.
- Demonstrated ability to follow through with all tasks, promises, and commitments.
- Ability to communicate and work effectively within priorities, and to work under tight timelines in a fast-paced environment.
- Understanding of software development methodologies and principles; working experience of modern agile methodologies (e.g., Kanban, Scrum, Test-Driven Development).
- Ability to solve large-scale, complex problems.

Posted 1 month ago

Apply

2 - 4 years

4 - 6 Lacs

Pune

Work from Office


In this role, you will maintain and enhance frontend services, working with AngularJS to support existing user-facing features while contributing to modernization efforts using React. Collaboration is key, as you'll work closely with designers, backend developers, and product managers to integrate systems involving XHTML/JSF-based UIs and Java backend frameworks like Spring and JBoss Seam. You'll play a pivotal role in ensuring the technical feasibility of UI/UX designs while optimizing applications for performance, scalability, and accessibility. A strong emphasis on code quality is essential, requiring you to write clean, maintainable, and well-documented code that adheres to industry best practices. Additionally, you'll stay informed about emerging technologies and industry trends, continuously learning and applying this knowledge to improve the system and processes.

Required Skills & Experience:
- 3 to 5 years' experience in a relevant software development role.
- Frontend frameworks: proficiency in AngularJS; familiarity with React is a plus.
- Web development: expertise in HTML5, CSS3, JavaScript, and TypeScript.
- Backend integration: experience with Node.js, Express.js, and API-driven development.
- Build tools and testing: familiarity with task-automation tools like Grunt and unit-testing frameworks like Karma.
- Containerization: understanding of Docker for application deployment.
- AWS services: working knowledge of AWS services such as S3, Lambda, and CloudFront.
- Soft skills: strong communication, problem-solving skills, and a collaborative mindset.

Preferred Skills:
- Agile practices: familiarity with agile development methodologies.
- Testing frameworks: proficiency with Jest or Mocha.
- Advanced AWS services: familiarity with CloudWatch, DynamoDB, API Gateway, AppSync, Route 53, CloudTrail, WAF, and X-Ray.

Posted 1 month ago

Apply

11 - 20 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 11 - 20 Yrs
Location: Pan India
Job Description: Minimum 2 years' hands-on experience as a Solution Architect (AWS Databricks)

If interested, please forward your updated resume to sankarspstaffings@gmail.com

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 1 month ago

Apply

5 - 10 years

0 - 0 Lacs

Hyderabad

Work from Office


Job Description: DevOps Engineer

Qualifications:
- Bachelor's or Master's degree in Computer Science or Computer Engineering.
- 4 to 8 years of experience in DevOps.

Key Skills and Responsibilities:
- Passionate about continuous build, integration, testing, and delivery of systems.
- Strong understanding of distributed systems, APIs, microservices, and cloud computing.
- Experience in implementing applications on private and public cloud infrastructure.
- Proficient in container technologies such as Kubernetes, including experience with public clouds like AWS, GCP, and other platforms through migrations, scaling, and day-to-day operations.
- Hands-on experience with AWS services (VPC, EC2, EKS, S3, IAM, etc.) and Elastic Beanstalk.
- Knowledge of source control management (Git, GitHub, GitLab).
- Hands-on experience with Kafka for data streaming and handling microservices communication.
- Experience in managing Jenkins for CI/CD pipelines.
- Familiar with logging tools and monitoring solutions.
- Experience working with network load balancers (Nginx, NetScaler).
- Proficient with Kong API gateways, Kubernetes, PostgreSQL, NoSQL databases, and Kafka.
- Experience with AWS S3 buckets, including policy management, storage, and backup using S3 and Glacier.
- Ability to respond to production incidents and take on-call responsibilities.
- Experience with multiple cloud providers and designing applications accordingly.
- Skilled in owning and operating mission-critical, large-scale product operations (provisioning, deployment, upgrades, patching, and incidents) on the cloud.
- Strong commitment to ensuring high availability and scalability of production systems.
- Continuously raising the standard of engineering excellence by implementing best DevOps practices.
- Quick learner with a balance between listening and taking charge.

Responsibilities:
- Develop and implement tools to automate and streamline operations.
- Develop and maintain CI/CD pipeline systems for application development teams using Jenkins.
- Prioritize production-related issues alongside operational team members.
- Conduct root cause analysis, resolve issues, and implement long-term fixes.
- Expand the capacity and improve the performance of current operational systems.

Regards,
Mohammed Umar Farooq
HR Recruitment Team, Revest Solutions
9949051730
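Since the posting above asks for S3 policy management and backup to Glacier, here is a minimal sketch of what that typically looks like in practice. The rule ID, prefix, and retention periods are illustrative assumptions; the dict matches the shape boto3's `put_bucket_lifecycle_configuration` expects, but no AWS call is made here.

```python
import json

# Hedged sketch (rule name, prefix, and day counts are hypothetical):
# transition objects under backups/ to Glacier after 30 days, expire at 365.
LIFECYCLE_CONFIGURATION = {
    "Rules": [
        {
            "ID": "backups-to-glacier",       # hypothetical rule name
            "Filter": {"Prefix": "backups/"},  # only objects under backups/
            "Status": "Enabled",
            "Transitions": [
                # After 30 days, move objects to the Glacier storage class.
                {"Days": 30, "StorageClass": "GLACIER"},
            ],
            # After a year, delete the archived objects entirely.
            "Expiration": {"Days": 365},
        }
    ]
}


def transition_days(config, storage_class):
    """Return the day offsets at which any rule transitions to storage_class."""
    days = []
    for rule in config.get("Rules", []):
        for transition in rule.get("Transitions", []):
            if transition.get("StorageClass") == storage_class:
                days.append(transition["Days"])
    return days


if __name__ == "__main__":
    print(json.dumps(LIFECYCLE_CONFIGURATION, indent=2))
    print(transition_days(LIFECYCLE_CONFIGURATION, "GLACIER"))
```

In a real deployment this dict would be passed as the `LifecycleConfiguration` argument to the S3 client; keeping it as plain data makes it easy to unit-test before it ever touches an account.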

Posted 1 month ago

Apply

3 - 5 years

7 - 11 Lacs

Gurugram

Remote


GroundTruth is looking for a DevOps Engineer who can join us within 30 days.

You will:
Increase the velocity of engineering teams by creating/deploying new stacks, services, and automations
Work on projects to improve tooling, efficiency, and standardize/automate approaches (DRY) for commonly used stacks/services
Manage user access to services/systems via tools such as AWS IAM, Terraform, and SaltStack
Participate in an on-call rotation to handle critical and/or service-impacting issues
Seek pragmatic opportunities to improve our infrastructure, processes, and operational activities
Plan, provision, operate, and monitor cloud infrastructure for multiple areas of the business that you support
Design and assist with development and integration of monitoring dashboards, alerting solutions, and DevOps tools
Collaborate with Software Engineering to plan feature releases and to monitor and support applications, including cost analysis and controls
Respond to system, application, security, and customer incidents, conducting cause and impact analysis

You have:
This is our ideal wish list, but most people don't check every box on every job description. So, if you meet most of the criteria below, are excited about the opportunity, and willing to learn, we'd love to hear from you.
Experience working in a DevOps role supporting engineering teams
4-year degree in Computer Science or a related field and 3+ years of experience in software engineering, OR 6+ years of experience in software development with no degree
Experience working with multiple AWS technologies, including IAM, EC2, ECS, S3, RDS, EMR, Glue, or similar
Experience working for a geographically distributed company
Knowledge of CI/CD tools and integration, along with container and other microservice-related technologies
Proficiency with GitHub, GitHub Actions, the AWS CLI, and troubleshooting web services and distributed systems
Experience in one or more of the following: Python, Bash/Shell, Go, Terraform (or other IaC tools)
Experience with automation tools (SaltStack, Chef, Ansible)
Experience with IaC tools (e.g. Terraform)
Experience working with cloud (AWS, Azure, GCP), preferably with multi-region tenancy
Experience with Linux administration
Experience with shell scripting/cron

Nice to have:
Python 3 coding experience (or similar)
Automation of cloud deployments/infra mgmt.
Experience with containerization (Docker, Kubernetes, etc.)
Experience with networking setup (on-prem or virtual)
Experience with monitoring/alerting tools (e.g. CloudWatch alarms, Graphite, Prometheus, etc.)

What we offer:
At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love.
Parental leave (maternity and paternity)
Flexible time off (earned leave, sick leave, birthday leave, bereavement leave, and company holidays)
In-office daily catered lunch
Fully stocked snacks/beverages
Health cover for any hospitalization; covers both nuclear family and parents
Tele-med for free doctor consultation, discounts on health checkups and medicines
Wellness/gym reimbursement
Pet expense reimbursement
Childcare expenses and reimbursements
Employee referral program
Education reimbursement program
Skill development program
Cell phone reimbursement (mobile subsidy program)
Internet reimbursement/postpaid cell phone bill/or both
Birthday treat reimbursement
Employee Provident Fund Scheme offering different tax-saving options, such as Voluntary Provident Fund and employee and employer contribution up to 12% of basic
Creche reimbursement
Co-working space reimbursement
National Pension System employer match
Meal card for tax benefit
Special benefits on salary account

Interested candidates, please share an updated resume at laxmi.pal@groundtruth.com, or if you are an immediate joiner with relevant experience, connect on 9220900537.
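The role above involves managing user access via AWS IAM and Terraform. Both ultimately submit an IAM policy document in a fixed JSON shape; the sketch below builds a read-only, single-bucket policy as plain data (the bucket name and action list are illustrative assumptions, not a prescribed policy).

```python
import json


def least_privilege_s3_policy(bucket):
    """Sketch (bucket name illustrative) of an IAM policy document granting
    read-only access to a single S3 bucket, in the JSON shape IAM expects.
    Note the two ARN forms: the bucket itself for ListBucket, and
    bucket/* for object-level GetObject."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)


if __name__ == "__main__":
    print(least_privilege_s3_policy("demo-bucket"))
```

Generating policy JSON from code like this keeps it reviewable and testable before it is attached to a role, whichever provisioning tool applies it.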

Posted 1 month ago

Apply

5 - 10 years

5 - 15 Lacs

Hyderabad

Hybrid


We are seeking an experienced Senior DevOps Engineer with deep expertise in building automation and CI/CD pipelines within a serverless AWS environment. The ideal candidate will have hands-on experience managing AWS Lambda at scale, designing infrastructure with AWS CDK, and implementing pipelines using GitHub Actions. This role will play a key part in scaling, securing, and optimizing our cloud-native architecture.

Key Responsibilities:
Design, implement, and maintain robust CI/CD pipelines using GitHub Actions and AWS CDK.
Build and manage serverless applications with a focus on scalability, performance, and reliability.
Configure and maintain key AWS services, including IAM, API Gateway, Lambda (600+ functions), SNS, SQS, EventBridge, CloudFront, S3, RDS, RDS Proxy, Secrets Manager, KMS, and CloudWatch.
Develop infrastructure as code (IaC) using AWS CDK and CloudFormation templates.
Code primarily in TypeScript, with additional scripting in Python as needed.
Implement and optimize DynamoDB and other AWS-native databases.
Enforce best practices for cloud security, monitoring, and cost optimization.
Collaborate with development, QA, and architecture teams to enhance deployment workflows and reduce release cycles.

Required Skills & Experience:
Strong expertise in AWS serverless technologies, including large-scale Lambda function management.
Extensive experience with AWS CDK and GitHub Actions for pipeline automation.
Hands-on with AWS services: IAM, API Gateway, Lambda, SQS, SNS, S3, CloudWatch, RDS, EventBridge, and others.
Proficient in TypeScript; familiarity with Python is a plus.
Solid understanding of CI/CD practices, infrastructure automation, and Git-based workflows.
Experience building scalable and secure serverless systems in production.
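A role like this one, centered on Lambda consuming from SQS, usually involves handlers that report partial batch failures so SQS redelivers only the messages that failed. The sketch below is a locally runnable illustration; the event shape follows Lambda's SQS integration, while the function name and the `orderId` validation are assumptions for the example.

```python
import json


def handler(event, context):
    """Hedged sketch of an SQS-triggered Lambda handler: process each record
    and return the IDs of failed messages in the shape expected when the
    event source mapping has ReportBatchItemFailures enabled."""
    failures = []
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            # Hypothetical business logic: require an "orderId" field.
            if "orderId" not in body:
                raise ValueError("missing orderId")
        except ValueError:
            # json.JSONDecodeError subclasses ValueError, so malformed JSON
            # is also caught here; mark the message for redelivery.
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}


if __name__ == "__main__":
    sample_event = {
        "Records": [
            {"messageId": "m1", "body": json.dumps({"orderId": 42})},
            {"messageId": "m2", "body": "not json"},
        ]
    }
    print(handler(sample_event, None))
```

Because the handler is a pure function of its event, it can be unit-tested without any AWS resources, which matters when managing hundreds of functions.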

Posted 1 month ago

Apply

4 - 9 years

12 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Role & Responsibilities
Urgent hiring for one of the reputed MNCs. Immediate joiners only. Female candidates only. Experience: 4-9 years. Locations: Bangalore / Hyderabad / Pune.

As a Python Developer with AWS, you will be responsible for developing cloud-based applications, building data pipelines, and integrating with various AWS services. You will work closely with DevOps, Data Engineering, and Product teams to design and deploy solutions that are scalable, resilient, and efficient in an AWS cloud environment.

Key Responsibilities:
Python Development: Design, develop, and maintain applications and services using Python in a cloud environment.
AWS Cloud Services: Leverage AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway to build scalable solutions.
Data Pipelines: Develop and maintain data pipelines, including integrating data from various sources into AWS-based storage solutions.
API Integration: Design and integrate RESTful APIs for application communication and data exchange.
Cloud Optimization: Monitor and optimize cloud resources for cost efficiency, performance, and security.
Automation: Automate workflows and deployment processes using AWS Lambda, CloudFormation, and other automation tools.
Security & Compliance: Implement security best practices (e.g., IAM roles, encryption) to protect data and maintain compliance within the cloud environment.
Collaboration: Work with DevOps, Cloud Engineers, and other developers to ensure seamless deployment and integration of applications.
Continuous Improvement: Participate in the continuous improvement of development processes and deployment practices.

Required Qualifications:
Python Expertise: Strong experience in Python programming, including using libraries like Pandas, NumPy, and Boto3 (the AWS SDK for Python), and frameworks like Flask or Django.
AWS Knowledge: Hands-on experience with AWS services such as S3, EC2, Lambda, RDS, DynamoDB, CloudFormation, and API Gateway.
Cloud Infrastructure: Experience in designing, deploying, and maintaining cloud-based applications using AWS.
API Development: Experience in designing and developing RESTful APIs, integrating with external services, and managing data exchanges.
Automation & Scripting: Experience with automation tools and scripts (e.g., using AWS Lambda, Boto3, CloudFormation).
Version Control: Proficiency with version control tools such as Git.
CI/CD Pipelines: Experience building and maintaining CI/CD pipelines for cloud-based applications.

Preferred Candidate Profile:
Familiarity with serverless architectures using AWS Lambda and other AWS serverless services.
AWS Certification (e.g., AWS Certified Developer Associate, AWS Certified Solutions Architect Associate) is a plus.
Knowledge of containerization tools like Docker and orchestration platforms such as Kubernetes.
Experience with Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation.
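A common task in Python-plus-AWS roles like the one above is handling S3 event notifications in Lambda. One detail that regularly trips people up: object keys arrive URL-encoded, so they must be decoded before use with Boto3. The sketch below shows just that decoding step with stdlib only; the event shape follows S3's notification format, and the bucket/key values are made up for illustration.

```python
from urllib.parse import unquote_plus


def extract_s3_objects(event):
    """Return (bucket, key) pairs from an S3-notification-shaped event.
    Keys arrive URL-encoded (spaces become '+'), so they must be decoded
    before being passed to calls such as s3.get_object."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        bucket = s3["bucket"]["name"]
        key = unquote_plus(s3["object"]["key"])
        objects.append((bucket, key))
    return objects


if __name__ == "__main__":
    # Hypothetical event: one object whose key contains an encoded space.
    sample_event = {
        "Records": [
            {"s3": {"bucket": {"name": "demo-bucket"},
                    "object": {"key": "reports/2024+q1.csv"}}}
        ]
    }
    print(extract_s3_objects(sample_event))  # [('demo-bucket', 'reports/2024 q1.csv')]
```

Isolating the parsing from the AWS calls keeps the handler testable without credentials or mocked clients.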

Posted 1 month ago

Apply

5 - 10 years

15 - 30 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid


AWS Cloud Engineer CEBE

Why HCLTech? HCLTech is a next-generation global technology company that helps enterprises reimagine their businesses for the digital age. Our belief in the values of trust, transparency, flexibility and value-centricity, fueled by our philosophy of 'Employees First', ensures the continued pursuit of our customers' best interests.

What is HCLSoftware? HCLSoftware is the software business division of HCLTech, fueling the Digital+ Economy by developing, sharing and supporting solutions in five key areas: Business & Industry Applications, Intelligent Operations, Total Experience, Data Analytics and Cybersecurity. We develop, market, sell, and support over 20 product families. We have offices and labs around the world to serve thousands of customers. Our mission is to drive customer success with our relentless product innovation at more than 20,000 organizations in every region of the world, including more than half of the Fortune 1000 and Global 2000 companies.

Which team will you be working in? You will be working in the Cloud Engineering and Business Experience (CeBe) team within HCLSoftware. The HCLSoftware CeBe team drives the cloud-native strategy for HCL Software. We innovate with new technologies and apply them to the HCLSoftware portfolio. The team is distributed across several locations in India, Europe and the USA.

Senior Software Engineer III
We are looking for an AWS Cloud Engineer who designs, implements, and manages cloud infrastructure on Amazon Web Services (AWS), ensuring high availability, scalability, and performance. You should:
Be familiar with a wide range of AWS services, including compute (EC2), storage (S3, EBS), databases (RDS, DynamoDB), networking (VPC), and security (IAM, WAF).
Have strong hands-on experience with Node.js, AWS Lambda, DynamoDB, and S3 storage, and be willing to work on other technologies as needed.
Have experience with infrastructure-as-code tools like CloudFormation or Terraform.
Mentor and guide large development teams from a technology perspective, suggesting multiple solutions to developer issues with a problem-solving mindset.
Be responsible for translating business requirements into technical solutions, focusing on serverless architectures and IaC practices.
Have cross-functional coordination experience with groups such as QA, AppOps, and Release Engineering.
Have strong oral and written communication skills.
Have a good attitude and eagerness to learn.
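The posting above pairs Lambda with DynamoDB. One concrete piece of knowledge such roles assume is DynamoDB's low-level attribute-value format, where every value is tagged with its type ("S", "N", "BOOL", ...). The sketch below converts a plain dict into a subset of that format; the item fields are illustrative, and real code would typically let an SDK resource layer do this instead.

```python
def to_dynamodb_item(obj):
    """Convert a flat Python dict into DynamoDB's low-level attribute-value
    format, as used by the PutItem API (subset: strings, numbers, booleans)."""
    def encode(value):
        # bool must be checked before int, since bool subclasses int.
        if isinstance(value, bool):
            return {"BOOL": value}
        if isinstance(value, (int, float)):
            return {"N": str(value)}  # DynamoDB numbers travel as strings
        if isinstance(value, str):
            return {"S": value}
        raise TypeError(f"unsupported type: {type(value).__name__}")

    return {key: encode(value) for key, value in obj.items()}


if __name__ == "__main__":
    # Hypothetical item for illustration.
    item = to_dynamodb_item({"pk": "user#1", "age": 30, "active": True})
    print(item)
```

Understanding this wire format helps when debugging raw API responses or writing CloudWatch log queries against Lambda functions that talk to DynamoDB directly.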

Posted 1 month ago

Apply

6 - 11 years

15 - 30 Lacs

Bengaluru, Hyderabad, Gurgaon

Work from Office


We're Hiring: Sr. AWS Data Engineer – GSPANN Technologies
Locations: Bangalore, Pune, Hyderabad, Gurugram
Experience: 6+ Years | Immediate Joiners Only
Looking for experts in:
AWS Services: Glue, Redshift, S3, Lambda, Athena
Big Data: Spark, Hadoop, Kafka
Languages: Python, SQL, Scala
ETL & Data Engineering
Apply now: heena.ruchwani@gspann.com
#AWSDataEngineer #HiringNow #DataEngineering #GSPANN

Posted 1 month ago

Apply