
922 AWS Lambda Jobs - Page 15

JobPe aggregates listings for easy access, but you apply for each job directly on the original job portal.

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), AWS Glue, AWS Lambda Administration
Minimum experience: 5 years
Educational Qualification: Graduate

Summary: As a Snowflake Data Warehouse Architect, you will be responsible for leading the implementation of Infrastructure Services projects, leveraging our global delivery capability. Your typical day will involve working with Snowflake Data Warehouse, AWS Glue, AWS Lambda Administration, and the Python programming language.

Roles & Responsibilities:
- Lead the design and implementation of Snowflake Data Warehouse solutions for Infrastructure Services projects.
- Collaborate with cross-functional teams to ensure successful delivery of projects, leveraging AWS Glue and AWS Lambda Administration.
- Provide technical guidance and mentorship to junior team members.
- Stay updated on the latest advancements in Snowflake Data Warehouse and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have skills: strong experience with Snowflake Data Warehouse.
- Good-to-have skills: proficiency in the Python programming language, AWS Glue, and AWS Lambda Administration.
- Experience leading the design and implementation of Snowflake Data Warehouse solutions.
- Strong understanding of data architecture principles and best practices.
- Experience in data modeling, data integration, and data warehousing.
- Experience in performance tuning and optimization of Snowflake Data Warehouse solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualification: Graduate

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Gurugram

Work from Office

Responsibilities:
* Design, develop & maintain data pipelines using ETL, Python, PySpark & AWS tools.
* Collaborate with cross-functional teams on project requirements & deliverables.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office

- 3+ years in AWS cloud computing, 5+ years in DevOps
- Strong in EC2, VPC, load balancing, Auto Scaling
- Hands-on with AWS Lambda, CloudFormation, CI/CD, Python, Bash, PowerShell
- Knowledge of IAM, KMS, security, monitoring, networking, blue-green & canary deployments, Neo4j
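The blue-green and canary deployment knowledge this posting asks for can be illustrated with a minimal sketch: canary rollouts route a small, stable fraction of traffic to the new version, and hashing a user id into a bucket keeps each user on the same version for the whole rollout. The function and field names below are invented for illustration, not from any specific deployment tool.

```python
import hashlib

def canary_bucket(user_id: str, canary_percent: int) -> str:
    """Deterministically route a user to the 'canary' or 'stable' version.

    Hashing the user id gives a stable bucket in 0-99, so the same user
    always sees the same version while the rollout percentage is fixed.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100  # map the hash onto 0..99
    return "canary" if bucket < canary_percent else "stable"

# At 0% everyone stays on stable; at 100% everyone is on the canary;
# in between, roughly canary_percent% of users land on the new version.
print(canary_bucket("user-42", 0), canary_bucket("user-42", 100))
```

In AWS this per-request decision is usually delegated to weighted Lambda aliases or weighted target groups rather than hand-rolled, but the bucketing idea is the same.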

Posted 1 month ago

Apply

0.0 - 1.0 years

4 Lacs

Noida

Work from Office

Freshers vacancy in full-stack software development. Preference will be given to candidates with good knowledge and some hands-on coding practice of:
- React, Vue, or Next.js
- Node.js, JavaScript, TypeScript
- MySQL, MongoDB, or DynamoDB
- RESTful services
Good to have: knowledge of AWS Lambda and API Gateway; working in a Linux environment
Tools used: Jira, Confluence, Git, IntelliJ, OS X, JSON, Agile development
Role & responsibilities: Code - Eat - Repeat
Other details:
- Excellent logical & problem-solving skills
- BE/B.Tech/BCA/MCA (2024 or 2025 completion) from a recognized institute with a good academic score
- Mandatory 30-month retention (non-negotiable)
- In-office vacancy at the Sector 135, Noida office

Posted 1 month ago

Apply

8.0 - 13.0 years

27 - 35 Lacs

Kochi, Bengaluru

Work from Office

About Us: DBiz Solution is a transformational partner. Digital transformation is intense. We'd like you to have something to hold on to while you set out bringing your ideas into existence. Beyond anything, we put humans first. This means solving real problems with real people and meeting real needs with real, working solutions. DBiz leverages a wealth of experience building a variety of software to improve our clients' ability to respond to change and build tomorrow's digital business. We're quite proud of our record of accomplishment: having delivered over 150 projects for over 100 clients, we can honestly say we leave our clients happy and wanting more. Using data, we aim to unlock value and create platforms/products at scale that can evolve with business strategies using our innovative Rapid Application Development methodologies.

The passion for creating an impact: Our passion for creating an impact drives everything we do. We believe that technology has the power to transform businesses and improve lives, and it is our mission to harness this power to make a difference. We constantly strive to innovate and deliver solutions that not only meet our clients' needs but exceed their expectations, allowing them to achieve their goals and drive sustainable growth. Through our world-leading digital transformation strategies, we are always growing and improving. That means creating an environment where every one of us can strive together for excellence.

Senior Data Engineer - AWS (Glue, Data Warehousing, Optimization & Security)
We are looking for an experienced Senior Data Engineer (8+ years) with deep expertise in AWS cloud data services, particularly AWS Glue, to design, build, and optimize scalable data solutions. The ideal candidate will drive end-to-end data engineering initiatives from ingestion to consumption, with a strong focus on data warehousing, performance optimization, self-service enablement, and data security. The candidate needs experience in consulting and troubleshooting engagements to design best-fit solutions.

Key Responsibilities:
- Consult with business and technology stakeholders to understand data requirements, troubleshoot, and advise on best-fit AWS data solutions
- Design and implement scalable ETL pipelines using AWS Glue, handling structured and semi-structured data
- Architect and manage modern cloud data warehouses (e.g., Amazon Redshift, Snowflake, or equivalent)
- Optimize data pipelines and queries for performance, cost-efficiency, and scalability
- Develop solutions that enable self-service analytics for business and data science teams
- Implement data security, governance, and access controls
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs
- Monitor, troubleshoot, and improve existing data solutions, ensuring high availability and reliability

Required Skills & Experience:
- 8+ years of data engineering experience on the AWS platform
- Strong hands-on experience with AWS Glue, Lambda, S3, Athena, Redshift, and IAM
- Proven expertise in data modelling, data warehousing concepts, and SQL optimization
- Experience designing self-service data platforms for business users
- Solid understanding of data security, encryption, and access management
- Proficiency in Python
- Familiarity with DevOps practices & CI/CD
- Strong problem-solving skills
- Exposure to BI tools (e.g., QuickSight, Power BI, Tableau) for self-service enablement

Preferred Qualifications:
- AWS Certified Data Analytics – Specialty or Solutions Architect – Associate
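The Glue ETL work described above typically begins by flattening semi-structured records into warehouse-friendly columns. A real Glue job would do this with PySpark and the awsglue library; the following is a dependency-free sketch of just the flattening step, with made-up field names.

```python
def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten nested JSON-style records into column-like keys, the kind
    of shaping an ETL job does before loading semi-structured data into
    a warehouse table (e.g. in Redshift or Snowflake)."""
    flat = {}
    for key, value in record.items():
        column = key if not prefix else f"{prefix}_{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, column))  # recurse into nested objects
        else:
            flat[column] = value
    return flat

# Hypothetical event payload:
event = {"order_id": 7, "customer": {"id": "c-1", "city": "Pune"}}
print(flatten(event))  # {'order_id': 7, 'customer_id': 'c-1', 'customer_city': 'Pune'}
```

In Glue itself the equivalent operation is often a DynamicFrame relationalize/unnest step rather than hand-written recursion.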

Posted 1 month ago

Apply

12.0 - 17.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Work Location: Bangalore
Experience: 10+ years
Required Skills:
- Experience with AWS cloud and AWS services such as S3 buckets, Lambda, API Gateway, and SQS queues
- Experience with batch job scheduling and identifying data/job dependencies
- Experience with data engineering using the AWS platform and Python
- Familiarity with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway
- Familiarity with software DevOps CI/CD tools such as Git, Jenkins, Linux, and shell scripting

Thanks & Regards,
Suganya R
suganya@spstaffing.in

Posted 1 month ago

Apply

7.0 - 10.0 years

35 - 50 Lacs

Pune

Hybrid

Key Responsibilities:
- Develop software solutions by studying information needs, conferring with users, studying systems flow, data usage, and work processes, investigating problem areas, and following the software development lifecycle.
- Document and demonstrate solutions by developing flow charts, layouts, and documentation.
- Determine feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
- Understand business needs and create tools to manage them.
- Prepare and install solutions by determining and designing system specifications, standards, and programming.
- Recommend state-of-the-art development tools, programming techniques, and computing equipment.
- Participate in educational opportunities, read professional publications, maintain personal networks, and participate in professional organizations.
- Provide information by collecting, analyzing, and summarizing development and issues while protecting IT assets by keeping information confidential.
- Improve applications by conducting systems analysis and recommending changes in policies and procedures.
- Define applications and their interfaces, allocate responsibilities to applications, understand solution deployment, and communicate requirements for interactions with the solution context.
- Define non-functional requirements (NFRs).
- Understand multiple architectures and how to apply architecture to solutions.
- Provide oversight and foster built-in quality and team and technical agility.
- Adopt new mindsets and habits in how people approach their work while supporting decentralized decision-making.
- Maintain strong relationships to deliver business value using relevant Business Relationship Management practices.

Skills and Experience:
- 8-10 years of software engineering in a global environment.
- Proficiency in Java and basic Python.
- AWS experience (other cloud experience can be substituted), with a preference for 3+ years of AWS experience.
- Experience with EC2, Lambda, SQS, API Gateway, Kinesis, S3, CloudFront, and CloudWatch.
- Exposure to serverless architecture and infrastructure as code (CloudFormation/Terraform).
- Experience with DynamoDB (or a similar NoSQL database).
- Strong SQL skills and experience with relational database management systems (RDBMS).
- Experience working in Agile environments.
- Fundamental IT technical skill sets.
- Experience taking a system from scoping requirements through actual launch.
- Ability to communicate with users, other technical teams, and management to collect requirements, identify tasks, provide estimates, and meet production deadlines.
- Professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
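The posting pairs serverless architecture with infrastructure as code (CloudFormation/Terraform). As a hedged illustration of what that looks like in practice, here is a minimal AWS SAM template (SAM compiles down to CloudFormation) wiring a Python Lambda to an API Gateway route; the resource name, handler module, and path are placeholders, not from the posting.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  # Hypothetical function; app.handler and /orders are placeholders.
  OrdersFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler
      Runtime: python3.12
      MemorySize: 256
      Timeout: 10
      Events:
        GetOrders:
          Type: Api
          Properties:
            Path: /orders
            Method: get
```

Deploying such a template (`sam build && sam deploy`) provisions the function, the REST API, and the permissions linking them, which is the repeatability benefit IaC brings over console clicks.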

Posted 1 month ago

Apply

12.0 - 16.0 years

15 - 20 Lacs

Pune

Hybrid

Key Responsibilities:

Application Analysis and Configuration:
- Lead the analysis and evaluation of application functionality to enhance business processes and capabilities.
- Determine and document the optimal application setup and configuration to meet functional requirements.
- Configure and verify complex application setups and configurations.

Solution Development and Support:
- Stay current with emerging application trends and functionality, providing functional expertise for assigned applications and systems.
- Analyze potential application solutions, identifying and recommending solutions for functionality gaps.
- Develop and maintain strong relationships with vendors, improving application functionality and resolving complex issues.

Collaboration and Documentation:
- Partner with process owners, business analysts, systems analysts, and architects to gather, document, and review requirements.
- Serve as the subject matter expert for assigned applications, leading the creation and management of functional specifications for projects.
- Educate, coach, and train team members to improve support request handling and overall efficiency.

Functional Expertise and Support:
- Act as a point of contact for application content, processes, and functionality.
- Lead the implementation of Step 3 improvements after obtaining necessary approvals.
- Ensure strong business value delivery using Business Relationship Management practices.

External Qualifications and Competencies:
- Business Insight: applying market knowledge to advance organizational goals.
- Customer Focus: building strong relationships and delivering customer-centric solutions.
- Decision Quality: making timely and effective decisions.
- Develops Talent: mentoring team members to meet career and organizational goals.
- Directs Work: providing direction and removing obstacles to achieve objectives.
- Strategic Mindset: anticipating future possibilities and developing breakthrough strategies.
- Tech Savvy: adopting innovations in technology applications.
- Solution Configuration: creating and testing solutions using industry standards.
- Solution Design and Modeling: designing and documenting solutions to meet business and compliance requirements.
- Values Differences: recognizing and leveraging the value of diverse perspectives and cultures.

Education, Licenses, Certifications: BE/B.Tech in a relevant field.
Experience: 12+ years of significant relevant experience in application development, configuration, and support.

Additional Responsibilities Unique to this Position:

Technical Expertise:
- Deep understanding of AWS services (e.g., AWS Lambda, API Gateway, S3, EC2, RDS).
- Proficiency with AWS SDKs, the CLI, and CI/CD pipelines (AWS CodePipeline, CodeBuild).
- Experience with relational (PostgreSQL, MySQL) and NoSQL (DynamoDB) databases.
- Knowledge of disaster recovery strategies and performance monitoring for distributed web applications.
- Strong programming skills in Python, Node.js, and Java.

Application Development and Configuration:
- Proficiency in Angular/React and understanding of dependency injection, authentication, and authorization patterns.
- Experience with unit testing frameworks and documentation of technical concepts.

Project Management and Agile Methodologies:
- Experience with Agile development methodologies and collaboration tools (Jira, Confluence).
- Ability to lead and manage projects, ensuring alignment with business, technical, and compliance requirements.

Posted 1 month ago

Apply

6.0 - 8.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Transnational AI: Architect and develop scalable backend systems using Python and FastAPI. Collaborate with data scientists to productionize AI/ML models. Deploy intelligent services using AWS Lambda, API Gateway, DynamoDB, etc.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Why this job matters: Cloud Native Java Developer, to individually contribute to and drive the transformation of our existing Java microservices deployed on Amazon Elastic Kubernetes Service (EKS) to serverless AWS Lambda functions. The roles and responsibilities are below.

What you'll be doing

Key Responsibilities:
- Develop and deploy serverless applications using Quarkus/Spring Boot and AWS Lambda
- Build RESTful APIs and event-driven microservices using cloud-native patterns
- Optimize cold-start performance using GraalVM native images
- Integrate with AWS services such as AWS API Gateway, S3, DynamoDB, CloudWatch, and Postgres
- Implement and manage Lambda authorizers (custom and token-based) for securing APIs
- Design and configure AWS API Gateway for routing, throttling, and securing endpoints
- Integrate OAuth 2.0 authentication flows using Azure Active Directory as the identity provider
- Decent understanding of resilience patterns
- Write unit and integration tests using JUnit, Mockito, and Quarkus testing tools
- Collaborate with DevOps teams to automate deployments using AWS SAM, CDK, or Terraform
- Monitor and troubleshoot production issues using AWS observability tools

Migration Responsibilities:
- Analyse existing Spring Boot microservices deployed on Kubernetes to identify candidates for serverless migration
- Refactor services to be stateless, event-driven, and optimized for short-lived execution
- Replace Kubernetes ingress and service discovery with API Gateway and Lambda triggers
- Migrate persistent state and configuration to AWS-native services (e.g., DynamoDB, S3, Secrets Manager)
- Redesign CI/CD pipelines to support serverless deployment workflows
- Ensure performance, cost-efficiency, and scalability in the new architecture
- Document migration strategies, patterns, and best practices for future reference

Technical Proficiency:
- Strong industry experience of 4+ years, with command of Java 8+ and a deep understanding of:
  - Functional interfaces (Function, Predicate, Supplier, Consumer)
  - The Streams API, lambda expressions, and Optional
- Proficiency in Java concurrency, including:
  - Thread management, ExecutorService, CompletableFuture, and parallel streams
  - Designing thread-safe components and understanding concurrency pitfalls
- Understanding of AWS EKS (Elastic Kubernetes Service), Docker containers, and Kubernetes fundamentals:
  - Experience with resource requests and limits, pod autoscaling, and K8s networking
  - Familiarity with transitioning workloads from EKS to serverless environments.
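The Lambda authorizers mentioned above (custom and token-based) answer API Gateway with an IAM policy that allows or denies the invoked method. A minimal sketch of a token-based authorizer follows, shown in Python for brevity even though this role is Java-centric; the shared-secret comparison is a stand-in for real JWT/OAuth validation against an identity provider, and the `API_TOKEN` environment variable name is an assumption.

```python
import os

def authorizer_handler(event, context=None):
    """Sketch of a token-based API Gateway Lambda authorizer.

    For TOKEN authorizers, API Gateway passes the caller's token in
    event['authorizationToken'] and the invoked method's ARN in
    event['methodArn']; the authorizer replies with an IAM policy.
    """
    token = event.get("authorizationToken", "")
    # Placeholder check; real code would validate a JWT's signature/claims.
    effect = "Allow" if token == os.environ.get("API_TOKEN", "secret") else "Deny"
    return {
        "principalId": "user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event.get("methodArn", "*"),
            }],
        },
    }
```

API Gateway caches the returned policy per token for a configurable TTL, which is why authorizers are written to be fast and stateless.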

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad, Pune, Chennai

Work from Office

This role requires solid expertise in .NET, ReactJS, AWS, and PostgreSQL or SQL. Candidates with healthcare domain exposure will be preferred. Experience with PHP or CodeIgniter is an added advantage but not mandatory.

Posted 1 month ago

Apply

6.0 - 11.0 years

20 - 27 Lacs

Bengaluru

Work from Office

Job Description:
- Experience in Java, J2EE, and Spring Boot.
- Experience in design, Kubernetes, and AWS (Lambda, EKS, EC2) is needed.
- Experience with AWS cloud monitoring tools such as Datadog, CloudWatch, and Lambda is needed.
- Experience with XACML authorization policies.
- Experience with NoSQL and SQL databases such as Cassandra, Aurora, and Oracle.
- Experience with Web Services / SOA (SOAP as well as RESTful with JSON formats) and with messaging (Kafka).
- Hands-on with development and test automation tools/frameworks (e.g., BDD and Cucumber).

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office

We Are Hiring Immediate Joiners!
Job Title: Python Developer
Experience Level: 4-8 years
Location: Bangalore - Koramangala
Salary: 10 to 20 LPA
Mandatory Skills: AI/ML, AWS, Python, Microservices
Notice Period: 0 to 15 days max

- Design and develop microservices using Python
- Deploy microservices as AWS Lambda functions, packaged as Docker images when necessary
- Integrate with various data sources using Python SDKs or REST APIs
- Implement WebSocket communication in Python services
- Work with libraries to extract data from PDFs, images, and Word documents
- (Good to have) Experience with Retrieval-Augmented Generation (RAG) frameworks like LangChain
- Interact with AWS RDS (PostgreSQL) via Python scripts
- Use libraries such as pandas and numpy for data cleansing and transformation
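The last responsibility above, cleansing and transforming data before it is stored or served, would normally be done with pandas and numpy as the posting says. As a dependency-free sketch of the same kind of step (trim whitespace, drop incomplete records, normalise numeric strings), with invented field names:

```python
def cleanse(rows):
    """Toy cleansing/transformation pass over raw records.

    Mirrors what a pandas pipeline might do with strip(), dropna(),
    and astype(float), but using only the standard library.
    """
    cleaned = []
    for row in rows:
        name = (row.get("name") or "").strip()
        amount = row.get("amount")
        if not name or amount in (None, ""):
            continue  # drop rows missing required fields
        cleaned.append({"name": name, "amount": float(amount)})
    return cleaned

raw = [{"name": "  Asha ", "amount": "42.5"},
       {"name": "", "amount": "10"},
       {"name": "Ravi", "amount": None}]
print(cleanse(raw))  # [{'name': 'Asha', 'amount': 42.5}]
```

In a Lambda-based microservice this function would sit inside the handler, between reading the raw payload and writing to RDS.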

Posted 1 month ago

Apply

6.0 - 11.0 years

20 - 32 Lacs

Bengaluru

Work from Office

Role & responsibilities:
- Develop product features in a full-stack arena using React, Node.js, AWS, AWS Lambda, and DynamoDB
- Develop in micro-frontend SPA architectures
- Integrate with composable web platforms like Netlify, Vercel, etc.
- Translate business requirements into technical ones, with very strong communication skills
- Guide junior colleagues and help them develop their skills and qualities
- Help build a strong customer data platform enabling extreme personalization across all digital touchpoints
- Help the team build scalable, easy-to-maintain software to support millions of users and transactions

Preferred candidate profile:
- Good experience building front-end apps in JavaScript, ReactJS, and Node.js, as well as experience on backend systems using object-oriented programming languages
- Knowledge of micro-frontend architecture
- Experience with BFF design patterns
- A creative and precise problem solver and a quick learner, adapting to changing requirements in a fast-paced environment
- Digital product/UX understanding
- AWS experience
- CI/CD Jenkins pipeline configuration
- Bachelor's degree in Computer Science or Computer Engineering

Posted 1 month ago

Apply

6.0 - 11.0 years

20 - 32 Lacs

Bengaluru

Work from Office

Role & responsibilities:
- Develop product features in a full-stack arena using React, Next.js, Node.js, AWS, AWS Lambda, and DynamoDB
- Develop in micro-frontend SPA architectures
- Integrate with composable web platforms like Netlify, Vercel, etc.
- Translate business requirements into technical ones, with very strong communication skills
- Guide junior colleagues and help them develop their skills and qualities
- Help build a strong customer data platform enabling extreme personalization across all digital touchpoints
- Help the team build scalable, easy-to-maintain software to support millions of users and transactions

Preferred candidate profile:
- Good experience building front-end apps in JavaScript, Next.js, ReactJS, and Node.js, as well as experience on backend systems using object-oriented programming languages
- Knowledge of micro-frontend architecture
- Experience with BFF design patterns
- A creative and precise problem solver and a quick learner, adapting to changing requirements in a fast-paced environment
- Digital product/UX understanding
- AWS experience
- CI/CD Jenkins pipeline configuration
- Bachelor's degree in Computer Science or Computer Engineering

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 18 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

About the Role: We are seeking a passionate and experienced subject matter expert and trainer to deliver our comprehensive Data Engineering with AWS program. This role combines deep technical expertise with the ability to coach, mentor, and empower learners to build strong capabilities in data engineering, cloud services, and modern analytics tools. If you have a strong background in data engineering and love to teach, this is your opportunity to create impact by shaping the next generation of cloud data professionals.

Key Responsibilities:
Deliver end-to-end training on the Data Engineering with AWS curriculum, including:
- Oracle SQL and ANSI SQL
- Data warehousing concepts, ETL & ELT
- Data modeling and Data Vault
- Python programming for data engineering
- AWS fundamentals (EC2, S3, Glue, Redshift, Athena, Kinesis, etc.)
- Apache Spark and Databricks
- Data ingestion, processing, and migration utilities
- Real-time analytics and compute services (Airflow, Step Functions)
Further responsibilities:
- Facilitate engaging sessions, virtual and in-person, and adapt instructional methods to suit diverse learning styles.
- Guide learners through hands-on labs, coding exercises, and real-world projects.
- Assess learner progress through evaluations, assignments, and practical assessments.
- Provide mentorship, resolve doubts, and inspire confidence in learners.
- Collaborate with the program management team to continuously improve course delivery and learner experience.
- Maintain up-to-date knowledge of AWS and data engineering best practices.

Ideal Candidate Profile:
- Experience: minimum 5-8 years in data engineering, big data, or cloud data solutions. Prior experience delivering technical training or conducting workshops is strongly preferred.
- Technical expertise: proficiency in SQL, Python, and Spark; hands-on experience with AWS services (Glue, Redshift, Athena, S3, EC2, Kinesis, and related tools); familiarity with Databricks, Airflow, Step Functions, and modern data pipelines.
- Certifications: AWS certifications (e.g., AWS Certified Data Analytics - Specialty) are a plus.
- Soft skills: excellent communication, facilitation, and interpersonal skills; ability to break down complex concepts into simple, relatable examples; strong commitment to learner success and outcomes.

Email your application to: careers@edubridgeindia.in

Posted 1 month ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Capgemini Invent: Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role:
- Should have developed or worked on at least one Gen AI project.
- Has data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing, and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, S3.
- Has good knowledge of cloud compute services and load balancing.
- Has good knowledge of cloud identity management, authentication, and authorization.
- Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, Azure Functions.
- Experience in using cloud data integration services for structured, semi-structured, and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, Dataproc.

Your Profile:
- Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs. performance and scaling.
- Able to contribute to making architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on the cloud.
- Must understand networking, security, design principles, and best practices in the cloud.

What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity? Must-have skills: Data Governance, Lakehouse architecture, Medallion architecture, Azure Databricks, Azure Synapse, Data Lake Storage, Azure Data Factory

Intelebee LLC is looking for a Data Engineer: We are seeking a skilled and hands-on cloud data engineer with 5-8 years of experience to drive end-to-end data engineering solutions. The ideal candidate will have a deep understanding of dimensional modeling, data warehousing (DW), Lakehouse architecture, and the Medallion architecture. This role will focus on leveraging the Azure/AWS ecosystem to build scalable, efficient, and secure data solutions. You will work closely with customers to understand requirements, create technical specifications, and deliver solutions that scale across both on-premise and cloud environments.

Key Responsibilities (end-to-end data engineering):
- Lead the design and development of data pipelines for large-scale data processing, utilizing Azure/AWS tools such as Azure Data Factory, Azure Synapse, Azure Functions, Logic Apps, Azure Databricks, and Data Lake Storage, as well as AWS Lambda and AWS Glue.
- Develop and implement dimensional modeling techniques and data warehousing solutions for effective data analysis and reporting.
- Build and maintain Lakehouse and Medallion architecture solutions for streamlined, high-performance data processing.
- Implement and manage data lakes on Azure/AWS, ensuring that data storage and processing is both scalable and secure.
- Handle large-scale databases (both on-prem and cloud), ensuring high availability, security, and performance.
- Design and enforce data governance policies for data security, privacy, and compliance within the Azure ecosystem.
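The Medallion architecture named in the must-have skills organizes a lakehouse into bronze (raw, as-landed), silver (parsed and validated), and gold (business-level aggregates) layers. A toy, dependency-free sketch of that flow, with invented fields; real pipelines would run these stages in Databricks/Spark over lake storage rather than in-memory lists:

```python
import json
from collections import defaultdict

def to_silver(bronze_lines):
    """Bronze -> silver: parse raw JSON lines and keep only valid events."""
    silver = []
    for line in bronze_lines:
        try:
            event = json.loads(line)
        except json.JSONDecodeError:
            continue  # a real pipeline would quarantine bad records
        if event.get("amount") is not None:
            silver.append(event)
    return silver

def to_gold(silver):
    """Silver -> gold: a business-level aggregate (revenue per city)."""
    totals = defaultdict(float)
    for event in silver:
        totals[event.get("city", "unknown")] += float(event["amount"])
    return dict(totals)

bronze = ['{"city": "Pune", "amount": 10}',
          '{"city": "Pune", "amount": 5}',
          'not-json',
          '{"city": "Mumbai", "amount": 7}']
print(to_gold(to_silver(bronze)))  # {'Pune': 15.0, 'Mumbai': 7.0}
```

The point of the layering is that each stage is re-runnable from the one below it: if a gold metric is wrong, you rebuild it from silver without re-ingesting raw data.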

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 13 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office

Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote What do you need for this opportunity? Must have skills required: ML, Python Looking for: Were looking for a hands-on engineering lead to own the delivery of our GenAI-centric product from the backend up to the UI while integrating visual AI pipelines built by ML engineers. Youll be both a builder and a leader: writing clean Python, Java and TypeScript, scaling AWS-based systems, mentoring engineers, and making architectural decisions that stand the test of scale. You wont be working in a silo this is a role for someone who thrives in fast-paced, high-context environments with product, design, and AI deeply intertwined. (Note: This role requires both technical mastery and leadership skills - we're looking for someone who can write production code, make architectural decisions, and lead a team to success.) What Youll Do Lead development of our Java, Python (FastAPI), and Node.js backend services on AWS Deploy ML pipelines (built by the ML team) into containerized inference workflows using FastAPI, Docker, and GPU-enabled ECS EC2. 
Responsibilities
- Deploy and manage services on AWS ECS/Fargate, Lambda, API Gateway, and GPU-powered EC2
- Contribute to the React/TypeScript frontend when needed to accelerate product delivery
- Work closely with the founder, product, and UX teams to translate business needs into working product
- Make architecture and infrastructure decisions, from media processing to task queues to storage
- Own the performance, reliability, and cost-efficiency of our core services
- Hire and mentor junior/mid-level engineers over time
- Drive technical planning, sprint prioritization, and trade-off decisions

What We Look For
- A customer-centric approach: you think about how your work affects end users and the product experience, not just model performance
- A quest for high-quality deliverables: you write clean, tested code and debug edge cases until they're truly fixed
- The ability to frame problems from scratch and work without strict handoffs: you build from a goal, not a ticket

Skills & Experience We Expect

Core Engineering Experience
- 6-8 years of professional software engineering experience in production environments
- 2-3 years of experience leading engineering teams of 5+ engineers

Cloud Infrastructure & AWS Expertise (5+ years)
- Deep experience with AWS Lambda, ECS, and container orchestration tools
- Familiarity with API Gateway and microservices architecture best practices
- Proficiency with S3, DynamoDB, and other AWS-native data services
- CloudWatch, X-Ray, or similar tools for monitoring and debugging distributed systems
- Strong grasp of IAM roles and security best practices in cloud environments

Backend Development (5-7 years)
- Java: advanced concurrency, scalability, and microservice design
- Python: experience with FastAPI and building production-grade MLOps pipelines
- Node.js & TypeScript: strong backend engineering and API development
- Deep understanding of RESTful API design and implementation
- Docker: 3+ years of containerization experience for building/deploying services
- 2+ years of hands-on experience deploying ML inference pipelines (built by the ML team) using Docker, FastAPI, and GPU-based AWS infrastructure (e.g., ECS, EC2)

System Optimization & Middleware (3-5 years)
- Application performance optimization and AWS cloud cost optimization
- Use of background job frameworks (e.g., Celery, BullMQ, AWS Step Functions)
- Media/image processing using tools like Sharp, PIL, ImageMagick, or OpenCV
- Database design and optimization for low-latency, high-availability systems

Frontend Development (2-3 years)
- Hands-on experience with React and TypeScript in modern web apps
- Familiarity with Redux, Context API, and modern state management patterns
- Comfortable with modern build tools, CI/CD, and frontend deployment practices

System Design & Architecture (4-6 years)
- Designing and implementing microservices-based systems
- Experience with event-driven architectures using queues or pub/sub
- Implementing caching strategies (e.g., Redis, CDN edge caching)
- Architecting high-performance image/media pipelines

Leadership & Communication (2-3 years)
- Proven ability to lead engineering teams and drive project delivery
- Skilled at writing clear and concise technical documentation
- Experience mentoring engineers, conducting code reviews, and fostering growth
- Track record of shipping high-impact products in fast-paced environments
- Strong customer-centric, growth-oriented mindset, especially in startup settings: able to take high-level goals and independently drive toward outcomes without constant handoffs or back-and-forth with the founder
- Proactive in using tools like ChatGPT, GitHub Copilot, or similar AI copilots to improve personal and team efficiency, remove blockers, and iterate faster
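Several postings on this page, including the one above, ask for caching strategies such as Redis or CDN edge caching. As an illustrative stand-in for a Redis-backed cache (not any specific employer's implementation), a minimal in-process cache with per-key expiry works like this:

```python
import time

class TTLCache:
    """Tiny in-memory cache with per-key expiry, mimicking Redis SET ... EX."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction on read
            return None
        return value

cache = TTLCache()
cache.set("user:42", {"name": "Asha"}, ttl_seconds=60)
print(cache.get("user:42"))  # cached value while the TTL is live
```

In production the same get/set-with-TTL pattern is what Redis provides across processes, with eviction handled server-side.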

Posted 1 month ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Hyderabad

Work from Office

Roles and Responsibilities
- Lead backend development for our AI-based product, driving architectural decisions and hands-on implementation
- Design and develop scalable, secure, and maintainable APIs using AWS Lambda and API Gateway
- Build and maintain CI/CD pipelines using AWS-native tools (CodePipeline, CodeBuild) and GitHub
- Collaborate with frontend developers (React/MUI) to ensure seamless integration between frontend and backend systems
- Work closely with AWS and infrastructure teams to implement best practices in performance, security, and cost optimization
- Review code, provide technical guidance to junior developers, and drive high engineering standards
- Participate in sprint planning, estimations, and cross-functional discussions

An Ideal Candidate Would Have
- Strong programming skills in Python, with experience building production-grade applications
- Proven experience with AWS Lambda, API Gateway, and other serverless components
- Deep understanding of RESTful API design and development
- Hands-on experience setting up CI/CD pipelines using AWS services and GitHub
- Familiarity with event-driven architectures, cloud deployments, and security best practices
- Experience working with Agile/Scrum methodologies
- Strong communication and leadership skills to coordinate across cross-functional teams

Good to Have
- Exposure to AI/ML pipelines, vector databases, or model-serving workflows
- Experience with AWS Step Functions, DynamoDB, S3, CloudWatch, and CloudFormation
- Knowledge of observability tools (e.g., X-Ray, Prometheus, Grafana)
- Familiarity with frontend architecture and integration patterns

Experience: 8+ years, with at least 2 years in a lead capacity
Location: Hyderabad, India
Role: Full Time
Salary: Competitive
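The serverless API work described above (Lambda behind API Gateway with proxy integration) can be sketched as a single handler. The `/health` route and response shape here are hypothetical; a real service would add auth, validation, and structured error handling:

```python
import json

def lambda_handler(event, context):
    """Minimal API Gateway (Lambda proxy integration) handler sketch."""
    method = event.get("httpMethod", "GET")
    path = event.get("path", "/")

    if method == "GET" and path == "/health":
        body = {"status": "ok"}
    else:
        # Proxy integration expects statusCode + string body on every path.
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# Local smoke test with a synthetic proxy event:
resp = lambda_handler({"httpMethod": "GET", "path": "/health"}, None)
print(resp["statusCode"])  # 200
```

Because the event is a plain dict, handlers like this are easy to unit-test locally before wiring them into a CodePipeline/CodeBuild deployment.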

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 14 Lacs

Coimbatore

Work from Office

Hiring a Full Stack Engineer (React, React Native, Python, AWS) with 4-6 yrs experience for an on-site role at Sense7AI. Work with offshore clients on flexible IST/EST hours. Strong API and AWS Serverless skills needed. Immediate joiners preferred. hr@sense7ai.com
Benefits: Health insurance, flexible working, cafeteria, work from home

Posted 1 month ago

Apply

10.0 - 15.0 years

17 - 30 Lacs

Pune, Chennai

Hybrid

We are seeking a DevOps Engineer to join our team and lead the migration of 800 SQL servers from Windows to Linux. The ideal candidate will have extensive experience with Amazon Linux and a strong background in infrastructure setup, database migration, and cluster formation.

Role & Responsibilities

Infrastructure Setup
- Install and configure Linux OS on new servers, with a focus on Amazon Linux
- Set up and manage server clusters for high availability and load balancing
- Perform domain-join activities to integrate Linux servers into the existing domain

Database Migration
- Execute database backup and restore processes
- Conduct basic validation to ensure data synchronization and integrity
- Troubleshoot and resolve any issues that arise during the migration

Automation and Scripting
- Develop and maintain automation scripts for deployment, monitoring, and maintenance tasks
- Implement CI/CD pipelines to streamline the migration and ensure continuous integration and delivery

Collaboration and Support
- Work closely with cross-functional teams, including database administrators, system administrators, and network engineers
- Provide technical support and guidance to team members throughout the migration
- Document processes, procedures, and best practices for future reference
- Lead the organization's platform security efforts by collaborating with the core engineering team
- Develop policies, standards, and guidelines for IaC and CI/CD that teams can follow
- Implement application solutions in the cloud and participate in technical research and development to enable continuing innovation in the DevOps space
- Implement scalable, resilient, and secure solutions in the public cloud, especially AWS

Preferred Candidate Profile
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Proven experience as a DevOps Engineer, with a focus on Linux environments
- Strong expertise in Amazon Linux and related tools and technologies
- Proficiency in scripting languages such as Bash, Python, or Perl
- Experience with SQL Server management and migration
- Familiarity with clustering technologies and high-availability setups
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration abilities
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
- Knowledge of containerization technologies like Docker and Kubernetes
- Understanding of network protocols and security best practices

Mandatory Skills
- DevOps/CI/CD pipeline knowledge
- AWS ECS, AWS S3, AWS Lambda
- Kubernetes
- Jenkins

Optional Skills
- AWS, Ansible
- Experience building AWS platforms; extensive AWS experience
- Development and testing skills
- Skilled in deployment and network operations
- Experience working in a large agile team
- Strong experience with infrastructure as code
- Extensive CI/CD experience
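The "basic validation to ensure data synchronization" step in the migration posting above often reduces to comparing row counts (and, in production, checksums) between source and target. A sketch, using sqlite3 purely as a stand-in for SQL Server and a hypothetical `orders` table:

```python
import sqlite3

def validate_migration(src_conn, dst_conn, table):
    """Compare row counts between source and target after backup/restore.

    A production check would also compare per-chunk checksums and key ranges.
    """
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    dst_count = dst_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"table": table, "source": src_count, "target": dst_count,
            "in_sync": src_count == dst_count}

# Demo with two in-memory databases standing in for the old and new servers.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
src.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(5)])
dst.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(5)])
print(validate_migration(src, dst, "orders")["in_sync"])  # True
```

The same shape scripts well in Bash or Python across 800 servers: loop over an inventory, run the check per table, and report drift before signing off a cutover.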

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Need someone with strong Python and AWS automation experience, ideally with a migration background or DevOps orchestration experience.

Role & Responsibilities
- Engage with customers' application and infrastructure teams to understand existing runbooks and migration steps
- Design, develop, test, and deploy automation scripts for each migration workflow using Python and Terraform
- Implement AWS Lambda-based orchestration and integrate with Step Functions, EventBridge, and other AWS-native tools
- Translate manual activities into reusable automation modules, progressively replacing stepwise manual effort with Lambda-driven automation
- Collaborate during cutover planning to identify automatable activities and convert them into secure, scalable workflows
- Handle diverse workloads spanning geographies, with an emphasis on reliability and compliance
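Translating manual runbook steps into reusable automation modules, as the posting above describes, can be sketched as a step registry with ordered execution. The step names (`snapshot`, `cutover`) and context keys are hypothetical; in AWS each registered step would become a Lambda and the ordering would be expressed as a Step Functions state machine:

```python
from typing import Callable, Dict, List

STEPS: Dict[str, Callable[[dict], dict]] = {}

def step(name: str):
    """Register a migration step so workflows can be assembled declaratively."""
    def register(fn):
        STEPS[name] = fn
        return fn
    return register

@step("snapshot")
def snapshot(ctx):
    ctx["snapshot_id"] = f"snap-{ctx['server']}"  # placeholder for a real snapshot call
    return ctx

@step("cutover")
def cutover(ctx):
    ctx["cutover_done"] = True
    return ctx

def run_workflow(order: List[str], ctx: dict) -> dict:
    """Execute registered steps in order, threading context between them --
    the same shape a Step Functions state machine expresses in ASL."""
    for name in order:
        ctx = STEPS[name](ctx)
    return ctx

result = run_workflow(["snapshot", "cutover"], {"server": "db-01"})
print(result["snapshot_id"])  # snap-db-01
```

Keeping each step a pure context-in/context-out function is what makes the later move to Lambda-per-step orchestration mechanical.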

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Gurugram

Work from Office

Role & Responsibilities
- Design, build, and maintain scalable, efficient data pipelines that move data between cloud-native databases (e.g., Snowflake) and SaaS providers using AWS Glue and Python
- Implement and manage ETL/ELT processes to ensure seamless data integration and transformation
- Ensure information security and compliance with data governance standards
- Maintain and enhance data environments, including data lakes, warehouses, and distributed processing systems
- Use version control systems (e.g., GitHub) to manage code and collaborate effectively with the team

Primary Skills
- Enhancements, new development, defect resolution, and production support of ETL development using AWS-native services
- Integration of data sets using AWS services such as Glue and Lambda functions
- Use of AWS SNS to send emails and alerts
- Authoring ETL processes using Python and PySpark
- ETL process monitoring using CloudWatch events
- Connecting to data sources like S3 and validating data using Athena
- Experience in CI/CD using GitHub Actions
- Proficiency in Agile methodology
- Extensive working experience with advanced SQL and a deep understanding of complex SQL

Secondary Skills
- Experience working with Snowflake and an understanding of Snowflake architecture, including concepts like internal and external tables, stages, and masking policies

Competencies / Experience
- Deep technical skills in AWS Glue (Crawler, Data Catalog): 5 years
- Hands-on experience with Python and PySpark: 3 years
- PL/SQL experience: 3 years
- CloudFormation and Terraform: 2 years
- CI/CD with GitHub Actions: 1 year
- Experience with BI systems (Power BI, Tableau): 1 year
- Good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda: 2 years

Additionally, familiarity with any of the following is highly desirable: Jira, GitHub, Snowflake.
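The extract-transform-validate flow in the posting above (Glue/PySpark reading from S3, validated downstream with Athena) can be sketched in plain Python. The record fields and the rejection rule here are hypothetical; in a real Glue job this loop would be a PySpark DataFrame filter/withColumn chain rather than Python iteration:

```python
def transform(records):
    """Normalize raw records and split valid rows from rejects.

    Mirrors the transform stage of an ETL job: bad rows go to a reject
    sink (e.g., an S3 error prefix) instead of silently disappearing.
    """
    valid, rejected = [], []
    for rec in records:
        if not rec.get("id"):
            rejected.append(rec)  # rows missing the key go to the reject sink
            continue
        valid.append({"id": rec["id"],
                      "amount": round(float(rec.get("amount", 0)), 2)})
    return valid, rejected

raw = [{"id": "a1", "amount": "19.999"}, {"amount": "5"}]
good, bad = transform(raw)
print(len(good), len(bad))  # 1 1
```

Counting the reject stream per run is also what feeds CloudWatch metrics and SNS alerts when data quality drifts.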

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru, Bellandur

Hybrid

Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 4-6 years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
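The vector-database analytics mentioned in the posting above reduce, at their core, to nearest-neighbour search over embeddings. A brute-force cosine-similarity sketch with toy two-dimensional vectors and hypothetical document keys (a real vector database replaces the linear scan with an ANN index):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query, corpus, k=2):
    """Rank stored vectors by similarity to the query -- the operation a
    vector database serves with an index instead of a full scan."""
    scored = [(cosine(query, vec), key) for key, vec in corpus.items()]
    return [key for _, key in sorted(scored, reverse=True)[:k]]

corpus = {"doc-a": [1.0, 0.0], "doc-b": [0.9, 0.1], "doc-c": [0.0, 1.0]}
print(top_k([1.0, 0.0], corpus, k=2))  # ['doc-a', 'doc-b']
```

Graph databases answer the complementary question (traversal over relationships); pairing the two is a common pattern in the retrieval stacks such roles describe.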

Posted 1 month ago

Apply