
519 Lambda Jobs - Page 12

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 6.0 years

2 - 7 Lacs

Salem

Work from Office

Job Title: Senior Microservices Engineer / Microservices Developer

Job Overview: We are looking for a seasoned Microservices Engineer to design, build, and maintain scalable, secure, and high-performance microservice architectures. You'll work closely with product owners, solution architects, DevOps, and QA teams to deliver cloud-native, containerized applications.

Responsibilities: Design, develop, and deploy microservices-based applications using established best practices and patterns. Write clean, testable, and efficient REST API services. Collaborate across teams to define requirements, participate in architecture discussions, and conduct code reviews. Implement CI/CD pipelines and DevOps best practices (e.g., Jenkins, Terraform, Docker, Kubernetes). Monitor, troubleshoot, and optimize microservices performance in production environments. Document service designs, API contracts, and deployment processes. Mentor and guide junior developers in microservices design, testing, and deployment.

Required Qualifications & Skills: Bachelor's degree in CS or a related field (or equivalent experience). 3+ years in software development focused on microservices architecture. Strong programming proficiency in Python, Django, AWS API Gateway, and Lambda. Solid experience with RESTful API design, event-driven patterns, and message brokers (e.g., Kafka, RabbitMQ). Proficiency with Docker and orchestration tools (Kubernetes, ECS/EKS). Familiarity with the AWS cloud platform. Understanding of SQL and NoSQL databases, caching layers, and data management in distributed systems. Solid grasp of distributed systems concepts: service discovery, resilience patterns, circuit breakers, eventual consistency. Excellent problem-solving skills and communication abilities; experience with agile methodologies.

Company: Mukesh Buildtech is an innovative stealth startup focused on a new marketplace. We are building a cutting-edge platform that leverages advanced technologies to provide unparalleled user experiences. Join our dynamic team and be part of our exciting journey from the ground up. Mukesh Buildtech Private Limited is backed by the strategic guidance of Mukesh & Associates (www.mukeshassociates.com). If you're interested, please share your CV with us at sumathi@mukeshassociates.com.
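
The Python, API Gateway, and Lambda stack this role names typically pairs a REST resource with a Lambda handler. A minimal sketch, assuming an API Gateway proxy integration (the orderId resource and response shape are illustrative, not part of the posting):

```python
import json

def lambda_handler(event, context):
    # API Gateway (proxy integration) passes path parameters in the event;
    # "orderId" is a hypothetical resource used for illustration.
    order_id = (event.get("pathParameters") or {}).get("orderId", "unknown")

    # A real service would fetch the order from a database here.
    body = {"orderId": order_id, "status": "PROCESSING"}

    # Proxy integrations expect statusCode/headers/body in the response.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```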

Posted 1 month ago

Apply

10.0 - 17.0 years

9 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Dear Candidate, please find the job description below.

Role: MLOps + ML Engineer

Role Overview: We are looking for a highly experienced MLOps and ML Engineer to lead the design, deployment, and optimization of machine learning systems at scale. This role requires deep expertise in MLOps practices, CI/CD automation, and AWS SageMaker, with a strong foundation in machine learning engineering and cloud-native development.

Key Responsibilities: Architect and implement robust MLOps pipelines for model development, deployment, monitoring, and governance. Lead the operationalization of ML models using AWS SageMaker and other AWS services. Build and maintain CI/CD pipelines for ML workflows using tools like GitHub Actions, Jenkins, or AWS CodePipeline. Automate model lifecycle management, including retraining, versioning, and rollback. Collaborate with data scientists, ML engineers, and DevOps teams to ensure seamless integration and scalability. Monitor production models for performance, drift, and reliability. Establish best practices for reproducibility, security, and compliance in ML systems.

Required Skills: 10+ years of experience in ML Engineering, MLOps, or related fields. Deep hands-on experience with AWS SageMaker, Lambda, S3, CloudWatch, and related AWS services. Strong programming skills in Python and experience with Docker, Kubernetes, and Terraform. Expertise in CI/CD tools and infrastructure as code. Familiarity with model monitoring tools (e.g., Evidently, Prometheus, Grafana). Solid understanding of ML algorithms, data pipelines, and production-grade systems.

Preferred Qualifications: AWS Certified Machine Learning Specialty or DevOps Engineer certification. Experience with feature stores, model registries, and real-time inference systems. Leadership experience in cross-functional ML/AI teams.

Primary Skills: MLOps, ML Engineering, AWS services (SageMaker/S3/CloudWatch)

Regards, Divya Grover, +91 8448403677
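
Operationalizing a model on SageMaker, as this role describes, usually comes down to three control-plane calls. A minimal boto3 sketch (the model name, image URI, artifact path, and IAM role below are placeholders you would substitute):

```python
import boto3

sm = boto3.client("sagemaker")

# All names/ARNs below are placeholders for illustration.
sm.create_model(
    ModelName="churn-model-v3",
    PrimaryContainer={
        "Image": "123456789012.dkr.ecr.ap-south-1.amazonaws.com/churn:latest",
        "ModelDataUrl": "s3://my-ml-bucket/churn/model.tar.gz",
    },
    ExecutionRoleArn="arn:aws:iam::123456789012:role/SageMakerExecRole",
)

sm.create_endpoint_config(
    EndpointConfigName="churn-config-v3",
    ProductionVariants=[{
        "VariantName": "primary",
        "ModelName": "churn-model-v3",
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)

# Creating (or later updating) the endpoint is what enables rollout and
# rollback: versions are swapped by pointing at a different endpoint config.
sm.create_endpoint(EndpointName="churn", EndpointConfigName="churn-config-v3")
```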

Posted 1 month ago

Apply

10.0 - 15.0 years

15 - 25 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Experience: 10+ Years

Role Overview: We are seeking an experienced AWS Data & Analytics Architect with a strong delivery background and excellent communication skills. The ideal candidate will have over 10 years of experience and a proven track record in managing teams and client relationships. You will be responsible for leading data modernization and transformation projects using AWS services.

Key Responsibilities: Lead and architect data modernization/transformation projects using AWS services. Manage and mentor a team of data engineers and analysts. Build and maintain strong client relationships, ensuring successful project delivery. Design and implement scalable data architectures and solutions. Oversee the migration of large datasets to AWS, ensuring data integrity and security. Collaborate with stakeholders to understand business requirements and translate them into technical solutions. Ensure best practices in data management and governance are followed.

Required Skills and Experience: 10+ years of experience in data architecture and analytics. Hands-on experience with AWS services such as Redshift, S3, Glue, Lambda, RDS, and others. Proven experience delivering 1-2 large data migration/modernization projects on AWS. Strong leadership and team management skills. Excellent communication and interpersonal skills. Deep understanding of data modeling, ETL processes, and data warehousing. Experience with data governance and security best practices. Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications: AWS Certified Solutions Architect Professional or AWS Certified Big Data Specialty. Experience with other cloud platforms (e.g., Azure, GCP) is a plus. Familiarity with machine learning and AI technologies.
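
Glue-based modernization work of the kind listed here typically starts from the standard Glue job skeleton. A minimal PySpark sketch (the database, table, filter column, and S3 path are hypothetical):

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table registered in the Glue Data Catalog
# ("legacy_db" / "orders" are placeholder names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="legacy_db", table_name="orders"
)

# Drop obviously bad rows before landing the data in the lake.
clean = orders.filter(lambda row: row["order_id"] is not None)

# Write curated Parquet to S3 for Athena / Redshift Spectrum consumption.
glue_context.write_dynamic_frame.from_options(
    frame=clean,
    connection_type="s3",
    connection_options={"path": "s3://my-datalake/curated/orders/"},
    format="parquet",
)
job.commit()
```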

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune, Chennai, Bengaluru

Hybrid

5-8 years of experience in backend development with a strong focus on Python. Proven experience with AWS serverless technologies, including Lambda, DynamoDB, and other related services. Hands-on experience with Terraform for infrastructure as code.
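
A serverless backend like the one described here often reduces to a Lambda function writing to DynamoDB. A minimal boto3 sketch (the TABLE_NAME environment variable and item shape are assumed conventions; provisioning that wiring is the kind of thing the Terraform experience would cover):

```python
import json
import os

import boto3

# Table name injected by infrastructure-as-code (e.g., Terraform) via env var.
table = boto3.resource("dynamodb").Table(os.environ["TABLE_NAME"])

def lambda_handler(event, context):
    # Assumes an API Gateway proxy event carrying a JSON body.
    item = json.loads(event["body"])
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "body": json.dumps({"saved": item.get("id")}),
    }
```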

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Hybrid

Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you.

Key Responsibilities: Collaborate with agile teams to design and develop cutting-edge data engineering solutions. Build and maintain distributed, low-latency, reliable data pipelines ensuring high availability and timely delivery of data. Design and implement optimized data engineering solutions for Big Data workloads to handle increasing data volumes and complexities. Develop high-performance real-time data ingestion solutions for streaming workloads. Adhere to best practices and established design patterns across all data engineering initiatives. Ensure code quality through elegant design, efficient coding, and performance optimization. Maintain data quality and consistency by implementing monitoring processes and systems. Produce detailed design and test documentation, including Data Flow Diagrams, Technical Design Specs, and Source-to-Target Mapping documents. Perform data analysis to troubleshoot and resolve data-related issues. Automate data engineering pipelines and data validation processes to eliminate manual intervention. Implement data security and privacy measures, including access controls, key management, and encryption techniques. Stay current with technology trends, experiment with new tools, and educate team members. Collaborate with analytics and business teams to improve data models and enhance data accessibility. Communicate effectively with both technical and non-technical stakeholders.

Qualifications: Education: Bachelor's degree in Computer Science, Computer Engineering, or a related field. Experience: Minimum of 5 years in architecting, designing, and building data engineering solutions and data platforms. Proven experience building Lakehouses or Data Warehouses on platforms like Databricks or Snowflake. Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks. Proficiency with data acquisition and transformation tools such as Fivetran and dbt. Strong experience building efficient data engineering pipelines using Python and PySpark. Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink. Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming. Experience with various AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog. Expertise in advanced SQL programming and performance tuning.

Key Skills: Strong problem-solving abilities and perseverance in the face of ambiguity. Excellent emotional intelligence and interpersonal skills. Ability to build and maintain productive relationships with internal and external stakeholders. A self-starter mentality with a focus on growth and quick learning. Passion for operational products and creating outstanding employee experiences.
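
For the real-time ingestion this posting mentions, Spark Structured Streaming reading from Kafka is the canonical pattern. A minimal sketch (the broker address, topic, event schema, and S3 paths are illustrative assumptions):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

# Hypothetical event schema for illustration.
schema = (StructType()
          .add("event_id", StringType())
          .add("amount", DoubleType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                      # placeholder topic
       .load())

# Kafka values arrive as bytes; decode and parse the JSON payload.
parsed = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

# Land micro-batches as Parquet; the checkpoint makes the query restartable.
(parsed.writeStream
       .format("parquet")
       .option("path", "s3a://my-lake/events/")
       .option("checkpointLocation", "s3a://my-lake/checkpoints/events/")
       .start()
       .awaitTermination())
```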

Posted 1 month ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Datamatics is a CMMI Level 5 company. Datamatics, a global Digital Solutions, Technology, and BPM company, provides intelligent solutions for data-driven businesses to increase productivity and enhance the customer experience. With a completely digital approach, Datamatics' portfolio spans Information Technology Services, Business Process Management, Engineering Services, and Big Data & Analytics, all powered by Artificial Intelligence. It has established products in Robotic Process Automation, Intelligent Document Processing, Business Intelligence, and Automatic Fare Collection. Datamatics serves global customers across Banking, Financial Services, Insurance, Healthcare, Manufacturing, International Organizations, and Media & Publishing. The company has a presence across 4 continents with major delivery centers in the USA, India, and the Philippines.

Job Role: PHP Lead | Experience: 7+ years | Location: Bangalore (work from office)

Job Role: We are looking for an experienced and highly motivated Senior Developer to join our dynamic team. The ideal candidate should have 7+ years of hands-on experience in PHP development and expertise in multiple frameworks (Symfony, Laravel, CodeIgniter). The Senior PHP Developer will lead a team of junior developers, guiding them through complex technical challenges. This role requires a deep understanding of PHP. Experience in the banking domain is an added advantage.

Key Responsibilities: Hands-on coding while mentoring and helping other team members. Produce software high-level and detailed designs/architecture. Interact with customers for requirements gathering and clarification. Proven experience in end-to-end projects, from requirements discussion to implementation and support. Strong proficiency in Symfony or Laravel frameworks; working experience with Symfony or multiple frameworks preferred. Expertise in object-oriented programming and MVC architecture. Experience with RESTful APIs, microservices, and third-party integrations. Strong knowledge of front-end technologies like HTML5, CSS3, JavaScript, and jQuery. Familiarity with databases such as MySQL and PostgreSQL, and with ORM techniques. Proficient in version control tools like Git. Experience with cloud platforms and deployment pipelines (AWS, Azure, etc.) is a plus. Knowledge of web security best practices. Excellent problem-solving skills and attention to detail. Strong leadership and team management skills. Ability to work in a fast-paced environment and manage multiple tasks. Good communication skills, both written and verbal. Strong understanding of Agile Scrum methodologies and hands-on experience in Agile projects (e.g., Jira, Confluence). Excellent analytical, problem-solving, and critical-thinking skills. Ability to adapt to changing priorities and work effectively under pressure.

Preferred Skills: Proficiency in automated testing frameworks such as PHPUnit, Behat, or Codeception for improving testing efficiency. Knowledge of Agile methodologies and DevOps practices. Knowledge of profiling and optimizing PHP applications, including caching solutions (Redis, Memcached) and query optimization. Knowledge of AWS, GCP, or Azure services such as Lambda, RDS, and S3.

Posted 1 month ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Mumbai, Goregaon

Work from Office

Role Overview: We are seeking a highly skilled Engineering Manager with deep expertise in the MERN stack (MongoDB, Express, React, Node.js), AWS infrastructure, and DevOps practices. This role requires both hands-on technical leadership and strong people management to lead a team of engineers building scalable, high-performance applications.

Key Responsibilities: Lead, mentor, and manage a team of full-stack developers working primarily with the MERN stack. Own architecture decisions, code quality, and engineering practices across multiple microservices. Collaborate with Product, Design, and QA teams to define and deliver on product roadmaps. Implement CI/CD pipelines, infrastructure as code, and automated testing strategies. Ensure system scalability, security, and performance optimization across services. Drive sprint planning, code reviews, and technical documentation standards. Work closely with DevOps to maintain uptime and operational excellence.

Required Skills: 6+ years of experience with full-stack JavaScript development (MERN stack). 2+ years in a leadership/managerial role. Strong understanding of Node.js backend and API development. Hands-on with React.js, component design, and front-end state management. Proficient in MongoDB and designing scalable NoSQL schemas. Experience with AWS services (EC2, S3, RDS, Lambda, CloudWatch, IAM). Working knowledge of Docker, GitHub Actions, or similar CI/CD tools. Familiarity with monitoring tools like New Relic, Datadog, or Prometheus. Solid experience managing agile workflows and team velocity.

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

5+ years of working experience in Python. 4+ years of hands-on experience with AWS development: PySpark, Lambda, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR. Very strong hands-on knowledge of using Python for integrations between systems through different data formats. Expert in deploying and maintaining applications in AWS, with hands-on experience in Kinesis streams and auto-scaling. Team player with very good written and verbal communication skills. Strong problem-solving and decision-making skills, with the ability to solve complex software system issues. Collaborate with business and other teams to understand business requirements and work on project deliverables. Participate in requirements gathering and understanding. Design solutions based on the available framework and code.
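
Python-based integration between systems, as this listing describes, often means producing to and consuming from SQS. A minimal long-polling consumer sketch with boto3 (the queue URL and message shape are placeholders):

```python
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/orders"  # placeholder

def poll_once():
    # Long polling (WaitTimeSeconds) avoids hammering the API on empty queues.
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,
    )
    for msg in resp.get("Messages", []):
        payload = json.loads(msg["Body"])
        print("processing", payload)  # real handler logic goes here
        # Delete only after successful processing, so failures are retried.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

if __name__ == "__main__":
    while True:
        poll_once()
```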

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Key Responsibilities: Design and implement robust backend systems using C#/.NET. Develop and maintain a microservices architecture. Utilize event-driven architecture to enhance system responsiveness and scalability. Manage and optimize AWS infrastructure, including SQS, SNS, Lambda, and DynamoDB. Work with both relational and non-relational databases to ensure data integrity and performance. Implement and maintain CI/CD pipelines to streamline development and deployment processes. Collaborate with Agile teams to deliver high-quality software solutions. Integrate and manage Apache Kafka for real-time data processing.

Qualifications: At least 7 years of experience across the following: Proven experience as a Principal Engineer or in a similar role with a strong backend focus. Expertise in C#/.NET development. In-depth knowledge of microservices and event-driven architecture. Extensive experience with AWS infrastructure (SQS, SNS, Lambda, DynamoDB). Proficiency in working with both relational and non-relational databases. Strong understanding of CI/CD pipelines and Agile methodology. Hands-on experience with Apache Kafka. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.

Posted 1 month ago

Apply

11.0 - 20.0 years

25 - 40 Lacs

Hyderabad, Chennai, Greater Noida

Hybrid

Primary Skills: Proficiency in AWS services: deep knowledge of EC2, S3, RDS, Lambda, VPC, IAM, EventBridge, AWS B2Bi (EDI Generator), CloudFormation, and more. Cloud architecture design: ability to design scalable, resilient, and cost-optimized architectures. Networking & connectivity: understanding of VPC peering, Direct Connect, Route 53, and load balancing. Security & compliance: implementing IAM policies, encryption, KMS, and compliance frameworks like HIPAA or GDPR. Infrastructure as Code (IaC): using tools like AWS CloudFormation or Terraform to automate deployments. DevOps integration: familiarity with CI/CD pipelines, AWS CodePipeline, and container orchestration (ECS, EKS). Cloud migration: planning and executing lift-and-shift or re-architecting strategies for cloud adoption. Monitoring & optimization: using CloudWatch, X-Ray, and Trusted Advisor for performance tuning and cost control.

Secondary Skills: Programming skills: Python, Java, or Node.js for scripting and automation. Serverless architecture: designing with Lambda, API Gateway, and Step Functions. Cost management: understanding pricing models (On-Demand, Reserved, Spot) and using Cost Explorer. Disaster recovery & high availability: Multi-AZ deployments, backups, and failover strategies. Soft skills: communication, stakeholder management, and documentation. Team collaboration: working with DevOps, security, and development teams to align cloud goals. Certifications: AWS Certified Solutions Architect Associate/Professional, and optionally DevOps Engineer or Security Specialty.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Pune, Chennai, Mumbai (All Areas)

Hybrid

Job Title: Backend Engineer (Python + AI Integration)
Location: India (hybrid; preferably Chennai, Mumbai, or Pune)
Experience Required: Minimum 5 years
Joining Timeline: Immediate to 30 days preferred

Role Overview: We are seeking a seasoned Backend Engineer with strong proficiency in Python and a solid understanding of AI/ML model integration. This role is ideal for someone who thrives at the intersection of backend engineering and intelligent systems: building scalable APIs, handling data workflows, and integrating machine learning models in production environments.

Key Responsibilities:
1. Backend & API Development: Develop RESTful and GraphQL APIs using Django, FastAPI, or Flask. Implement async tasks with tools like Celery, RabbitMQ, and webhooks. Design clean, scalable, and secure architecture following SOLID principles.
2. Database Design & Optimization: Design, maintain, and optimize PostgreSQL databases (JSONB, partitioning, window functions, materialized views, etc.). Write complex SQL queries, design ER diagrams, and manage schema migrations (e.g., Alembic, Flyway). Troubleshoot performance issues using EXPLAIN plans, handle connection pooling, and resolve deadlocks.
3. AI/ML System Integration: Work closely with data scientists to deploy ML models as APIs. Use tools like scikit-learn, PyTorch, Hugging Face, and TensorFlow for integrating AI capabilities. Deploy RAG systems using FAISS, Weaviate, or Qdrant, and integrate with OpenAI APIs, LangChain, and LlamaIndex. Build pipelines with Airflow or Prefect for continuous training and deployment.
4. Infrastructure & DevOps: Containerize applications using Docker and Docker Compose. Implement CI/CD pipelines using GitHub Actions, GitLab CI, or Jenkins. Monitor systems using Prometheus, Grafana, ELK, or Sentry. Familiarity with cloud platforms (AWS, GCP, Azure), working with S3, Lambda, Cloud SQL, SageMaker, etc.

Must-Have Skills: Proficiency in Python and OOP principles. Deep knowledge of PostgreSQL and general RDBMS optimization. Experience with RESTful APIs, async processing, and microservice design. Exposure to AI/ML workflows, including model deployment and monitoring. Knowledge of authentication/authorization standards (OAuth2, JWT).

Good-to-Have (Bonus) Skills: Experience with LLMs, embedding-based search, and RAG systems. Familiarity with Streamlit/Dash for internal tools and dashboards. Understanding of data governance, PII protection, and anonymization. Exposure to event-driven systems (Kafka, AWS SNS/SQS). Open-source contributions or technical blog writing.

Interested candidates can send their resume to subashini.gopalan@kiya.ai
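
Exposing a model as an API, the core of this role, is compact in FastAPI. A minimal sketch (the /score route and the stand-in scoring logic are illustrative assumptions; a real service would call a scikit-learn or PyTorch model loaded at startup):

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ScoreRequest(BaseModel):
    text: str

@app.post("/score")
async def score(req: ScoreRequest):
    # Stand-in for model.predict(...); swap in a real model in practice.
    sentiment = "positive" if "good" in req.text.lower() else "neutral"
    return {"text": req.text, "sentiment": sentiment}

# Run locally with: uvicorn app:app --reload  (assumes this file is app.py)
```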

Posted 1 month ago

Apply

8.0 - 13.0 years

0 - 0 Lacs

Hyderabad

Work from Office

Position: Python Developer + AWS
Location: Hyderabad
Job Type: Contract-to-hire, on the payrolls of Randstad Digital
Experience: 8+ years
Face-to-face interview: 2nd July 2025
Number of positions: 5
Need only immediate joiners; 4 days work from office.

Primary Skills (mandatory): AWS working experience; AWS Glue or equivalent product experience; Lambda functions; Python programming; Kubernetes knowledge.

Secondary Skills (good to have): Data quality; data governance knowledge; migration experience; CI/CD (Jules) working knowledge.

Posted 1 month ago

Apply

1.0 - 2.0 years

1 - 3 Lacs

Navi Mumbai

Work from Office

Job Title: CloudOps Engineer
Department: Information Technology
Reporting Line: Manager, IT

Key Responsibilities: Good understanding of the AWS/Azure cloud platform. Knowledge of cloud services, design, and configuration on enterprise systems. Good understanding of cloud administration using the console and CLI. Understanding the needs of the business for defining cloud system specifications. Understanding architecture requirements and ensuring effective support activities. Familiarity with Windows and Linux platforms. Understanding of EC2, VPC, ELB, S3, CloudWatch, EventBridge, SNS, IAM, CloudFront, and Lambda. Manage three-tier, highly scalable architectures for multiple DR/production client environments, using Load Balancers, DNS, WAF, EC2, VPC, Security Groups, Auto Scaling, and many other AWS services. Manage a bastion host to securely access instances in private and public subnets. Create and manage AMIs/snapshots/volumes; upgrade and downgrade AWS resources (CPU, memory, EBS). Create CloudWatch alarms to monitor resources such as EC2 instances and load balancers, and configure alerts to Slack channels. Manage the AD server to add, remove, and modify user access. Configure lifecycle policies to transition data between storage classes. Set up VPC peering between two VPCs and enable VPC flow logs to monitor network-related issues. Perform monthly patching on stage and production servers. Create or revoke users/roles (AWS accounts) for onboarding and offboarding team members.
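
The CloudWatch alarms this role describes can be scripted rather than clicked through the console. A minimal boto3 sketch (the alarm name, instance ID, and SNS topic ARN are placeholders; a Slack alert would hang off the SNS topic):

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm when average CPU on one EC2 instance exceeds 80% for two 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="web-1-high-cpu",  # placeholder name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    # SNS topic that fans out to email/Slack (ARN is a placeholder).
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],
)
```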

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled Senior Data Engineer to join our dynamic team in Bangalore. You will design, develop, and maintain scalable data ingestion frameworks and ELT pipelines using tools such as DBT, Apache Airflow, and Prefect. The ideal candidate will have deep technical expertise in cloud platforms (especially AWS), data architecture, and orchestration tools. You will work with modern cloud data warehouses like Snowflake, Redshift, or Databricks and integrate pipelines with AWS services such as S3, Lambda, Step Functions, and Glue. A strong background in SQL, scripting, and CI/CD practices is essential. Experience with data systems in manufacturing is a plus.
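
Orchestration with Airflow, named in this listing, is organized around DAGs of dependent tasks. A minimal daily ELT sketch (task bodies are placeholders; Airflow 2.x is assumed):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling raw files from S3")  # placeholder for real extract logic

def load():
    print("loading curated data into the warehouse")  # placeholder

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Load runs only after extract succeeds.
    extract_task >> load_task
```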

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

The Customer, Sales & Service Practice | Cloud

Job Title: Amazon Connect + Level 9 (Consultant) + Entity (S&C GN)
Management Level: Level 9 (Consultant)
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad, and Chennai
Must-have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center
Good-to-have skills: AWS Lambda and Lex bots, Amazon Connect

Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change.

Practice: Customer Sales & Service Sales I | Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Consultant | Years of Exp: 5-8 years

Explore an Exciting Career at Accenture: Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build, and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice, a Brief Sketch: The Customer Sales & Service Consulting practice is aligned to the Capability Network practice of Accenture and works with clients across their marketing, sales, and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce, and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales, and customer service strategy, thereby driving cost reduction, revenue enhancement, and customer satisfaction, and positively impacting front-end business metrics. You will work closely with our clients as consulting professionals who design, build, and implement initiatives that can help enhance business performance. As part of these, you will drive the following: Create business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities. Create the cloud transformation approach for contact center transformations. Work with Solution Architects to architect cloud contact center technology on the AWS platform. Enable cloud contact center technology platforms for global clients, specifically on Amazon Connect. Work on innovative assets, proofs of concept, and sales demos for the AWS cloud contact center. Support AWS offering leads in responding to RFIs and RFPs.

Bring your best skills forward to excel at the role: Good understanding of the contact center technology landscape. An understanding of the AWS Cloud platform and services, with solution architect skills. Deep expertise in AWS contact-center-relevant services. Sound experience in developing Amazon Connect flows, AWS Lambda, and Lex bots. Deep functional and technical understanding of APIs and related integration experience. Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow, and bot platforms. Ability to understand customer challenges and requirements, and to address them in a differentiated manner. Ability to help the team implement, sell, and deliver cloud contact center solutions to clients. Excellent communication skills. Ability to develop requirements based on leadership input. Ability to work effectively in a remote, virtual, global environment. Ability to take on new challenges and be a passionate learner.

Your experience counts! Bachelor's degree in a related field or equivalent experience; post-graduation in business management would be an added value. Minimum 4-5 years of experience delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect Contact Center cloud solution. Hands-on experience with the design, development, and deployment of contact center solutions at scale. Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, and Transcribe. Working knowledge of one of the programming/scripting languages such as Node.js, Python, or Java.

What's in it for you: An opportunity to work on transformative projects with key G2000 clients. Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities. Opportunity to thrive in a culture committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives.

About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator. Capability Network is a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Come and be a part of our team.

Qualification & Experience: Minimum 5 years of experience is required. Educational Qualification: Engineering degree or MBA from a Tier 1 or Tier 2 institute.
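
Amazon Connect invokes Lambda from a contact flow and expects a flat key/value map back, which the flow can then reference. A minimal handler sketch (the CRM lookup and the returned attribute names are illustrative assumptions):

```python
def lambda_handler(event, context):
    # Amazon Connect's "Invoke AWS Lambda function" block passes contact
    # details under event["Details"].
    contact = event["Details"]["ContactData"]
    phone = contact.get("CustomerEndpoint", {}).get("Address", "")

    # A real flow would look the caller up in a CRM; this result is a stand-in.
    known = bool(phone)

    # Connect expects a flat dict of string values, readable in the flow as
    # $.External.customerName, $.External.isKnownCaller, etc.
    return {
        "customerName": "Jane Doe" if known else "Guest",
        "isKnownCaller": "true" if known else "false",
    }
```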

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Bengaluru

Work from Office

Deep expertise in a wide range of AWS services, including Compute (EC2, Lambda, ECS, EKS), Storage (S3, EFS, FSx), Databases (RDS, DynamoDB, Aurora), Networking (VPC, Route 53, CloudFront), Security (IAM, KMS, GuardDuty, WAF), and Monitoring.

Required candidate profile: The AWS Senior Architect is a senior-level position focused on designing, implementing, and managing robust cloud solutions on Amazon Web Services (AWS), with security (IAM, KMS, GuardDuty) as a core concern.

Posted 1 month ago

Apply

5.0 - 7.0 years

9 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Hiring Data Engineers with 3+ years of experience in Databricks, PySpark, Delta Lake, and AWS (S3, Glue, Redshift, Lambda, EMR). Must have strong SQL/Python, CI/CD, and data pipeline experience. Only candidates from Tier-1 company backgrounds will be considered.

Posted 1 month ago

Apply

5.0 - 9.0 years

12 - 20 Lacs

Noida

Remote

Job Description:
Position: DevOps Engineer
Shift Timings: 1 PM to 10 PM
Years of Experience: 5+ years
Job Location: Anywhere

Position Summary: We are looking for a talented DevOps Engineer with strong expertise in AWS and Terraform to join our team. In this role, you will design and implement infrastructure as code, automate deployment processes, and enhance the reliability and scalability of our cloud-based applications. You will collaborate with development and operations teams to streamline workflows and ensure optimal system performance. A strong focus on automated testing is essential to ensure the reliability and performance of our solutions.

Key Responsibilities: Continuously assess technology hosted in the public cloud against industry standards and security compliance. Streamline infrastructure platforms toward "100% everything as code" in the public cloud. Reusability at heart: write code once and reuse it as much as possible; this applies to infrastructure as code, CI/CD pipelines, templates, base Docker images, API templates, Helm charts, scripts, and anything else that can be implemented as code. Implement end-to-end automated CI/CD practices for build, scan, packaging, test, and deployment, with the ability to continuously deliver secure solutions at scale. Drive containerization across application workloads to leverage native cloud features and scalability. Maintain 100% compliance on all infrastructure and package vulnerabilities across supported CI/CD base templates. Make security scanning part of CI/CD, with blocking enabled to prevent vulnerable code from being deployed to production. Drive 100% self-service and reusable automation across stakeholders (business, product delivery, etc.) for platform requests, including new infrastructure, access management, and maintenance (patching, upgrades, etc.). Add instrumentation across infrastructure for monitoring and alerting on internal problems before they result in user-visible outages. Build processes and diagnostic tools to troubleshoot, maintain, and optimize infrastructure, and respond to customer and production incidents. Adopt continuous learning of modern data engineering practices. Maintain industry standards through incremental adoption of new technology and best practices. Create and continuously maintain high-quality, up-to-date documentation for platform and DevOps practices to meet audit and compliance standards.

Required Qualifications: 5+ years of strong experience with AWS services and architecture (VPC, networking, IAM, ECS, EKS, EC2, Secrets Manager, API Gateway, Lambda, Route 53, WAF, etc.). 3+ years of strong hands-on experience with Terraform for infrastructure automation. 3+ years of experience with CI/CD tools (e.g., GitHub, Azure DevOps, Docker, Helm). 3+ years of working with scripting languages (e.g., Python, Bash). Prior experience with performance tuning and scalability. Hands-on experience with infrastructure monitoring and alerting tools. Excellent problem-solving skills and ability to work collaboratively.

Interested candidates can share their CV at Swapna.mallipedi@aapc.com

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Chennai

Remote

Location: 100% remote
Employment Type: Full-time
Must have own laptop and internet connection
Work hours: 11 AM to 8 PM IST

Position Summary: We are looking for a highly skilled and self-driven Full Stack Developer with deep expertise in React.js, Node.js, and AWS cloud services. The ideal candidate will play a critical role in designing, developing, and deploying full-stack web applications in a secure and scalable cloud environment.

Key Responsibilities: Design and develop scalable front-end applications using React.js and modern JavaScript/TypeScript frameworks. Build and maintain robust backend services using Node.js, Express, and RESTful APIs. Architect and deploy full-stack solutions on AWS using services such as Lambda, API Gateway, ECS, RDS, S3, CloudFormation, CloudWatch, and DynamoDB. Ensure application performance, security, scalability, and maintainability. Work collaboratively in Agile/Scrum environments and participate in sprint planning, code reviews, and daily standups. Integrate CI/CD pipelines and automate testing and deployment workflows using AWS-native tools or services like Jenkins, CodeBuild, or GitHub Actions. Troubleshoot production issues, optimize system performance, and implement monitoring and alerting solutions. Maintain clean, well-documented, and reusable code and technical documentation.

Required Qualifications: 5+ years of professional experience as a full stack developer. Strong expertise in React.js (Hooks, Context, Redux, etc.). Advanced backend development experience with Node.js and related frameworks. Proven hands-on experience designing and deploying applications on AWS Cloud. Solid understanding of RESTful services, microservices architecture, and cloud-native design. Experience working with relational databases (PostgreSQL, MySQL) and with DynamoDB. Proficient in Git and modern DevOps practices (CI/CD, Infrastructure as Code, etc.). Strong communication skills and ability to collaborate in distributed teams.

Posted 1 month ago

Apply

7.0 - 9.0 years

15 - 22 Lacs

Chennai

Work from Office

Key Responsibilities:
• Application Development: Design, develop, and maintain dealership applications using Java, Spring Boot, and a microservices architecture.
• Cloud Deployment: Build and manage cloud-native applications on AWS, leveraging services such as Lambda, ECS, RDS, S3, DynamoDB, and API Gateway.
• System Architecture: Develop and maintain scalable, high-availability architectures to support large-scale dealership operations.
• Database Management: Work with relational and NoSQL databases like PostgreSQL and DynamoDB for efficient data storage and retrieval.
• Integration & APIs: Develop and manage RESTful APIs and integrate with third-party services, including OEMs and dealership management systems (DMS).
• DevOps & CI/CD: Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline for automated testing and deployments.
• Security & Compliance: Ensure application security, data privacy, and compliance with industry regulations.
• Collaboration & Mentorship: Work closely with product managers, designers, and other engineers. Mentor junior developers to maintain best coding practices.

Technical Skills Required:
• Programming Languages: Java (8+), Spring Boot, Hibernate
• Cloud Platforms: AWS (Lambda, S3, EC2, ECS, RDS, DynamoDB, CloudFormation)
• Microservices & API Development: RESTful APIs, API Gateway
• Database Management: PostgreSQL, DynamoDB
• DevOps & Automation: Docker, Kubernetes, Terraform, CI/CD pipelines
• Testing & Monitoring: JUnit, Mockito, Prometheus, CloudWatch
• Version Control & Collaboration: Git, GitHub, Jira, Confluence

Nice-to-Have Skills:
• Experience with serverless architectures (AWS Lambda, Step Functions)
• Exposure to event-driven architectures (Kafka, SNS, SQS)

Qualifications:
• Bachelor's or master's degree in Computer Science, Software Engineering, or a related field
• 5+ years of hands-on experience in Java-based backend development
• Strong problem-solving skills with a focus on scalability and performance

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

Mandatory key skills: Athena, Step Functions, Spark/PySpark, ETL fundamentals, SQL (basic + advanced), Glue, Python, Lambda, data warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, modern data platform fundamentals, PL/SQL, CloudFront.

We are looking for an experienced AWS Data Engineer to design, build, and manage robust, scalable, and high-performance data pipelines and data platforms on AWS. The ideal candidate will have a strong foundation in ETL fundamentals, data modeling, and modern data architecture, with hands-on expertise across a broad spectrum of AWS services including Athena, Glue, Step Functions, Lambda, S3, and Lake Formation.

Key Responsibilities: Design and implement scalable ETL/ELT pipelines using AWS Glue, Spark (PySpark), and Step Functions. Work with structured and semi-structured data using Athena, S3, and Lake Formation to enable efficient querying and access control. Develop and deploy serverless data processing solutions using AWS Lambda and integrate them into pipeline orchestration. Perform advanced SQL and PL/SQL development for data transformation, analysis, and performance tuning. Build data lakes and data warehouses using S3, Aurora, and Athena. Implement data governance, security, and access control strategies using AWS tools including Lake Formation, CloudFront, EBS/EFS, and IAM. Develop and maintain metadata, lineage, and data cataloging capabilities. Participate in data modeling exercises for both OLTP and OLAP environments. Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights. Monitor, debug, and optimize data pipelines for reliability and performance.

Required Skills & Experience: Strong experience with AWS data services: Glue, Athena, Step Functions, Lambda, Lake Formation, S3, EC2, Aurora, EBS/EFS, CloudFront. Proficient in PySpark, Python, SQL (basic and advanced), and PL/SQL. Solid understanding of ETL/ELT processes and data warehousing concepts. Familiarity with modern data platform fundamentals and distributed data processing. Experience in data modeling (conceptual, logical, physical) for analytical and operational use cases. Experience with orchestration and workflow management tools within AWS. Strong debugging and performance tuning skills across the data stack.
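
Querying S3-backed data through Athena, one of the mandatory skills here, follows a start/poll/fetch pattern in boto3. A minimal sketch (the database name, query, and results bucket are placeholders):

```python
import time

import boto3

athena = boto3.client("athena")

def run_query(sql: str, database: str, output_s3: str) -> str:
    """Start an Athena query and block until it finishes; return the final state."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(1)  # simple polling; Step Functions could orchestrate this instead

# Placeholder database/bucket names for illustration.
print(run_query("SELECT COUNT(*) FROM orders", "analytics_db",
                "s3://my-athena-results/"))
```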

Posted 1 month ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Senior Full-Stack Developer with Node, React, AWS, and Gen AI/LLM experience.
Location: Chennai, Hyderabad, or Bangalore
Experience: 10 years
Work Type: Onsite
Budget: As per market standards

Primary Skills: Node.js: 6+ years of hands-on backend development. JavaScript/HTML/CSS: strong frontend development capabilities. React.js/Vue.js: working knowledge or project experience preferred. AWS serverless architecture: mandatory (Lambda, API Gateway, S3). LLM integration and AI development: experience with OpenAI and Anthropic APIs. Prompt engineering: context management and token optimization. SQL/NoSQL databases: solid experience with relational and non-relational DBs. End-to-end deployment: deploy, debug, and manage full-stack apps. Clean code: writes clean, maintainable, production-ready code.

Secondary Skills: Amazon Bedrock: familiarity is a strong plus. Web servers: experience with Nginx/Apache configuration. RAG patterns, vector DBs, AI agents: bonus experience. Software engineering best practices: strong design and architecture skills. CI/CD and DevOps exposure: beneficial for full pipeline integration.

Expectations: Own frontend and backend development. Collaborate closely with engineering and client teams. Build scalable, secure, and intelligent systems. Influence architecture and tech stack decisions. Stay up to date with AI trends and serverless best practices.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Pune, Chennai, Bengaluru

Hybrid

Role: Gen AI Developer / AI-ML / ML Operations / Data Science
Experience: 4-11 years
Locations: Bangalore/Chennai/Pune/Kolkata
Notice Period: Immediate to 30 days

Mandatory Skills: Gen AI, LLMs, RAG, LangChain, Mistral, Llama, vector DBs, Azure/GCP, Lambda, Python, TensorFlow, PyTorch
Preferred Skills: GPT-4, NumPy, Pandas, Keras, Databricks, Pinecone/Chroma/Weaviate, Scale/Labelbox

Job Description / Roles & Responsibilities: We are looking for a strong Python developer with knowledge of machine learning and deep learning frameworks. You will own the entire prompt lifecycle: prompt design, prompt template creation, and prompt tuning/optimization for various Gen AI base models. Design and develop prompts suiting project needs. Manage stakeholders across business and domains as required by the projects. Evaluate base models and benchmark performance. Implement prompt guardrails to prevent attacks such as prompt injection, jailbreaking, and prompt leaking. Develop, deploy, and maintain auto-prompt solutions. Design and implement minimum design standards for every use case involving prompt engineering. You will also be responsible for training machine learning and deep learning models; writing reusable, testable, and efficient code using Python; designing and implementing low-latency, high-availability, performant applications; implementing security and data protection; integrating data storage solutions and API gateways; and supporting production change deployments.

Interested candidates can share their updated CV with pravallika@wrootsglobal.in
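
The prompt-template and guardrail work this role describes can start as simply as centralizing templates and screening input before it reaches the model. A deliberately naive sketch (the marker list and template are illustrative assumptions; production guardrails typically use classifiers or dedicated moderation services rather than a denylist):

```python
# Crude denylist of phrases often seen in prompt-injection attempts (illustrative).
INJECTION_MARKERS = (
    "ignore previous instructions",
    "reveal your system prompt",
    "disregard the rules",
)

TEMPLATE = """You are a support assistant. Answer only from the context below.

Context:
{context}

Question:
{question}
"""

def build_prompt(context: str, question: str) -> str:
    lowered = question.lower()
    if any(marker in lowered for marker in INJECTION_MARKERS):
        # Block instead of passing suspicious input to the model.
        raise ValueError("possible prompt injection detected")
    return TEMPLATE.format(context=context, question=question)

print(build_prompt("Refunds take 5-7 days.", "How long do refunds take?"))
```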

Posted 1 month ago

Apply

3.0 - 6.0 years

20 - 30 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer II (Python, SQL)
Experience: 3 to 6 years
Location: Bangalore, Karnataka (work from office, 5 days a week)

Role: As a Data Engineer II, you will design, build, and maintain scalable data pipelines. You'll collaborate across data analytics, marketing, data science, and product teams to drive insights and AI/ML integration on robust, efficient data infrastructure.

Key Responsibilities: Design, develop, and maintain end-to-end data pipelines (ETL/ELT). Ingest, clean, transform, and curate data for analytics and ML usage. Work with orchestration tools like Airflow to schedule and manage workflows. Implement data extraction using batch, CDC, and real-time tools (e.g., Debezium, Kafka Connect). Build data models and enable real-time and batch processing using Spark and AWS services. Collaborate with DevOps and architects on system scalability and performance. Optimize Redshift-based data solutions for performance and reliability.

Must-Have Skills & Experience: 3+ years in Data Engineering or Data Science with strong ETL and pipeline experience. Expertise in Python and SQL. Strong experience in data warehousing, data lakes, data modeling, and ingestion. Working knowledge of Airflow or similar orchestration tools. Hands-on experience with data extraction techniques such as CDC and batch, using Debezium, Kafka Connect, or AWS DMS. Experience with AWS services: Glue, Redshift, Lambda, EMR, Athena, MWAA, SQS, etc. Knowledge of Spark or similar distributed systems. Experience with queuing/messaging systems like SQS, Kinesis, or RabbitMQ.

Posted 1 month ago

Apply