
39809 AWS Jobs - Page 43

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

About Fusemachines
Fusemachines is a 10+ year-old AI company dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, the company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.

About the Role
This is a remote, full-time contractual position in the Travel & Hospitality industry, responsible for designing, building, testing, optimizing and maintaining the infrastructure and code required for data integration, storage, processing, pipelines and analytics (BI, visualization and advanced analytics) from ingestion to consumption, implementing data flow controls, and ensuring high data quality and accessibility for analytics and business intelligence purposes. The role requires a strong foundation in programming and a keen understanding of how to integrate and manage data effectively across various storage systems and technologies. We're looking for someone who can ramp up quickly, contribute right away, and work independently as well as with junior team members with minimal oversight. We are looking for a skilled Sr. Data Engineer with a strong background in Python, SQL, PySpark, Redshift and AWS cloud-based large-scale data solutions, and a passion for data quality, performance and cost optimization. The ideal candidate will develop in an Agile environment.
This role is perfect for an individual passionate about leveraging data to drive insights, improve decision-making, and support the strategic goals of the organization through innovative data engineering solutions.

Qualification / Skill Set Requirements
- Full-time Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 5+ years of real-world data engineering development experience in AWS (certifications preferred).
- Strong expertise in Python, SQL, PySpark and AWS in an Agile environment, with a proven track record of building and optimizing data pipelines, architectures and datasets, and proven experience in data storage, modelling, management, lakes, warehousing, processing/transformation, integration, cleansing, validation and analytics.
- A senior engineer who can understand requirements and design end-to-end solutions with minimal oversight.
- Strong programming skills in one or more languages such as Python or Scala; proficient in writing efficient, optimized code for data integration, storage, processing and manipulation.
- Strong knowledge of SDLC tools and technologies, including project management software (Jira or similar), source code management (GitHub or similar), CI/CD systems (GitHub Actions, AWS CodeBuild or similar) and binary repository managers (AWS CodeArtifact or similar).
- Good understanding of data modelling and database design principles; able to design and implement efficient database schemas that meet the requirements of the data architecture.
- Strong SQL skills and experience working with complex data sets, enterprise data warehouses and advanced SQL queries. Proficient with relational databases (RDS, MySQL, Postgres or similar) and NoSQL databases (Cassandra, MongoDB, Neo4j, etc.).
- Skilled in data integration from different sources such as APIs, databases, flat files and event streaming.
- Strong experience implementing efficient ELT/ETL pipelines, batch and real-time, in AWS and with open-source solutions; able to develop custom integration solutions as needed, including integration from APIs (PoS integrations a plus), ERPs (Oracle and Allegra a plus), databases, flat files, Apache Parquet and event streams, covering cleansing, transformation and validation of the data.
- Strong experience with scalable, distributed data technologies such as Spark/PySpark, DBT and Kafka for handling large volumes of data.
- Experience with stream-processing systems (Storm, Spark Streaming, etc.) is a plus.
- Strong experience designing and implementing data warehousing solutions in AWS with Redshift, including efficient ELT/ETL processes that extract data from source systems, transform it (DBT) and load it into the warehouse.
- Strong experience in orchestration using Apache Airflow.
- Expert in cloud computing on AWS, including deep knowledge of services such as Lambda, Kinesis, S3, Lake Formation, EC2, EMR, ECS/ECR, IAM and CloudWatch.
- Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure data is accurate, complete and consistent.
- Good understanding of BI solutions, including Looker and LookML (Looker Modelling Language).
- Strong knowledge and hands-on experience of DevOps principles, tools and technologies (GitHub and AWS DevOps), including continuous integration and delivery (CI/CD), infrastructure as code (Terraform), configuration management, automated testing, performance tuning, and cost management and optimization.
- Good problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues.
- Strong leadership skills with a willingness to lead, generate ideas and be assertive.
- Strong project management and organizational skills.
- Excellent communication skills to collaborate with cross-functional teams, including business users, data architects, DevOps/DataOps/MLOps engineers, data analysts, data scientists, developers and operations teams, and to convey complex technical concepts to non-technical stakeholders effectively.
- Ability to document processes, procedures and deployment configurations.

Responsibilities
- Design, implement, deploy, test and maintain highly scalable and efficient data architectures, defining and maintaining standards and best practices for data management independently with minimal guidance.
- Ensure the scalability, reliability, quality and performance of data systems.
- Mentor and guide junior/mid-level data engineers.
- Collaborate with Product, Engineering, Data Scientists and Analysts to understand data requirements and develop data solutions, including reusable components.
- Evaluate and implement new technologies and tools to improve data integration, processing and analysis.
- Design architecture, observability and testing strategies, and build reliable infrastructure and data pipelines.
- Take ownership of the storage layer and data management tasks, including schema design, indexing and performance tuning.
- Swiftly address and resolve complex data engineering issues and incidents, and resolve bottlenecks in SQL queries and database operations.
- Conduct discovery on the existing data infrastructure and proposed architecture.
- Evaluate and implement cutting-edge technologies and methodologies, and continue learning and expanding skills in data engineering and cloud platforms, to improve and modernize existing data systems.
- Evaluate, design and implement data governance solutions (cataloguing, lineage, quality and governance frameworks) suitable for a modern analytics solution, considering industry-standard best practices and patterns.
- Define and document data engineering architectures, processes and data flows.
- Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive).
- Be an active member of our Agile team, participating in all ceremonies and continuous improvement activities.

Fusemachines is an equal opportunity employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.
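The data-quality checks this role emphasizes (accuracy, completeness, consistency) come down to validating each record before the load step. Below is a minimal Python sketch of that validation pattern; the field names (booking_id, amount, currency) are hypothetical stand-ins for a real travel-booking schema, not anything from the posting itself.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Booking:
    booking_id: str
    amount: float
    currency: str

def validate(records: Iterable[dict]) -> tuple[list[Booking], list[dict]]:
    """Split raw rows into clean records and rejects, applying simple
    completeness and consistency checks before loading."""
    clean, rejects = [], []
    for row in records:
        if not row.get("booking_id") or row.get("amount") is None:
            rejects.append(row)          # incomplete: quarantine for review
            continue
        try:
            amount = float(row["amount"])
        except (TypeError, ValueError):
            rejects.append(row)          # inconsistent type
            continue
        if amount < 0:
            rejects.append(row)          # out-of-range value
            continue
        clean.append(Booking(str(row["booking_id"]), amount,
                             str(row.get("currency", "USD"))))
    return clean, rejects

raw = [
    {"booking_id": "B1", "amount": "120.50", "currency": "EUR"},
    {"booking_id": "", "amount": 10},        # missing key -> reject
    {"booking_id": "B2", "amount": "n/a"},   # bad type -> reject
]
clean, rejects = validate(raw)
print(len(clean), len(rejects))  # -> 1 2
```

In a production pipeline the same checks would typically run inside a PySpark transform or as DBT tests, with rejects quarantined rather than silently dropped.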

Posted 1 day ago


2.0 - 5.0 years

4 - 7 Lacs

Hyderabad, Bengaluru

Hybrid

Source: Naukri

Key Skills & Responsibilities
- Hands-on experience with AWS services: S3, Lambda, Glue, API Gateway and SQS.
- Strong data engineering expertise on AWS, with proficiency in Python, PySpark and SQL.
- Experience in batch job scheduling and managing data dependencies across pipelines.
- Familiarity with data processing tools such as Apache Spark and Airflow.
- Ability to automate repetitive tasks and build reusable frameworks for improved efficiency.
- Provide RunOps/DevOps support and manage the ongoing operation and monitoring of data services.
- Ensure high performance, scalability and reliability of data workflows in cloud environments.

Skills: AWS, S3, Glue, Apache Spark, Lambda, Airflow, SQL, API Gateway, PySpark, SQS, Python, DevOps support
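For the S3/Lambda/SQS combination listed above, a common pattern is a Lambda function triggered by an SQS queue. Here is a stripped-down sketch of such a handler: the event shape follows SQS's Records/body structure, and the S3 write a real handler would do (via boto3) is omitted so the example stays self-contained. The payload fields are illustrative.

```python
import json

def handler(event, context=None):
    """Lambda-style entry point: pull message bodies out of an SQS-shaped
    event and return the decoded payloads. A real handler would go on to
    write each payload to S3; that step is left out here."""
    payloads = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])  # SQS delivers the body as a JSON string
        payloads.append(body)
    # batchItemFailures lets SQS retry only the failed messages
    return {"batchItemFailures": [], "processed": len(payloads), "payloads": payloads}

sqs_event = {"Records": [{"messageId": "1", "body": json.dumps({"order_id": 42})}]}
result = handler(sqs_event)
print(result["processed"])  # -> 1
```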

Posted 1 day ago


3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Lead Backend Engineer

About Us
FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to the ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world.

The Opportunity
As a Lead Backend Engineer on our Generative AI team, you will work at the frontier of language model applications, developing novel solutions across the FICO platform, including fraud investigation, decision automation, process flow automation, and optimization. We seek a highly skilled engineer with a strong foundation in digital product development and a zeal for innovation, responsible for deploying product updates, identifying production issues, and implementing integrations. The backend engineer should thrive in agile, fast-paced environments, champion DevOps and CI/CD best practices, and consistently deliver scalable, customer-focused backend solutions. You will have the opportunity to make a meaningful impact on FICO's platform by infusing it with next-generation AI capabilities, working with a team to build solutions and drive innovation forward.

What You'll Contribute
- Design, develop, and maintain high-performance, scalable Python-based backend systems powering ML and Generative AI products.
- Collaborate closely with ML engineers, data scientists, and product managers to build reusable APIs and services that support the full ML lifecycle, from data ingestion to inference and monitoring.
- Take end-to-end ownership of backend services, including design, implementation, testing, deployment, and maintenance.
- Implement product changes across the SDLC: detailed design, unit/integration testing, documentation, deployment, and support.
- Contribute to architecture discussions and enforce coding best practices and design patterns across the engineering team.
- Participate in peer code reviews and PR approvals, and mentor junior developers by removing technical blockers and sharing expertise.
- Work with the QA and DevOps teams to enable CI/CD, build pipelines, and ensure product quality through automated testing and performance monitoring.
- Translate business and product requirements into robust engineering deliverables and detailed technical documentation.
- Build backend infrastructure that supports ML pipelines, model versioning, performance monitoring, and retraining loops.
- Engage in prototyping efforts, collaborating with internal and external stakeholders to design PoVs and pilot solutions.

What We're Seeking
- 8+ years of software development experience, with at least 3 years in a technical or team leadership role.
- Deep expertise in Python, including design and development of reusable, modular API packages for ML and data science use cases.
- Strong understanding of REST and gRPC APIs, including schema design, authentication, and versioning.
- Familiarity with ML workflows, MLOps, and tools such as MLflow, FastAPI, TensorFlow, PyTorch, or similar.
- Strong experience building and maintaining microservices and distributed backend systems in production environments.
- Solid knowledge of cloud-native development and experience with platforms like AWS, GCP, or Azure.
- Familiarity with Kubernetes, Docker, Helm, and deployment strategies for scalable AI systems.
- Proficiency in SQL and NoSQL databases, and experience designing performant database schemas.
- Experience with messaging and streaming platforms like Kafka is a plus.
- Understanding of software engineering best practices, including unit testing, integration testing, TDD, code reviews, and performance tuning.
- Exposure to frontend technologies such as React or Angular is a bonus, though not mandatory.
- Experience integrating with LLM APIs and an understanding of prompt engineering and vector databases.
- Exposure to Java or Spring Boot in hybrid technology environments is a bonus.
- Excellent collaboration and communication skills, with a proven ability to work effectively in cross-functional, globally distributed teams.
- A bachelor's degree in Computer Science, Engineering, or a related discipline, or equivalent hands-on industry experience.

Our Offer to You
- An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others.
- The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences.
- Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so.
- An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.
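The "schema design, authentication, and versioning" requirement is worth making concrete: an ML inference backend typically validates and versions its request schema before any model is touched. A minimal standard-library sketch of that idea follows; the field names (model_id, features) and version strings are illustrative assumptions, not FICO's actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InferenceRequest:
    """Versioned request schema for a hypothetical scoring endpoint."""
    model_id: str
    features: dict
    api_version: str = "v1"

    def __post_init__(self):
        if not self.model_id:
            raise ValueError("model_id is required")
        if self.api_version not in ("v1", "v2"):
            raise ValueError(f"unsupported api_version: {self.api_version}")

def parse_request(payload: dict) -> InferenceRequest:
    # Reject unknown fields early so schema drift surfaces as a 4xx, not a 500.
    allowed = {"model_id", "features", "api_version"}
    unknown = set(payload) - allowed
    if unknown:
        raise ValueError(f"unknown fields: {sorted(unknown)}")
    return InferenceRequest(**payload)

req = parse_request({"model_id": "fraud-score", "features": {"amount": 120.0}})
print(req.api_version)  # -> v1
```

In practice frameworks like FastAPI with Pydantic models handle this validation declaratively; the sketch just shows what the contract enforces.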

Posted 1 day ago


2.0 - 4.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

About the Role
We are looking for a driven and technically sound Junior DevOps Engineer with a minimum of 2 years of hands-on experience in cloud infrastructure, CI/CD automation, container orchestration, and production support. You will work closely with our development and operations teams to ensure seamless deployment processes, robust monitoring, and high system reliability in a fast-paced, cloud-native environment.

Key Responsibilities
- Design, build, and maintain CI/CD pipelines using Jenkins and GitLab to support fast and reliable software delivery.
- Collaborate with developers to streamline build, test, and deployment workflows across environments.
- Deploy, manage, and monitor containerized applications using Docker and Kubernetes.
- Implement robust monitoring and alerting systems with Prometheus, Grafana, and Loki to ensure observability and proactive incident detection.
- Troubleshoot and resolve production issues in Kubernetes-based environments, ensuring minimal downtime and service continuity.
- Automate infrastructure provisioning and configuration using Terraform or equivalent Infrastructure as Code tools.
- Manage and support AWS infrastructure components including EC2, VPC, S3, RDS, CodeArtifact, ECR, and related services.
- Collaborate with engineering teams to apply security best practices, enforce scalability standards, and ensure high availability.
- Use Git and GitLab for version control, code management, and CI/CD integration.
- Administer and optimize PostgreSQL (RDS) instances for application-level performance and reliability.
- Assist in the deployment and environment configuration of Spring Boot (Java) applications.

Required Qualifications
- Minimum 2 years of experience in a DevOps, SRE, or systems engineering role.
- Hands-on expertise with Jenkins, Docker, Kubernetes, and cloud platforms (preferably AWS).
- Proficiency in monitoring and logging with Prometheus, Grafana, and Loki.
- Good knowledge of Linux-based environments for infrastructure operations and debugging.
- Experience with Git, GitLab, and version control best practices.
- Solid understanding of AWS core services: EC2, VPC, S3, RDS, IAM, ECR, CDN, SES, etc.
- Experience working with PostgreSQL and other relational databases.
- Basic scripting skills in Python, Bash, or equivalent.
- Familiarity with Terraform or similar IaC tools (preferred).
- Understanding of Java Spring Boot applications and deployment lifecycles.
- Strong troubleshooting skills and the ability to work collaboratively in cross-functional teams.

Preferred Qualifications
- Practical experience with Infrastructure as Code (IaC) using Terraform or similar tools.
- Exposure to Agile methodologies and DevSecOps practices.
- Familiarity with additional scripting or automation tools.
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Why Join Us
- Work with modern DevOps stacks and contribute to scalable, secure, and automated infrastructure.
- Collaborate with an experienced, passionate engineering team.
- Be part of a learning-driven culture that values ownership, innovation, and technical excellence.
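Monitoring with Prometheus, as the role requires, ultimately means scraping metrics in the text exposition format and evaluating alert conditions over them. As a rough illustration of what an alert rule computes (in practice PromQL does this server-side, not hand-written code), here is a toy error-ratio check over a single scrape; the metric names and the 1% threshold are made-up examples.

```python
def parse_metrics(text):
    """Parse a tiny subset of the Prometheus text exposition format:
    'name{labels} value' lines, skipping comments and blanks.
    (Real label values containing spaces would need a proper parser.)"""
    samples = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name_part, value = line.rsplit(" ", 1)
        samples[name_part] = float(value)
    return samples

def error_ratio(samples):
    # Treat any series labelled with a 5xx status as an error.
    errors = sum(v for k, v in samples.items() if 'status="5' in k)
    total = sum(samples.values())
    return errors / total if total else 0.0

scrape = """
# HELP http_requests_total Total HTTP requests.
http_requests_total{status="200"} 950
http_requests_total{status="500"} 50
"""
ratio = error_ratio(parse_metrics(scrape))
print(ratio >= 0.01)  # hypothetical alert condition fires -> True
```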

Posted 1 day ago


8.0 - 10.0 years

20 - 25 Lacs

Noida

Work from Office

Source: Naukri

Key Responsibilities

Hands-on Development
- Develop and implement machine learning models and algorithms, including supervised, unsupervised, deep learning, and reinforcement learning techniques.
- Implement Generative AI solutions using technologies like RAG (Retrieval-Augmented Generation), vector DBs, frameworks such as LangChain and Hugging Face, and Agentic AI.
- Utilize popular AI/ML frameworks and libraries such as TensorFlow, PyTorch, and scikit-learn.
- Design and deploy NLP models and techniques, including text classification, RNNs, CNNs, and Transformer-based models like BERT.
- Ensure robust end-to-end AI/ML solutions, from data preprocessing and feature engineering to model deployment and monitoring.

Technical Proficiency
- Demonstrate strong programming skills in languages commonly used for data science and ML, particularly Python.
- Leverage cloud platforms and services for AI/ML, especially AWS, with knowledge of AWS SageMaker, Lambda, DynamoDB, S3, and other AWS resources.

Mentorship
- Mentor and coach a team of data scientists and machine learning engineers, fostering skill development and professional growth.
- Provide technical guidance and support, helping team members overcome challenges and achieve project goals.
- Set technical direction and strategy for AI/ML projects, ensuring alignment with business goals and objectives.
- Facilitate knowledge sharing and collaboration within the team, promoting best practices and continuous learning.

Strategic Advisory
- Collaborate with cross-functional teams to integrate AI/ML solutions into business processes and products.
- Provide strategic insights and recommendations to support decision-making processes.
- Communicate effectively with stakeholders at various levels, including technical and non-technical audiences.

Qualifications
- Bachelor's degree in a relevant field (e.g., Computer Science) or an equivalent combination of education and experience.
- Typically 8-10 years of relevant work experience in AI/ML/GenAI and 15+ years of overall work experience, with a proven ability to manage projects and activities.
- Extensive experience with generative AI technologies, including RAG, vector DBs, frameworks such as LangChain and Hugging Face, and Agentic AI.
- Proficiency in machine learning algorithms and techniques, including supervised and unsupervised learning, deep learning, and reinforcement learning.
- Extensive experience with AI/ML frameworks and libraries such as TensorFlow, PyTorch, and scikit-learn.
- Strong knowledge of natural language processing (NLP) techniques and models, including Transformer-based models like BERT.
- Proficient programming skills in Python and experience with cloud platforms like AWS.
- Experience with AWS cloud resources, including AWS SageMaker, Lambda, DynamoDB, S3, etc., is a plus.
- Proven experience leading a team of data scientists or machine learning engineers on complex projects.
- Strong project management skills, with the ability to prioritize tasks, allocate resources, and meet deadlines.
- Excellent communication skills and the ability to convey complex technical concepts to diverse audiences.

Preferred Qualifications
- Experience in setting technical direction and strategy for AI/ML projects.
- Experience in the insurance domain.
- Ability to mentor and coach junior team members, fostering growth and development.
- Proven track record of successfully managing AI/ML projects from conception to deployment.
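The RAG and vector-DB experience this role asks for centers on one operation: embedding a query and retrieving the most similar documents before generation. A toy version of that retrieval step with hand-written 3-dimensional embeddings follows; a real system would use model-generated embeddings and a vector database, and the document texts here are invented.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, k=1):
    """Rank documents by similarity to the query embedding and return the
    top-k: the retrieval step a vector DB performs at scale with ANN indexes."""
    scored = sorted(corpus, key=lambda d: cosine(query_vec, d["embedding"]),
                    reverse=True)
    return scored[:k]

corpus = [
    {"text": "claims process for auto policies", "embedding": [0.9, 0.1, 0.0]},
    {"text": "quarterly earnings summary",       "embedding": [0.0, 0.2, 0.9]},
]
top = retrieve([1.0, 0.0, 0.0], corpus, k=1)
print(top[0]["text"])  # -> claims process for auto policies
```

The retrieved snippets would then be stuffed into the LLM prompt, which is the "augmented generation" half of RAG.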

Posted 1 day ago


0.0 - 1.0 years

20 - 25 Lacs

Chennai

Work from Office

Source: Naukri

The Quality Services (QS) organization provides testing support for Devices, Retail and AWS products; its primary objective is to provide manual testing support. An Associate, Quality Services performs manual test execution of documented task instructions, producing accurate test results that meet daily targets while adhering to defined processes.

Responsibilities
- Execute test cases prepared for testing software builds on the Kindle platform and Kindle software products.
- Perform test case execution and report bugs accurately.
- Understand testing procedures and guidelines for new builds/releases.
- Perform regression and repetitive testing exercises to qualify builds without compromising on quality.
- Use software tools for data capture on a daily basis.
- Be comfortable capturing results, communicating and escalating failures, and providing individual status reports.

Qualifications
- Knowledge of QA methodology and tools.
- Graduate, preferably in a quantitative field of study, with 0-1 years of relevant experience.
- Familiarity with computers and software; experience using gadgets or devices.
- Good communication skills; detail-oriented and a team player.
- Capability to follow defined processes and adhere to policies.
- Understanding of software testing and the ability to complete assigned tasks accurately and promptly.

Posted 1 day ago


0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

About Us
At Vaidrix Technologies Pvt. Ltd., we're passionate about crafting digital experiences that drive innovation, efficiency, and growth. We partner with startups and enterprises to build smart, scalable, and custom-tailored digital solutions. Our expertise spans custom software development, web and mobile applications, ReactJS/React Native development, SaaS products, and e-commerce solutions. Our team is dedicated to quality, transparency, and building long-term partnerships. We're a group of creative and technology-focused professionals committed to delivering transformative results. Join us, and let's build the future together.

Role Description
We are seeking a motivated and skilled .NET Core and Angular Developer to join our dynamic team in Ahmedabad. In this role, you will be instrumental in designing, developing, and maintaining robust and scalable web APIs and front-end applications. You will collaborate closely with our cross-functional teams to deliver high-quality software solutions that meet our clients' needs. This is an excellent opportunity for a developer who is passionate about building cutting-edge applications and wants to grow their career in a supportive and innovative environment.

What You'll Do
- Design, develop, and maintain efficient, reusable, and reliable code using .NET Core for our Web APIs.
- Develop user-facing features using Angular, ensuring a seamless and responsive user experience.
- Collaborate with front-end and back-end developers to integrate user-facing elements with server-side logic.
- Participate in the entire application lifecycle, from concept to deployment.
- Write clean, scalable code and adhere to best practices in software development.
- Engage in code reviews to maintain high standards of code quality and provide constructive feedback to peers.
- Troubleshoot, debug, and upgrade existing software.

Qualifications
- Solid understanding of Object-Oriented Programming (OOP) principles.
- Proven experience in software development with a strong focus on .NET Core and building RESTful Web APIs.
- Proficiency in front-end development with Angular.
- Experience with ASP.NET MVC.
- Strong problem-solving and analytical skills.
- A collaborative spirit and the ability to work effectively in a team environment.
- A degree in Computer Science, Information Technology, or a related field.
- Familiarity with web technologies such as HTML5, CSS3, and JavaScript/TypeScript.

Nice to Have
- Exposure to cloud platforms like Azure or AWS.
- Experience with database technologies such as SQL Server or PostgreSQL.
- Familiarity with version control systems, particularly Git.

Why Join Vaidrix Technologies?
- Innovate and Grow: Be part of a team that's at the forefront of technology, working on exciting and challenging projects.
- Collaborative Culture: We foster a supportive and collaborative environment where your ideas are valued.
- Career Development: We are committed to your professional growth and offer opportunities for learning and advancement.

Posted 1 day ago


5.0 - 8.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Source: Naukri

Roles and Responsibilities
- Design, develop, test, and deploy full-stack applications using Node.js, Express, MongoDB, Redis, React.js, AWS, NestJS, SQL, microservices, REST, and JSON.
- Collaborate with cross-functional teams to identify requirements and deliver high-quality solutions.
- Develop scalable and efficient algorithms for data processing and storage management.
- Ensure seamless integration of multiple services through API design and implementation.
- Participate in code reviews to maintain coding standards and best practices.

Desired Candidate Profile
- 5-8 years of experience as a Full Stack Software Engineer with expertise in at least two core technologies (Node.js & React.js).
- Bachelor's degree in any specialization (B.Tech/B.E.).
- Strong understanding of the software development life cycle, including design patterns, testing methodologies, version control systems (Git), and continuous integration/continuous deployment (CI/CD) pipelines.
- Proficiency working with relational databases such as MySQL or NoSQL databases like MongoDB.

Posted 1 day ago


5.0 - 7.0 years

15 - 22 Lacs

Gurugram

Remote

Source: Naukri

Role & Responsibilities
- Deliver industry-leading technical solutions for our product.
- As part of the development team, work directly with product managers, engineers, and business stakeholders.
- Collaborate efficiently with clients, Product & Engineering, and stakeholders across multiple time zones, including EST, GMT, and IST.
- Learn and develop alongside our talented team, keeping your skills fresh and staying informed of best practices, technical trends, and developments.

Required Technical Skills
- Core Java experience with advanced working knowledge of J2EE and API-related technologies (Java 8, REST, web services).
- Good understanding of Spring Boot, microservices, and databases.
- Sound knowledge of CI/CD and Docker, and basic knowledge of Kubernetes.
- Adaptability to pick up any ad-hoc technology and continuous curiosity about new technologies on the horizon.
- Extensive experience developing applications using open-source frameworks, IDEs, and Java application servers.
- Excellent debugging and troubleshooting skills.

Good to Have (Technical Skills)
- Experience with a monitoring tool such as Grafana, Lightstep, or Prometheus.
- Experience working with Kafka.
- Experience working with AWS.

Posted 1 day ago


1.0 - 5.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Diverse Lynx is looking for a Snowflake Developer to join our dynamic team and embark on a rewarding career journey. A Developer is responsible for designing, developing, and maintaining software applications and systems, collaborating with a team of software developers, designers, and stakeholders to create software solutions that meet the needs of the business.

Key Responsibilities
- Design, code, test, and debug software applications and systems.
- Collaborate with cross-functional teams to identify and resolve software issues.
- Write clean, efficient, and well-documented code.
- Stay current with emerging technologies and industry trends.
- Participate in code reviews to ensure code quality and adherence to coding standards.
- Participate in the full software development life cycle, from requirement gathering to deployment.
- Provide technical support and troubleshooting for production issues.

Requirements
- Strong programming skills in one or more programming languages, such as Python, Java, C++, or JavaScript.
- Experience with software development tools, such as version control systems (e.g. Git), integrated development environments (IDEs), and debugging tools.
- Familiarity with software design patterns and best practices.
- Good communication and collaboration skills.

Posted 1 day ago


6.0 - 11.0 years

10 - 14 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Source: Naukri

Perficient India is looking for a Lead Technical Consultant (Java + Kafka) to join our dynamic team and embark on a rewarding career journey.

- Undertake short-term or long-term projects to address a variety of issues and needs.
- Meet with management or appropriate staff to understand their requirements.
- Use interviews, surveys, etc. to collect necessary data.
- Conduct situational and data analysis to identify and understand a problem or issue.
- Present and explain findings to appropriate executives.
- Provide advice or suggestions for improvement according to objectives.
- Formulate plans to implement recommendations and overcome objections.
- Arrange for or provide training to people affected by change.
- Evaluate the situation periodically and make adjustments when needed.
- Replenish knowledge of the industry, products, and field.

Posted 1 day ago


3.0 - 5.0 years

11 - 12 Lacs

Chennai

Work from Office

Source: Naukri

Job Description: Node.js Developer
We are seeking a competent and motivated Node.js developer to join our dynamic software development team. As a Node.js developer, your primary role will be to develop and execute scalable APIs and applications using the Node.js framework, guide and mentor a team, and handle client interactions. You will create high-performance, efficient web applications that can handle large amounts of data and traffic for our clients.

Posted 1 day ago

Apply

5.0 - 6.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Detailed JD (Roles and Responsibilities): The candidate will be responsible for Kafka support and testing, and should be familiar with Kafka commands, with strong experience in troubleshooting. Exposure to KSQL and REST Proxy is preferred.

Posted 1 day ago

Apply

5.0 - 8.0 years

8 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

Position 2 - Part-Time Trainers. We are looking for experienced trainers to conduct online trainings. If you have expertise in conducting online training classes in any of the following technologies, please get in touch:
  • Java Full Stack Development
  • Python Full Stack Development
  • Web Full Stack Development
  • Data Science with Python
  • Machine Learning with Python
  • Blockchain for Beginners
  • DevOps for Beginners
  • Data Structure for Beginners
  • AWS Certifications
You can work for us from home as a freelancer or join us permanently in Hyderabad. There is no work from home for permanent employees.

Posted 1 day ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Function description: The DEV Engineer commits to helping the team deliver a product at the end of each sprint, going beyond his/her own specific knowledge domain in order to collaborate with team members. The focus is on completion of the sprint backlog, containing all elements that the team must deliver, the sequence of which has been determined by the Product Owner based on the added value for the (internal or external) client. The DEV Engineer adheres to the scrum values (focused, committed, open, respectful, and courageous) and is able to collaborate closely with team members. Knowledge sharing, open communication, continuous learning and commitment to deliver added value are key.
Language requirements: English, plus active knowledge of at least one local language and passive knowledge of the other.
Education: Bachelor/Master or equivalent by experience.
Required experience/knowledge: At least 6 years of relevant experience.
Technical experience (mandatory): Java 8, Spring, API, Maven, Oracle, Sonar, Git, React JS, HTML, CSS, JavaScript; knowledge of methods, standards and security procedures as well as development tools, preferably Jenkins, Cucumber, Docker.
Business experience (mandatory): knowledge of agile methodology.
Soft skills: excellent analysis skills, team spirit, efficient communication skills, ambitious towards the targets of his/her squad.

Posted 1 day ago

Apply

5.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: Data Analytics and Business Intelligence Engineer (Databricks Specialist)
Location: Remote
Timings: 6:30 PM IST to 3:30 AM IST
Job Type: Full-Time
Job Summary: We are seeking a highly skilled and analytical Data Analytics and Business Intelligence (BI) Engineer with strong experience in Databricks to join our data team. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines, dashboards, and analytics solutions that drive business insights and decision-making.
Key Responsibilities:
  • Design and implement robust data pipelines using Databricks, Apache Spark, and Delta Lake.
  • Develop and maintain ETL/ELT workflows to ingest, transform, and store large volumes of structured and unstructured data.
  • Build and optimize data models and data marts to support self-service BI and advanced analytics.
  • Create interactive dashboards and reports using tools like Power BI, Tableau, or Looker.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver actionable insights.
  • Ensure data quality, integrity, and governance across all analytics solutions.
  • Monitor and improve the performance of data pipelines and BI tools.
  • Stay current with emerging technologies and best practices in data engineering and analytics.
Required Qualifications:
  • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, analytics, or BI development.
  • Strong proficiency in Databricks, Apache Spark, and SQL.
  • Experience with cloud platforms such as Azure, AWS, or GCP.
  • Proficiency in Python or Scala for data processing.
  • Hands-on experience with data visualization tools (Power BI, Tableau, etc.).
  • Solid understanding of data warehousing concepts, dimensional modeling, and data lakes.
  • Familiarity with CI/CD pipelines, version control (Git), and Agile methodologies.
Preferred Qualifications:
  • Databricks certification (e.g., Databricks Certified Data Engineer Associate/Professional).
  • Experience with MLflow, Delta Live Tables, or Unity Catalog.
  • Knowledge of data governance, security, and compliance standards.
  • Strong communication and stakeholder management skills.
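The dimensional-modeling requirement above can be practiced without a warehouse. The following sketch uses SQLite as a stand-in for a data mart; the schema and values are invented for illustration, and a real Databricks data mart would express the same star-schema join in Spark SQL.

```python
import sqlite3

# In-memory database standing in for a data mart (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES dim_customer(customer_id),
                         amount REAL);
INSERT INTO dim_customer VALUES (1, 'APAC'), (2, 'EMEA');
INSERT INTO fact_sales VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Star-schema query: aggregate the fact table by a dimension attribute.
rows = conn.execute("""
SELECT d.region, SUM(f.amount)
FROM fact_sales f JOIN dim_customer d USING (customer_id)
GROUP BY d.region ORDER BY d.region
""").fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 75.0)]
```

The same fact/dimension separation is what the "data models and data marts" bullet is asking for, just at petabyte scale with Delta Lake tables instead of SQLite.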

Posted 1 day ago

Apply

2.0 - 7.0 years

13 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

The Opportunity: Nutanix's Developer Productivity team is responsible for enabling all our developers to do their best at Nutanix. We manage and maintain build systems in Nutanix's data center and AWS, which provide our entire build and test environments and related services. Our DevOps engineers collaborate closely with product development and work in infrastructure and software domains. We prefer automation to interruptions and infrastructure as code rather than click-ops. With a focus on reliability, scalability, efficiency, and high availability, you will be responsible for delivering and operating the foundational infrastructure platforms and secure environment at Nutanix. We're seeking a DevOps Engineer with strong infrastructure (on-prem, AWS) experience.
About the Team: Nutanix is a global leader in cloud software and a pioneer in hyperconverged infrastructure solutions, making clouds invisible, freeing customers to focus on their business outcomes. Organizations around the world use Nutanix software to leverage a single platform to manage any app at any location for their hybrid multi-cloud environments. As part of the TPM organization, the TPM will partner with our TPMs, leaders, and teams who are tasked with development and testing of our product.
Your Role:
  • Linux systems administration experience.
  • Automation experience (scripting).
  • Substantial experience with containerization, specifically Docker, and knowledge of Kubernetes.
  • Host-level understanding of networking (TCP/IP).
  • Significant on-prem and cloud (AWS/GCP) experience, design patterns, limitations, and cost containment techniques.
  • Prior experience implementing significant technology redesign projects.
  • Ability to implement effective monitoring and alerting.
What You Will Bring:
  • 5+ years of relevant experience.
  • Significant programming experience in one or more languages (e.g., Python).
  • Design, implement, and monitor enterprise-grade secure, fault-tolerant infrastructure.
  • Build infrastructure automation tools and frameworks leveraging Docker, Kubernetes, CloudFormation/Terraform/Ansible/Saltstack, and Helm.
  • Continuously evaluate and identify current system bottlenecks, and implement solutions to improve the scalability of our infrastructure for 10x-100x growth.
  • Work with extended teams to manage systems and infrastructure required for ongoing development and releases.
  • Participate in the on-call rotation for our services.
  • Consult with internal and external stakeholders on the best way to accomplish a given task.
  • Help grow the team by mentoring and interviewing other talented engineers who want to join us.
Work Arrangement
Hybrid: This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 3 days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.

Posted 1 day ago

Apply

3.0 - 8.0 years

2 - 6 Lacs

Noida

Work from Office

Naukri logo

GarudaUAV Soft Solutions Pvt. Ltd. is looking for a Sr. ML/Ops Developer to join our dynamic team and embark on a rewarding career journey. A Developer is responsible for designing, developing, and maintaining software applications and systems, collaborating with a team of software developers, designers, and stakeholders to create software solutions that meet the needs of the business.

Key responsibilities:
  • Design, code, test, and debug software applications and systems
  • Collaborate with cross-functional teams to identify and resolve software issues
  • Write clean, efficient, and well-documented code
  • Stay current with emerging technologies and industry trends
  • Participate in code reviews to ensure code quality and adherence to coding standards
  • Participate in the full software development life cycle, from requirement gathering to deployment
  • Provide technical support and troubleshooting for production issues

Requirements:
  • Strong programming skills in one or more programming languages, such as Python, Java, C++, or JavaScript
  • Experience with software development tools, such as version control systems (e.g. Git), integrated development environments (IDEs), and debugging tools
  • Familiarity with software design patterns and best practices
  • Good communication and collaboration skills

Posted 1 day ago

Apply

2.0 - 3.0 years

2 - 5 Lacs

Mumbai

Work from Office

Naukri logo

VIPSha Inc is looking for TRAINING & DEVELOPMENT to join our dynamic team and embark on a rewarding career journey.
  • Develop and deliver training programs
  • Prepare training materials and lesson plans
  • Evaluate trainee performance and provide feedback
  • Stay updated on industry trends and best practices
  • Assist in the development and implementation of training strategies

Posted 1 day ago

Apply

4.0 - 9.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

This is what you'll do:
- Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day.
- Brainstorm and create new platforms that help in our quest to make data available to cluster users in all shapes and forms, with low latency and horizontal scalability.
- Diagnose and resolve problems across the entire technical stack.
- Design and develop a real-time events pipeline for data ingestion for real-time dashboarding.
- Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake.
- Design and implement new components and various emerging technologies in the Hadoop ecosystem, and ensure successful execution of various projects.
Skills that will help you succeed in this role:
- Strong hands-on experience of 4+ years with Spark, preferably PySpark.
- Excellent programming/debugging skills in Python.
- Experience with a scripting language such as Python or Bash.
- Good experience with databases such as SQL and MongoDB.
- Good to have: experience with AWS and cloud technologies such as S3.

Posted 1 day ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Design and implement global private networking solutions, including AWS Transit Gateway, PrivateLink, Endpoints, site-to-site VPN, Route 53, and Network Load Balancers.
AWS Organizations Management: Assist in implementing and managing AWS Organizations, including restructuring accounts into Dev/Test/Production/Shared Services and migrating existing resources accordingly.
Support and Maintenance: Provide on-call support for P1 incidents, particularly for security remediation. Manage AWS IAM, infrastructure logs, and stack monitoring. Address and manage vulnerability fixes. Monitor infrastructure and integration. Conduct operating system upgrades and patch management. Ensure infrastructure availability and management (including NG, RAM). Oversee business-as-usual (BAU) activities following recent re-architecture implementations.
Networking and Firewall Management: Manage networking configurations and firewall settings. Increase platform automation using Infrastructure as Code (IaC) and Platform as a Service (PaaS) solutions.
Skills and Qualifications:
  • Extensive experience in managing and implementing AWS infrastructure.
  • Strong knowledge of AWS networking components and services.
  • Proficiency in AWS IAM and security management.
  • Experience in infrastructure and integration monitoring.
  • Ability to manage operating system upgrades and patch management.
  • Skilled in vulnerability management and remediation.
  • Strong analytical skills and attention to detail.
  • Excellent communication and teamwork abilities.
  • Ability to provide on-call support and manage critical incidents.

Posted 1 day ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Job Description
Responsibilities:
  • Lots of coding! And problem solving. And more coding!
  • Working in a dynamic atmosphere of an evolving strategy and scope
  • Participating in discussions & brainstorming sessions on architecture, design, and logic/algorithms
  • Working closely with a great, close-knit team
  • Receiving comprehensive, platinum-level healthcare and benefits
Qualifications:
  • Able to express intelligence and creativity by using programming to solve complex real-world problems
  • 5+ years of experience in development on a variety of languages, platforms, frameworks, and technologies
  • Mastery of design patterns and collaborative software development best practices
  • Pride in shipping beautiful, clean, and easily maintainable code
  • Experience with cloud microservice architectures, configuration, operation, and maintenance
  • A self-starter who requires little day-to-day supervision/direction and is comfortable providing mentorship to other engineers
  • A portfolio of projects in which you had a key role; for each, please be prepared to describe your role in the project, the technologies involved, the resources dedicated, and one challenging technical problem encountered and how it was overcome
  • Experience developing software with React.js, React Native, Node.js, PostgreSQL, AWS
  • Experience developing within an agile scrum planning methodology
  • Experience working in healthcare or related industries
  • Experience building products with a data-driven methodology
Additional Skills:

Posted 1 day ago

Apply

3.0 - 8.0 years

8 - 13 Lacs

Mumbai

Work from Office

Naukri logo

Development and deployment of SCADA, DMS, OMS in a virtualized environment. Understanding of power systems, distribution, substations and field devices: transformers, RTU, FRTU, FPI, circuit breakers, feeders, tap changers, etc. Good understanding of IT infrastructure: servers, DNS, firewalls, VMs. Database handling: SQL, custom reports. GIS integration, CIM import and export. Networking, IT infra, cybersecurity. Integration with client systems (e.g. SAP) via REST API/JSON, SOAP; FAT/SAT. Experience with SE EcoStruxure ADMS or GE, OSI, Siemens. Experience: 3-8 years. Qualification: BE/BTech/MTech in Electrical/Electronics Engineering/Power Systems. Work Location

Posted 1 day ago

Apply

3.0 - 5.0 years

18 - 30 Lacs

Noida, Delhi / NCR

Work from Office

Naukri logo

Role & responsibilities
Key Responsibilities:
  • Design, develop, and optimize machine learning models for various business applications.
  • Build and maintain scalable AI feature pipelines for efficient data processing and model training.
  • Develop robust data ingestion, transformation, and storage solutions for big data.
  • Implement and optimize ML workflows, ensuring scalability and efficiency.
  • Monitor and maintain deployed models, ensuring performance and reliability, and retraining when necessary.
Qualifications and Experience:
  • Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
  • 3.5 to 5 years of experience in machine learning, deep learning, or data science roles.
  • Proficiency in Python and ML frameworks/tools such as PyTorch and LangChain.
  • Experience with data processing frameworks like Spark, Dask, Airflow and Dagster.
  • Hands-on experience with cloud platforms (AWS, GCP, Azure) and ML services.
  • Experience with MLOps tools like MLflow and Kubeflow.
  • Familiarity with containerisation and orchestration tools like Docker and Kubernetes.
  • Excellent problem-solving skills and ability to work in a fast-paced environment.
  • Strong communication and collaboration skills.
Preferred candidate profile

Posted 1 day ago

Apply

2.0 - 3.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking a proactive and technically skilled AI/ML Engineer with 2–3 years of experience to join our growing technology team. The ideal candidate will have hands-on expertise in AWS-based machine learning, Agentic AI, and Generative AI tools, especially within the Amazon AI ecosystem. You will play a key role in building intelligent, scalable solutions that address complex business challenges.
Key Responsibilities:
1. AWS-Based Machine Learning
  • Develop, train, and fine-tune ML models on AWS SageMaker, Bedrock, and EC2.
  • Implement serverless ML workflows using Lambda, Step Functions, and EventBridge.
  • Optimize models for cost/performance using AWS Inferentia/Trainium.
2. MLOps & Productionization
  • Build CI/CD pipelines for ML using AWS SageMaker Pipelines, MLflow, or Kubeflow.
  • Containerize models with Docker and deploy via AWS EKS/ECS/Fargate.
  • Monitor models in production using AWS CloudWatch and SageMaker Model Monitor.
3. Agentic AI Development
  • Design autonomous agent systems (e.g., AutoGPT, BabyAGI) for task automation.
  • Integrate multi-agent frameworks (LangChain, AutoGen) with AWS services.
  • Implement RAG (Retrieval-Augmented Generation) for agent knowledge enhancement.
4. Generative AI & LLMs
  • Fine-tune and deploy LLMs (GPT-4, Claude, Llama 2/3) using LoRA/QLoRA.
  • Build Generative AI apps (chatbots, content generators) with LangChain and LlamaIndex.
  • Optimize prompts and evaluate LLM performance using AWS Bedrock/Amazon Titan.
5. Collaboration & Innovation
  • Work with cross-functional teams to translate business needs into AI solutions.
  • Collaborate with DevOps and Cloud Engineering teams to develop scalable, production-ready AI systems.
  • Stay updated with cutting-edge AI research (arXiv, NeurIPS, ICML).
6. Governance & Documentation
  • Implement model governance frameworks to ensure ethical AI/ML deployments.
  • Design reproducible ML pipelines following MLOps best practices (versioning, testing, monitoring).
  • Maintain detailed documentation for models, APIs, and workflows (Markdown, Sphinx, ReadTheDocs).
  • Create runbooks for model deployment, troubleshooting, and scaling.
Technical Skills:
  • Programming: Python (PyTorch, TensorFlow, Hugging Face Transformers).
  • AWS: SageMaker, Lambda, ECS/EKS, Bedrock, S3, IAM.
  • MLOps: MLflow, Kubeflow, Docker, GitHub Actions/GitLab CI.
  • Generative AI: Prompt engineering, LLM fine-tuning, RAG, LangChain.
  • Agentic AI: AutoGPT, BabyAGI, multi-agent orchestration.
  • Data Engineering: SQL, PySpark, AWS Glue/EMR.
Soft Skills:
  • Strong problem-solving and analytical thinking.
  • Ability to explain complex AI concepts to non-technical stakeholders.
What We're Looking For:
  • Bachelor's/Master's in CS, AI, Data Science, or a related field.
  • 2-3 years of industry experience in AI/ML engineering.
  • Portfolio of deployed ML/AI projects (GitHub, blog, case studies).
  • AWS Certified Machine Learning Specialty certification is good to have.
Why Join Us?
  • Innovative Projects: Work on cutting-edge AI applications that push the boundaries of technology.
  • Collaborative Environment: Join a team of passionate engineers and researchers committed to excellence.
  • Career Growth: Opportunities for professional development and advancement in the rapidly evolving field of AI.
Equal opportunity employer
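The RAG pattern named in the listing above can be illustrated in miniature: retrieve the most relevant document, then assemble it into a prompt for a model. This is a toy, LLM-free sketch; the corpus, scoring function, and prompt template are all invented for illustration, and real systems use embedding search and a vector store rather than keyword overlap.

```python
# Toy Retrieval-Augmented Generation: keyword-overlap retrieval + prompt assembly.
# Shows only the data flow; no LLM is called.

CORPUS = [
    "Refunds are processed within 5 business days.",
    "Standard shipping takes 3-7 days.",
]

def retrieve(question: str) -> str:
    """Pick the document sharing the most words with the question (toy scoring)."""
    q_words = set(question.lower().split())
    return max(CORPUS, key=lambda doc: len(q_words & set(doc.lower().split())))

def build_prompt(question: str) -> str:
    """Augment the question with retrieved context before sending it to an LLM."""
    context = retrieve(question)
    return f"Context: {context}\nQuestion: {question}\nAnswer:"

prompt = build_prompt("How long do refunds take in business days?")
print(prompt)
```

The "R" step grounds the model's answer in retrieved facts, which is the knowledge-enhancement role RAG plays in the agent systems the listing describes.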

Posted 1 day ago

Apply

Exploring AWS Jobs in India

With the increasing demand for cloud services and infrastructure, the job market for AWS professionals in India is booming. Companies of all sizes are looking to leverage AWS services for their businesses, leading to a high demand for skilled professionals in this domain.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for AWS professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around ₹6-8 lakhs per annum, while experienced professionals can earn upwards of ₹15 lakhs per annum.

Career Path

In the AWS job market in India, a typical career path may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and Architect. With experience and certifications, professionals can progress to higher roles with more responsibilities and higher pay scales.

Related Skills

In addition to AWS expertise, professionals in this field are often expected to have skills in areas such as:
  • DevOps
  • Linux/Unix systems administration
  • Scripting languages (Python, Shell)
  • Networking concepts
  • Security best practices
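Two of these skills, Python scripting and security best practices, often meet in tasks like generating a least-privilege IAM policy. The helper below is a hedged sketch: the function name, bucket name, and action list are illustrative, and real policies should be validated against the IAM policy grammar before use.

```python
import json

def s3_read_policy(bucket: str) -> str:
    """Build a least-privilege IAM policy granting read-only access to one bucket."""
    policy = {
        "Version": "2012-10-17",  # current IAM policy language version
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",    # the bucket itself (for ListBucket)
                f"arn:aws:s3:::{bucket}/*",  # objects within it (for GetObject)
            ],
        }],
    }
    return json.dumps(policy, indent=2)

print(s3_read_policy("example-reports"))
```

Scoping `Resource` to one bucket and `Action` to read-only operations, rather than using wildcards, is exactly the least-privilege habit interviewers look for under "security best practices".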

Interview Questions

  • What is AWS and what are its key components? (basic)
  • Explain the difference between EC2 and S3 in AWS. (basic)
  • What is the importance of VPC in AWS? (basic)
  • What is IAM in AWS and why is it important? (medium)
  • How do you monitor AWS resources and applications? (medium)
  • Describe the different types of EC2 instances available in AWS. (medium)
  • What is CloudFormation and how is it used in AWS? (medium)
  • Explain the concept of auto-scaling in AWS. (medium)
  • What is the difference between horizontal and vertical scaling? (medium)
  • How do you secure data at rest and data in transit in AWS? (advanced)
  • Explain the concept of Elastic Load Balancing in AWS. (advanced)
  • How does AWS Lambda work and when would you use it? (advanced)
  • Describe the different storage options available in AWS. (advanced)
  • How do you troubleshoot performance issues in AWS? (advanced)
  • What is the AWS Shared Responsibility Model and how does it apply to security? (advanced)
  • Explain the concept of AWS CloudTrail and its importance. (advanced)
  • How do you optimize costs in AWS? (advanced)
  • What is the difference between RDS and DynamoDB in AWS? (advanced)
  • How do you handle disaster recovery in AWS? (advanced)
  • Explain the concept of serverless computing in AWS. (advanced)
  • How do you manage permissions in AWS? (advanced)
  • Describe the different types of storage classes in AWS S3. (advanced)
  • What is the difference between Amazon RDS and Amazon Redshift? (advanced)
  • How do you deploy applications in AWS? (advanced)
  • Explain the concept of AWS Elastic Beanstalk. (advanced)
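Several of these questions (CloudFormation, IAM, S3 storage classes) are easier to answer with a concrete artifact in mind. Below is a minimal, illustrative CloudFormation sketch; the parameter and resource names are invented for practice purposes, and it shows the declarative infrastructure-as-code style the CloudFormation question probes for.

```yaml
# Minimal CloudFormation template: one versioned S3 bucket (illustrative only).
AWSTemplateFormatVersion: "2010-09-09"
Description: Example stack for interview practice

Parameters:
  BucketName:
    Type: String
    Description: Globally unique name for the bucket

Resources:
  PracticeBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref BucketName
      VersioningConfiguration:
        Status: Enabled

Outputs:
  BucketArn:
    Value: !GetAtt PracticeBucket.Arn
```

Being able to explain how `!Ref` and `!GetAtt` wire parameters, resources, and outputs together is usually enough to answer the CloudFormation question convincingly.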

Closing Remark

As you explore AWS job opportunities in India, remember to showcase your skills and experience confidently during interviews. Prepare thoroughly, stay updated with the latest technologies, and demonstrate your passion for cloud computing to land your dream job in the AWS domain. Good luck! 🚀


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies