Home
Jobs

765 AWS Cloud Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Job Title: Java Full Stack Developer (AWS + Azure)
Experience: Minimum 5+ years of relevant experience
Location: Bengaluru
Work Mode: Work From Office (WFO) / Hybrid (as per project requirements)
Notice Period: Immediate to 1 month preferred (DOJ: 30th July)

Job Description:
Java and Microservices Development: Proficient in Java and J2EE technologies (JDK 8 and JDK 17). In-depth knowledge of microservices architecture and design principles. Extensive experience with Spring frameworks and Spring Boot technologies.
Database and Persistence: Strong expertise in database technologies and stored procedures. Proficient in Hibernate, entity mapping, and JPA repository implementations.
Build and Deployment Tools: Hands-on experience with build tools like Maven/Gradle and version control systems like Git. Familiarity with CI/CD pipelines for software build automation.
Testing: Proficient in testing frameworks such as JUnit and Mockito. Experience in automated testing to ensure the reliability and quality of software solutions.
Web Services: Extensive experience in developing and consuming web services, including REST and SOAP, and proficiency in handling data formats like JSON, WSDL, and XML.
Front-End Development: Hands-on experience in front-end technologies such as Angular or ReactJS is a plus.
Cloud and Related Technologies: Exposure to cloud hosting platforms. Familiarity with related IT domains, including cloud services, and the ability to integrate cloud solutions into the overall architecture.
Cloud Technology: Substantial experience in cloud technologies, with a focus on AWS, Azure, or Pivotal Cloud. Contribute to the strategic use of cloud hosting platforms in the development and deployment of solutions.

Share your updated resume at siddhi.pandey@adecco.com or WhatsApp 6366783349.
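The role centres on developing and consuming REST services with JSON. Purely as an illustration (the role itself is Java/Spring; the endpoint and fields below are hypothetical), consuming such a service looks like this in Python:

```python
import requests

# Hypothetical endpoint; shown only to illustrate REST + JSON consumption.
resp = requests.get("https://api.example.com/orders/42", timeout=10)
resp.raise_for_status()   # fail fast on HTTP errors
order = resp.json()       # parse the JSON body into a dict
print(order.get("status"))
```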

Posted 2 days ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Role & responsibilities: This SRO Engineer is part of the GPU Cloud Business Unit. The team does not distinguish between L0-L3 levels of resources. This engineer will help build clusters from scratch and will support, monitor, and troubleshoot them. SRO Engineers should be able to automate their daily tasks using Ansible playbooks. They will support firmware upgrades, failure analysis, log reading, etc. A ticket is raised for every incoming incident, and SRO Engineers take it up and resolve it. They should be available to respond to any alert/alarm on incidents, and should document their work and perform additional troubleshooting as and when required; the documentation is shared with the next shift's SRO Engineers for reference. They should be able to understand any hardware failure or configuration issues that come up as incidents.

Preferred candidate profile:
• 5+ years of hands-on Linux administration experience.
• Understanding of hardware clusters.
• Experience in any Cloud Service Provider (CSP) environment (AWS/Azure/Oracle/GCP).
• Ansible/Python scripting experience.
• A reliable team player.
• Proven experience with SSH, DNS, DHCP, bare metal, etc.
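Since the listing highlights automating daily tasks with Ansible playbooks, here is a minimal hedged sketch of driving such a playbook from Python; the inventory and playbook names are hypothetical:

```python
import subprocess

# Hypothetical inventory/playbook names; illustrates one automated daily run.
result = subprocess.run(
    ["ansible-playbook", "-i", "inventory.ini", "daily_health_check.yml"],
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    # In the workflow described above, a failure here would raise an incident ticket.
    print(result.stderr)
```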

Posted 2 days ago

Apply

0.0 - 1.0 years

12 - 17 Lacs

Pune

Work from Office

About The Role: Perform a hands-on role while mentoring junior team members as needed. Gather requirements from the business and prioritize them within the sprint cycle. Ensure the quality and timely delivery of projects. Contribute to all stages of the software development lifecycle. Design, implement, and maintain Java-based applications with a strong understanding of project architecture. Analyze user requirements to define business objectives. Envision system features and functionality, defining clear application objectives. Ensure that application designs align with business goals. Propose enhancements to the existing Java infrastructure. Develop technical designs to support application development. Create multimedia applications. Write well-designed, testable code. Prepare and produce releases of software components. Support continuous improvement by exploring alternative solutions and technologies and presenting these findings for architectural review.

Required Technical Skills: Strong fundamentals in object-oriented programming (OOP). Excellent proficiency in Java fundamentals, including multithreading and streams. Solid understanding of data structures and algorithms. Experience with microservices architecture. Well-versed in the latest technology stack for server-side programming. Good knowledge of distributed caching/computing frameworks and tools. Proficient in SQL query writing and optimization.

Preferred Skill Sets: Experience with AWS Lambda (serverless) and Redis. Familiarity with design patterns such as Singleton and Facade. Experience with MongoDB/NoSQL databases. Knowledge of Java Messaging Service (JMS) or similar messaging, such as AWS SQS. Exposure to AWS Cloud. Business knowledge of Loan Management Systems (LMS) is a plus. Willingness to lead a technical team, with team management experience preferred.

Education: Graduates with a B.Tech, M.Tech, or MCA from Tier 1 or Tier 2 colleges. (ref:hirist.tech)
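The listing mentions messaging via JMS or AWS SQS. As a minimal sketch (boto3 is shown for brevity even though the role is Java-centric; the queue URL is a placeholder), producing and consuming an SQS message looks like:

```python
import boto3

# Placeholder queue URL; illustrates the send/receive/delete cycle.
sqs = boto3.client("sqs", region_name="ap-south-1")
queue_url = "https://sqs.ap-south-1.amazonaws.com/123456789012/orders-queue"

# Produce a message
sqs.send_message(QueueUrl=queue_url, MessageBody='{"loanId": 101}')

# Consume with long polling, then acknowledge by deleting
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```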

Posted 2 days ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Key Responsibilities

WorkSpaces Design & Implementation: Design and deploy scalable Amazon WorkSpaces environments across multiple regions (US East, EMEA, SEA). Architect Personal (persistent) and Pooled (non-persistent) WorkSpaces, supporting up to 5,000 users in production environments and 500 users in non-production environments. Serve as a Disaster Recovery (DR) layer for existing Azure Virtual Desktop solutions. Manage user profiles, including standard, power, and GPU-based WorkSpaces, with FSLogix profile management for O365 and non-persistent desktops.

Automation & Image Management: Implement a Bring Your Own License (BYOL) strategy and deploy custom images using Image Builder and Merck eCore base images. Design and enforce naming conventions, tagging strategies, and IAM role assignments for workspace deployments. Automate workspace provisioning processes using the AWS CLI and CloudFormation.

Compliance & Governance: Configure and manage IAM permissions, domain policies, secret management, and eDiscovery settings. Implement Active Directory integration using AWS Managed Microsoft AD or AD Connector. Ensure security compliance through encryption, access control policies, and audit logging via CloudTrail.

Networking & Connectivity: Configure network settings, including bandwidth, latency, VPCs, subnets, endpoints, and security groups. Set up multi-AZ deployments for high availability and integrate with AWS Direct Connect for secure hybrid connectivity.

Monitoring, Logging & Recovery: Design observability frameworks using CloudWatch, CloudTrail, and internal databases. Define backup, recovery, and rollback procedures for workspace environments. Provide operational support, image optimization, and iterative improvements based on testing feedback.

Required Skills & Experience: 8+ years of experience as an AWS Cloud Engineer, DevOps Engineer, or EUC Specialist. Proven expertise in deploying and managing Amazon WorkSpaces at scale. Hands-on experience with Python, PowerShell, and scripting for infrastructure automation. Proficiency with Infrastructure as Code (IaC) using Terraform, AWS CloudFormation, and the AWS CLI. Strong understanding of networking, security, Active Directory integration, and observability in cloud environments. In-depth knowledge of VDI design principles, including BYOL and hybrid DR configurations.

Preferred Qualifications: AWS certifications (e.g., Solutions Architect, SysOps Administrator, or DevOps Engineer). Experience with Azure Virtual Desktop (AVD) and hybrid VDI solutions. Familiarity with eDiscovery, FSLogix, and enterprise IAM frameworks. Exposure to large-scale EUC and cloud migration projects. (ref:hirist.tech)
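For the provisioning automation described above, a hedged boto3 sketch (directory ID, bundle ID, user, and tag values are placeholders) might look like:

```python
import boto3

# Placeholder identifiers; illustrates scripted WorkSpaces provisioning.
ws = boto3.client("workspaces", region_name="us-east-1")

resp = ws.create_workspaces(
    Workspaces=[{
        "DirectoryId": "d-9067xxxxxx",
        "UserName": "jdoe",
        "BundleId": "wsb-xxxxxxxxx",
        "WorkspaceProperties": {"RunningMode": "AUTO_STOP"},
        # Tagging per the naming/tagging strategy the listing mentions
        "Tags": [{"Key": "Environment", "Value": "non-prod"}],
    }]
)
for failed in resp.get("FailedRequests", []):
    print(failed["ErrorCode"], failed["ErrorMessage"])
```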

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Chennai

Remote

Location: 100% Remote
Employment Type: Full-Time
Must have own laptop and internet connection
Work hours: 11 AM to 8 PM IST

Position Summary: We are looking for a highly skilled and self-driven Full Stack Developer with deep expertise in React.js, Node.js, and AWS cloud services. The ideal candidate will play a critical role in designing, developing, and deploying full-stack web applications in a secure and scalable cloud environment.

Key Responsibilities: Design and develop scalable front-end applications using React.js and modern JavaScript/TypeScript frameworks. Build and maintain robust backend services using Node.js, Express, and RESTful APIs. Architect and deploy full-stack solutions on AWS using services such as Lambda, API Gateway, ECS, RDS, S3, CloudFormation, CloudWatch, and DynamoDB. Ensure application performance, security, scalability, and maintainability. Work collaboratively in Agile/Scrum environments and participate in sprint planning, code reviews, and daily standups. Integrate CI/CD pipelines and automate testing and deployment workflows using AWS-native tools or services like Jenkins, CodeBuild, or GitHub Actions. Troubleshoot production issues, optimize system performance, and implement monitoring and alerting solutions. Maintain clean, well-documented, and reusable code and technical documentation.

Required Qualifications: 5+ years of professional experience as a full stack developer. Strong expertise in React.js (Hooks, Context, Redux, etc.). Advanced backend development experience with Node.js and related frameworks. Proven hands-on experience designing and deploying applications on AWS Cloud. Solid understanding of RESTful services, microservices architecture, and cloud-native design. Experience working with relational databases (PostgreSQL, MySQL, DynamoDB). Proficient in Git and modern DevOps practices (CI/CD, Infrastructure as Code, etc.). Strong communication skills and ability to collaborate in distributed teams.
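As a sketch of the Lambda + API Gateway pattern the listing describes (the role's backend is Node.js; the same proxy-integration contract is shown here in Python, with illustrative fields):

```python
import json

# Minimal Lambda proxy-integration handler; path parameter name is illustrative.
def handler(event, context):
    # API Gateway (proxy integration) passes path/query/body on `event`
    user_id = (event.get("pathParameters") or {}).get("id")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": user_id, "status": "ok"}),
    }
```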

Posted 2 days ago

Apply

8.0 - 13.0 years

16 - 20 Lacs

Pune

Work from Office

AWS Solution Architect/DevOps: We are seeking a highly skilled AWS DevOps Engineer / Solution Architect with a strong background in designing and implementing data-driven and API-based solutions. The ideal candidate will have deep expertise in AWS architecture, a passion for creating scalable, secure, and high-performance systems, and the ability to align technology solutions with business goals.

Key Responsibilities: Design and Architect Solutions: Develop and architect scalable, secure, and efficient cloud-based solutions on AWS for data and API-related projects. Infrastructure as Code: Implement infrastructure automation using tools such as Terraform, CloudFormation, or AWS CDK. API Development and Integration: Architect and implement RESTful APIs, ensuring high availability, scalability, and security using AWS API Gateway, Lambda, and related services. Data Solutions: Design and optimize data pipelines, data lakes, and storage solutions using AWS services like S3, Redshift, RDS, and DynamoDB. CI/CD Pipelines: Build, manage, and optimize CI/CD pipelines to automate deployments, testing, and infrastructure provisioning (Jenkins, CodePipeline, etc.). Monitoring and Optimization: Ensure robust monitoring, logging, and alerting mechanisms are in place using tools like CloudWatch, Prometheus, and Grafana. Collaboration and Best Practices: Work closely with cross-functional teams (development, data engineering, security) to implement DevOps best practices and deliver innovative cloud solutions. Security and Compliance: Implement AWS security best practices, including IAM, encryption, VPC, and security monitoring, to ensure solutions meet security and compliance standards. Cost Optimization: Continuously optimize AWS environments for performance, scalability, and cost-effectiveness.

Qualifications: 8+ years of experience in AWS cloud architecture, with a focus on data and API solutions. Expertise in AWS core services such as EC2, S3, Lambda, API Gateway, RDS, DynamoDB, Redshift, and CloudFormation. Hands-on experience with infrastructure as code (IaC) tools like Terraform, AWS CDK, or CloudFormation. Proficiency in API design and development, particularly RESTful APIs and serverless architectures. Strong understanding of CI/CD pipelines, version control (Git), and automation tools. Knowledge of networking, security best practices, and the AWS Well-Architected Framework. Experience with containerization technologies such as Docker and orchestration tools like Kubernetes or AWS ECS/EKS. Excellent problem-solving skills and the ability to work independently and in a team environment. AWS certifications such as AWS Certified Solutions Architect (Associate/Professional) or AWS Certified DevOps Engineer are highly preferred.
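One of the IaC options named above is AWS CDK. A minimal CDK v2 sketch in Python (stack and bucket names are illustrative) that provisions an encrypted, versioned S3 bucket:

```python
from aws_cdk import App, Stack, RemovalPolicy, aws_s3 as s3
from constructs import Construct

class DataLakeStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Versioned, encrypted bucket for a data-lake landing zone
        s3.Bucket(
            self, "RawZone",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=RemovalPolicy.RETAIN,
        )

app = App()
DataLakeStack(app, "data-lake-stack")
app.synth()
```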

Posted 2 days ago

Apply

8.0 - 10.0 years

7 - 11 Lacs

Hyderabad, Pune

Work from Office

Sr Data Engineer: We are looking for a highly skilled Senior Data Engineer with strong expertise in Data Warehousing & Analytics to join our team. The ideal candidate will have extensive experience in designing and managing data solutions, advanced SQL proficiency, and hands-on expertise in Python.

Key Responsibilities: Design, develop, and maintain scalable data warehouse solutions. Write and optimize complex SQL queries for data extraction, transformation, and reporting. Develop and automate data pipelines using Python. Work with AWS cloud services for data storage, processing, and analytics. Collaborate with cross-functional teams to provide data-driven insights and solutions. Ensure data integrity, security, and performance optimization. Work in UK shift hours to align with global stakeholders.

Required Skills & Experience: 8-10 years of experience in Data Warehousing & Analytics. Strong proficiency in writing complex SQL queries, with a deep understanding of query optimization, stored procedures, and indexing. Hands-on experience with Python for data processing and automation. Experience working with AWS cloud services. Ability to work independently and collaborate with teams across different time zones.

Good to Have: Experience in the Finance domain and understanding of financial data structures. Hands-on experience with reporting tools like Power BI or Tableau.
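A hedged sketch of the Python-based pipeline work described above (bucket, key, and column names are placeholders): pull a raw extract from S3, aggregate it, and write a curated Parquet copy back.

```python
import io
import boto3
import pandas as pd

# Placeholder buckets/keys; illustrates a small extract-transform-load step.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="raw-zone", Key="finance/trades.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Example transformation: daily aggregates for reporting
daily = df.groupby("trade_date", as_index=False)["amount"].sum()

buf = io.BytesIO()
daily.to_parquet(buf, index=False)  # requires pyarrow
s3.put_object(Bucket="curated-zone", Key="finance/daily_trades.parquet", Body=buf.getvalue())
```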

Posted 2 days ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Pune

Work from Office

Full Stack .NET Core + Angular with AWS Experience

Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience: Minimum of 8 years of experience in .NET Core development with AWS.

.NET Core Development: Design, develop, and maintain high-quality, scalable, and efficient .NET Core applications. Implement best practices for coding and ensure code quality through unit testing and code reviews. Hands-on experience with containerization technologies like Docker and Kubernetes. Knowledge of database technologies like SQL Server, MongoDB, or Cosmos DB.

AWS Experience: Design and implement serverless solutions using AWS Lambda functions. Strong experience with AWS services, particularly Lambda and Step Functions. Deploy and manage applications on AWS Cloud. Optimize application performance and scalability in the cloud environment.

Containerization: Design and implement containerized applications using Docker and Kubernetes. Ensure proper orchestration and management of containers for high availability and resilience. Experience with serverless architecture and Azure Functions.

Angular Development: Develop and maintain front-end applications using Angular. Collaborate with UI/UX designers to implement responsive and user-friendly interfaces. Integrate front-end components with back-end services.

Leadership and Mentorship: Lead and mentor a team of developers, providing technical guidance and support. Conduct code reviews and ensure adherence to best practices and coding standards.
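The listing names Lambda and Step Functions. As an illustration only (the role's stack is .NET; the state machine ARN and payload are placeholders), kicking off a Step Functions execution from Python:

```python
import json
import boto3

# Placeholder ARN; illustrates starting a workflow execution.
sfn = boto3.client("stepfunctions", region_name="ap-south-1")

resp = sfn.start_execution(
    stateMachineArn="arn:aws:states:ap-south-1:123456789012:stateMachine:order-workflow",
    input=json.dumps({"orderId": "A-1001"}),
)
print(resp["executionArn"])
```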

Posted 2 days ago

Apply

2.0 - 7.0 years

13 - 17 Lacs

Chennai

Work from Office

Job Area: Engineering Group, Engineering Group > Software Engineering

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, test engineers, and other teams to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or related field and 2+ years of Software Engineering or related work experience. OR Master's degree in Engineering, Information Systems, Computer Science, or related field and 1+ year of Software Engineering or related work experience. OR PhD in Engineering, Information Systems, Computer Science, or related field. 2+ years of academic or work experience with a programming language such as C, C++, Java, Python, etc.

Job Title: MLOps Engineer - ML Platform
Hiring Title: Flexible based on candidate experience (Staff Engineer level preferred)

We are seeking a highly skilled and experienced MLOps Engineer to join our team and contribute to the development and maintenance of our ML platform, both on premises and on AWS Cloud. As an MLOps Engineer, you will be responsible for architecting, deploying, and optimizing the ML & Data platform that supports training of machine learning models using NVIDIA DGX clusters and the Kubernetes platform, including technologies like Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Your expertise in AWS services such as EKS, EC2, VPC, IAM, S3, and EFS will be crucial in ensuring the smooth operation and scalability of our ML infrastructure. You will work closely with cross-functional teams, including data scientists, software engineers, and infrastructure specialists. Your expertise in MLOps, DevOps, and knowledge of GPU clusters will be vital in enabling efficient training and deployment of ML models.

Responsibilities will include: Architect, develop, and maintain the ML platform to support training and inference of ML models. Design and implement scalable and reliable infrastructure solutions for NVIDIA clusters, both on premises and on AWS Cloud. Collaborate with data scientists and software engineers to define requirements and ensure seamless integration of ML and data workflows into the platform. Optimize the platform's performance and scalability, considering factors such as GPU resource utilization, data ingestion, model training, and deployment. Monitor and troubleshoot system performance, identifying and resolving issues to ensure the availability and reliability of the ML platform. Implement and maintain CI/CD pipelines for automated model training, evaluation, and deployment using technologies like ArgoCD and Argo Workflows. Implement and maintain a monitoring stack using Prometheus and Grafana to ensure the health and performance of the platform. Manage AWS services including EKS, EC2, VPC, IAM, S3, and EFS to support the platform. Implement logging and monitoring solutions using AWS CloudWatch and other relevant tools. Stay updated with the latest advancements in MLOps, distributed computing, and GPU acceleration technologies, and proactively propose improvements to enhance the ML platform.

What are we looking for: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as an MLOps Engineer or similar role, with a focus on large-scale ML and/or data infrastructure and GPU clusters. Strong expertise in configuring and optimizing NVIDIA DGX clusters for deep learning workloads. Proficiency with the Kubernetes platform, including technologies like Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Solid programming skills in languages like Python and Go, and experience with relevant ML frameworks (e.g., TensorFlow, PyTorch). In-depth understanding of distributed computing, parallel computing, and GPU acceleration techniques. Familiarity with containerization technologies such as Docker and orchestration tools. Experience with CI/CD pipelines and automation tools for ML workflows (e.g., Jenkins, GitHub, ArgoCD). Experience with AWS services such as EKS, EC2, VPC, IAM, S3, and EFS. Experience with AWS logging and monitoring tools. Strong problem-solving skills and the ability to troubleshoot complex technical issues. Excellent communication and collaboration skills to work effectively within a cross-functional team.

We would love to see: Experience with training and deploying models. Knowledge of ML model optimization techniques and memory management on GPUs. Familiarity with ML-specific data storage and retrieval systems. Understanding of security and compliance requirements in ML infrastructure.
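As a small hedged example of the kind of cluster check this role automates (assumes the kubernetes Python client and NVIDIA's device plugin exposing the nvidia.com/gpu resource):

```python
from kubernetes import client, config

# Lists GPU capacity per node on the training cluster.
config.load_kube_config()  # or config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

for node in v1.list_node().items:
    gpus = node.status.capacity.get("nvidia.com/gpu", "0")
    print(f"{node.metadata.name}: {gpus} GPUs")
```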

Posted 2 days ago

Apply

1.0 - 5.0 years

12 - 16 Lacs

Chennai

Work from Office

Job Area: Engineering Group, Engineering Group > Software Engineering

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, test engineers, and other teams to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or related field.

Job Title: MLOps Engineer - ML Platform
Hiring Title: Flexible based on candidate experience (Staff Engineer level preferred)

We are seeking a highly skilled and experienced MLOps Engineer to join our team and contribute to the development and maintenance of our ML platform, both on premises and on AWS Cloud. As an MLOps Engineer, you will be responsible for architecting, deploying, and optimizing the ML & Data platform that supports training of machine learning models using NVIDIA DGX clusters and the Kubernetes platform, including technologies like Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Your expertise in AWS services such as EKS, EC2, VPC, IAM, S3, and EFS will be crucial in ensuring the smooth operation and scalability of our ML infrastructure. You will work closely with cross-functional teams, including data scientists, software engineers, and infrastructure specialists. Your expertise in MLOps, DevOps, and knowledge of GPU clusters will be vital in enabling efficient training and deployment of ML models.

Responsibilities will include: Architect, develop, and maintain the ML platform to support training and inference of ML models. Design and implement scalable and reliable infrastructure solutions for NVIDIA clusters, both on premises and on AWS Cloud. Collaborate with data scientists and software engineers to define requirements and ensure seamless integration of ML and data workflows into the platform. Optimize the platform's performance and scalability, considering factors such as GPU resource utilization, data ingestion, model training, and deployment. Monitor and troubleshoot system performance, identifying and resolving issues to ensure the availability and reliability of the ML platform. Implement and maintain CI/CD pipelines for automated model training, evaluation, and deployment using technologies like ArgoCD and Argo Workflows. Implement and maintain a monitoring stack using Prometheus and Grafana to ensure the health and performance of the platform. Manage AWS services including EKS, EC2, VPC, IAM, S3, and EFS to support the platform. Implement logging and monitoring solutions using AWS CloudWatch and other relevant tools. Stay updated with the latest advancements in MLOps, distributed computing, and GPU acceleration technologies, and proactively propose improvements to enhance the ML platform.

What are we looking for: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as an MLOps Engineer or similar role, with a focus on large-scale ML and/or data infrastructure and GPU clusters. Strong expertise in configuring and optimizing NVIDIA DGX clusters for deep learning workloads. Proficiency with the Kubernetes platform, including technologies like Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Solid programming skills in languages like Python and Go, and experience with relevant ML frameworks (e.g., TensorFlow, PyTorch). In-depth understanding of distributed computing, parallel computing, and GPU acceleration techniques. Familiarity with containerization technologies such as Docker and orchestration tools. Experience with CI/CD pipelines and automation tools for ML workflows (e.g., Jenkins, GitHub, ArgoCD). Experience with AWS services such as EKS, EC2, VPC, IAM, S3, and EFS. Experience with AWS logging and monitoring tools. Strong problem-solving skills and the ability to troubleshoot complex technical issues. Excellent communication and collaboration skills to work effectively within a cross-functional team.

We would love to see: Experience with training and deploying models. Knowledge of ML model optimization techniques and memory management on GPUs. Familiarity with ML-specific data storage and retrieval systems. Understanding of security and compliance requirements in ML infrastructure.
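A hedged sketch of the monitoring side of this role: querying Prometheus's HTTP API for down targets (the Prometheus URL is a placeholder):

```python
import requests

# Placeholder in-cluster Prometheus address.
PROM = "http://prometheus.monitoring.svc:9090"

# PromQL "up == 0" keeps only series whose target is down.
resp = requests.get(f"{PROM}/api/v1/query", params={"query": "up == 0"}, timeout=10)
resp.raise_for_status()
for series in resp.json()["data"]["result"]:
    print("down:", series["metric"].get("instance"))
```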

Posted 2 days ago

Apply

4.0 - 9.0 years

12 - 17 Lacs

Chennai

Work from Office

Job Area: Engineering Group, Engineering Group > Software Engineering

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, test engineers, and other teams to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or related field and 4+ years of Software Engineering or related work experience. OR Master's degree in Engineering, Information Systems, Computer Science, or related field and 3+ years of Software Engineering or related work experience. OR PhD in Engineering, Information Systems, Computer Science, or related field and 2+ years of Software Engineering or related work experience. 2+ years of work experience with a programming language such as C, C++, Java, Python, etc.

Job Title: MLOps Engineer - ML Platform
Hiring Title: Flexible based on candidate experience (Staff Engineer level preferred)

We are seeking a highly skilled and experienced MLOps Engineer to join our team and contribute to the development and maintenance of our ML platform, both on premises and on AWS Cloud. As an MLOps Engineer, you will be responsible for architecting, deploying, and optimizing the ML & Data platform that supports training of machine learning models using NVIDIA DGX clusters and the Kubernetes platform, including technologies like Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Your expertise in AWS services such as EKS, EC2, VPC, IAM, S3, and EFS will be crucial in ensuring the smooth operation and scalability of our ML infrastructure. You will work closely with cross-functional teams, including data scientists, software engineers, and infrastructure specialists. Your expertise in MLOps, DevOps, and knowledge of GPU clusters will be vital in enabling efficient training and deployment of ML models.

Responsibilities will include: Architect, develop, and maintain the ML platform to support training and inference of ML models. Design and implement scalable and reliable infrastructure solutions for NVIDIA clusters, both on premises and on AWS Cloud. Collaborate with data scientists and software engineers to define requirements and ensure seamless integration of ML and data workflows into the platform. Optimize the platform's performance and scalability, considering factors such as GPU resource utilization, data ingestion, model training, and deployment. Monitor and troubleshoot system performance, identifying and resolving issues to ensure the availability and reliability of the ML platform. Implement and maintain CI/CD pipelines for automated model training, evaluation, and deployment using technologies like ArgoCD and Argo Workflows. Implement and maintain a monitoring stack using Prometheus and Grafana to ensure the health and performance of the platform. Manage AWS services including EKS, EC2, VPC, IAM, S3, and EFS to support the platform. Implement logging and monitoring solutions using AWS CloudWatch and other relevant tools. Stay updated with the latest advancements in MLOps, distributed computing, and GPU acceleration technologies, and proactively propose improvements to enhance the ML platform.

What are we looking for: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as an MLOps Engineer or similar role, with a focus on large-scale ML and/or data infrastructure and GPU clusters. Strong expertise in configuring and optimizing NVIDIA DGX clusters for deep learning workloads. Proficiency with the Kubernetes platform, including technologies like Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Solid programming skills in languages like Python and Go, and experience with relevant ML frameworks (e.g., TensorFlow, PyTorch). In-depth understanding of distributed computing, parallel computing, and GPU acceleration techniques. Familiarity with containerization technologies such as Docker and orchestration tools. Experience with CI/CD pipelines and automation tools for ML workflows (e.g., Jenkins, GitHub, ArgoCD). Experience with AWS services such as EKS, EC2, VPC, IAM, S3, and EFS. Experience with AWS logging and monitoring tools. Strong problem-solving skills and the ability to troubleshoot complex technical issues. Excellent communication and collaboration skills to work effectively within a cross-functional team.

We would love to see: Experience with training and deploying models. Knowledge of ML model optimization techniques and memory management on GPUs. Familiarity with ML-specific data storage and retrieval systems. Understanding of security and compliance requirements in ML infrastructure.
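Complementing the Prometheus/Grafana stack, the listing also names AWS CloudWatch; a hedged sketch of publishing a custom metric (namespace, dimension, and value are illustrative):

```python
import boto3

# Publishes a custom platform metric to CloudWatch.
cw = boto3.client("cloudwatch", region_name="us-west-2")

cw.put_metric_data(
    Namespace="MLPlatform",
    MetricData=[{
        "MetricName": "GpuUtilization",
        "Dimensions": [{"Name": "Cluster", "Value": "dgx-train"}],
        "Value": 87.5,
        "Unit": "Percent",
    }],
)
```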

Posted 2 days ago

Apply

13.0 - 18.0 years

14 - 18 Lacs

Gurugram

Work from Office

Who are we? In one sentence: We are seeking a Java Full Stack Architect & People Manager with strong technical depth and leadership capabilities to lead our Java modernization projects. The ideal candidate will possess a robust understanding of Java full stack, databases, and cloud-based solution delivery, combined with proven experience in managing high-performing technical teams. This role requires a visionary who can translate business challenges into scalable distributed solutions while nurturing talent and fostering innovation.

What will your job look like? Lead the design and implementation of Java Full Stack solutions covering frontend, backend, batch processes, and interface integrations across business use cases. Translate business requirements into technical architectures using Azure/AWS cloud platforms. Manage and mentor a multidisciplinary team of engineers, leads, and specialists. Drive adoption of Databricks and Python in addition to Java-based frameworks within solution development. Collaborate closely with product owners, data engineering teams, and customer IT & business stakeholders. Ensure high standards in code quality, system performance, and model governance. Track industry trends and continuously improve the technology stack, adopting newer trends showcasing productization, automation, and innovative ideas. Oversee the end-to-end lifecycle: use case identification, PoC, MVP, production deployment, and support. Define and monitor KPIs to measure team performance and project impact.

All you need is... 13+ years of overall IT experience with a strong background in the Telecom domain (preferred). Proven hands-on experience with Java Full Stack technologies and cloud DBs. Strong understanding of design principles and patterns for distributed applications, on-prem as well as on-cloud. Demonstrated experience in building and deploying on Azure or AWS via CI/CD practices. Strong expertise in Java, databases, Python, Kafka, and Linux scripting. In-depth understanding of cloud-native architecture, microservices, and data pipelines. Solid people management experience: team building, mentoring, performance reviews. Strong analytical thinking and communication skills. Ability to be hands-on with coding and reviews during development and production support.

Good to Have Skills: Familiarity with Databricks and PySpark. Familiarity with Snowflake.

Why you will love this job: You will be challenged with leading and mentoring a few development teams & projects. You will join a strong team with lots of activities, technologies, business challenges, and a progression path. You will have the opportunity to work with the industry's most advanced technologies.
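Kafka expertise is called out above. A minimal hedged sketch using the kafka-python client (broker address and topic are placeholders):

```python
import json
from kafka import KafkaProducer

# Placeholder broker/topic; illustrates producing a JSON event.
producer = KafkaProducer(
    bootstrap_servers="kafka.example.com:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("billing-events", {"accountId": 42, "event": "invoice_created"})
producer.flush()  # block until the message is actually delivered
```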

Posted 2 days ago

Apply

6.0 - 10.0 years

19 - 27 Lacs

Bengaluru

Work from Office

Strong Python, Flask, REST API, and NoSQL skills. AWS Developer Associate certification is required. Architect, build, and maintain secure, scalable backend services on AWS platforms. Utilize core AWS services and serverless technologies. Provident fund.
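A minimal sketch of the Python/Flask REST pattern this listing names (routes and fields are illustrative; the dict stands in for the NoSQL store):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
_items = {}  # stand-in for the NoSQL store the listing mentions

@app.get("/items/<item_id>")
def get_item(item_id):
    item = _items.get(item_id)
    if item is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(item)

@app.post("/items")
def create_item():
    payload = request.get_json()
    _items[payload["id"]] = payload
    return jsonify(payload), 201

if __name__ == "__main__":
    app.run()
```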

Posted 2 days ago

Apply

0.0 - 1.0 years

0 - 2 Lacs

Ahmedabad

Work from Office

The key qualifications we are looking for include: Strong problem-solving and communication skills. Experience with virtualization (VMware, Nutanix, Hyper-V). Hands-on experience with server hardware (Dell, HP, Cisco, Supermicro). Proficiency in networking protocols and security best practices. Prior experience as an infrastructure engineer or similar role. Relevant certifications (MCSE, AWS Solutions Architect) are a plus. Ability to work collaboratively in a team-oriented environment. Average marks of at least 70% in 10th and 12th grade are required.

Posted 2 days ago

Apply

3.0 - 5.0 years

27 - 32 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer (DE) / SDE – Data
Location: Bangalore
Experience range: 3-15 years

What we offer: Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic by building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high concurrency use cases; building connectors for different sources; building the customer feature repository; building cost optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills. For Managers: customer centricity and obsession for the customer; ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; ability to structure and organize teams and streamline communication; prior work experience executing large-scale data engineering projects.
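Given the MWAA/Airflow pipeline work described above, a hedged Airflow DAG skeleton (DAG and task names are placeholders; uses the Airflow 2.4+ `schedule` argument):

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; illustrates a two-step daily pipeline.
def extract():
    print("pull from source system")

def load():
    print("write to the S3 data lake")

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2  # extract runs before load
```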

Posted 2 days ago

Apply

0.0 - 2.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Data Engineer - 1 (Experience: 0-2 years)

What we offer: Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic by building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high concurrency use cases; building connectors for different sources; building the customer feature repository; building cost optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
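A hedged sketch of the Spark (PySpark/Spark SQL) ETL work this team does (paths and column names are placeholders):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Placeholder S3 paths/columns; illustrates a daily aggregation job.
spark = SparkSession.builder.appName("txn-aggregates").getOrCreate()

txns = spark.read.parquet("s3://raw-zone/transactions/")
daily = (
    txns.groupBy("branch_id", F.to_date("txn_ts").alias("txn_date"))
        .agg(F.sum("amount").alias("total_amount"))
)
daily.write.mode("overwrite").partitionBy("txn_date").parquet("s3://curated-zone/daily_txns/")
```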

Posted 2 days ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office

What we offer: Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic by building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high concurrency use cases; building connectors for different sources; building the customer feature repository; building cost optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of which is in the data domain. 5+ years of engineering team management experience. 10+ years of planning, designing, developing, and delivering consumer software experience. Experience partnering with product or program management teams. 5+ years of experience in managing data engineers, business intelligence engineers, and/or data scientists. Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems. Experience managing multiple concurrent programs, projects, and development teams in an Agile environment. Strong understanding of Data Platform, Data Engineering, and Data Governance. Experience partnering with product and program management teams. Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
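Glue is among the AWS services this role manages; a hedged sketch of triggering and polling a Glue ETL job from Python (the job name is a placeholder):

```python
import time
import boto3

# Placeholder job name; illustrates the start/poll pattern for a Glue run.
glue = boto3.client("glue", region_name="ap-south-1")

run_id = glue.start_job_run(JobName="curate-daily-txns")["JobRunId"]
while True:
    state = glue.get_job_run(JobName="curate-daily-txns", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("final state:", state)
        break
    time.sleep(30)  # poll every 30 seconds while the job runs
```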

Posted 2 days ago

Apply

2.0 - 5.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Data Engineer -2 (Experience – 2-5 years) What we offer Our mission is simple – Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages entire data experience of Kotak Bank. DEX stands for Kotak’s Data Exchange. This org comprises of Data Platform, Data Engineering and Data Governance charter. The org sits closely with Analytics org. DEX is primarily working on greenfield project to revamp entire data platform which is on premise solutions to scalable AWS cloud-based platform. The team is being built ground up which provides great opportunities to technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are Software development skills preferably Python for platform building on AWS; Data engineering Spark (pyspark, sparksql, scala) for ETL development, Advanced SQL and Data modelling for Analytics. The org size is expected to be around 100+ member team primarily based out of Bangalore comprising of ~10 sub teams independently driving their charter. As a member of this team, you get opportunity to learn fintech space which is most sought-after domain in current world, be a early member in digital transformation journey of Kotak, learn and leverage technology to build complex data data platform solutions including, real time, micro batch, batch and analytics solutions in a programmatic way and also be futuristic to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This Vertical is responsible for building data platform which includes optimized storage for entire bank and building centralized data lake, managed compute and orchestrations framework including concepts of serverless data solutions, managing central data warehouse for extremely high concurrency use cases, building connectors for different sources, building customer feature repository, build cost optimization solutions like EMR optimizers, perform automations and build observability capabilities for Kotak’s data platform. The team will also be center for Data Engineering excellence driving trainings and knowledge sharing sessions with large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems and enable data consumptions for 30+ data analytics products. The team will learn and built data models in a config based and programmatic and think big to build one of the most leveraged data model for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data build by this team will be consumed by 20K + branch consumers, RMs, Branch Managers and all analytics usecases. 
Data Governance
This team will be the central data governance team for Kotak Bank, managing the metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.
Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support a data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA (see the orchestration sketch below).
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
Bachelor's degree in Computer Science, Engineering, or a related field
Experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modelling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
PREFERRED QUALIFICATIONS:
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering and best practices for the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills
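As an illustration of the Airflow/MWAA orchestration this role touches, here is a minimal DAG sketch that triggers a pre-existing Glue job on a daily schedule. It assumes Airflow 2.4+ with the Amazon provider installed (as on MWAA); the DAG id, Glue job name, and region are hypothetical placeholders.

```python
# A minimal, illustrative Airflow DAG (assuming Airflow 2.4+ with the
# Amazon provider installed, as on MWAA). The DAG id, Glue job name,
# and region are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="example_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger a pre-existing Glue ETL job and wait for it to finish
    run_ingest = GlueJobOperator(
        task_id="run_ingest_job",
        job_name="example-ingest-job",  # hypothetical Glue job
        region_name="ap-south-1",
    )
```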

Posted 2 days ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Naukri logo

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to build systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:
Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.
Data Governance
This team will be the central data governance team for Kotak Bank, managing the metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.
Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support a data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
Bachelor's degree in Computer Science, Engineering, or a related field
3-5 years of experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modelling and data quality best practices (see the data-quality sketch below)
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
10+ years of engineering experience, most of it in the data domain
5+ years of engineering team management experience
10+ years of experience planning, designing, developing and delivering consumer software
5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
Experience managing multiple concurrent programs, projects and development teams in an Agile environment
Strong understanding of Data Platform, Data Engineering and Data Governance
Experience partnering with product and program management teams
Experience designing and developing large-scale, high-traffic applications
PREFERRED QUALIFICATIONS:
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering and best practices for the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills
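The data quality best practices called for above can be as simple as hard gates in a pipeline step. A minimal PySpark sketch follows, with hypothetical paths, key columns, and thresholds:

```python
# A minimal, illustrative data-quality gate in PySpark. The input path,
# key columns, and thresholds are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-curated-zone/daily_account_summary/")

total = df.count()
null_keys = df.filter(F.col("account_id").isNull()).count()
duplicates = total - df.dropDuplicates(["account_id", "txn_date"]).count()

# Fail the pipeline step loudly if a quality gate is breached
assert null_keys == 0, f"{null_keys} rows have a null account_id"
assert duplicates / max(total, 1) < 0.01, f"{duplicates} duplicate key rows"
```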

Posted 2 days ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to build systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:
Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.
Data Governance
This team will be the central data governance team for Kotak Bank, managing the metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.
Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support a data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA (see the EMR sketch below).
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
Bachelor's degree in Computer Science, Engineering, or a related field
3-5 years of experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modelling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
10+ years of engineering experience, most of it in the data domain
5+ years of engineering team management experience
10+ years of experience planning, designing, developing and delivering consumer software
5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
Experience managing multiple concurrent programs, projects and development teams in an Agile environment
Strong understanding of Data Platform, Data Engineering and Data Governance
Experience partnering with product and program management teams
Experience designing and developing large-scale, high-traffic applications
PREFERRED QUALIFICATIONS:
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering and best practices for the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills
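For the EMR management mentioned above, here is a minimal boto3 sketch that submits a Spark step to an already-running cluster. The cluster id, region, and script location are hypothetical placeholders.

```python
# A minimal, illustrative boto3 call that submits a Spark step to an
# already-running EMR cluster. The cluster id, region, and script
# location are hypothetical placeholders.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")

response = emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLE123",  # hypothetical cluster id
    Steps=[{
        "Name": "daily-etl",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://example-code/jobs/daily_etl.py"],
        },
    }],
)
print("Submitted step ids:", response["StepIds"])
```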

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune, Ahmedabad

Work from Office

Naukri logo

We are seeking a skilled and motivated Google/AWS Cloud DevOps Engineer with over 3 years of hands-on experience in building and maintaining scalable, reliable, and secure cloud infrastructure. You will be part of a dynamic team that delivers robust DevOps solutions on Google Cloud Platform (GCP) and AWS, helping streamline CI/CD pipelines, automate infrastructure provisioning, and optimize cloud-based deployments.
Key Responsibilities:
Design, implement, and manage scalable and secure infrastructure on Google Cloud Platform / AWS.
Develop and maintain CI/CD pipelines using tools such as Cloud Build, Jenkins, GitLab CI/CD, or similar.
Implement infrastructure as code (IaC) using Terraform or Pulumi (see the sketch below).
Monitor system health and performance using native tooling such as GCP's operations suite (formerly Stackdriver).
Automate manual processes to improve system reliability and deployment frequency.
Collaborate with software engineers to ensure best DevOps practices are followed in application development and deployment.
Handle incident response and root cause analysis for production issues.
Ensure compliance with security and governance policies on AWS/GCP.
Optimize cost and resource utilization across cloud services.
Required Qualifications:
3+ years of hands-on experience with DevOps tools and practices in a cloud environment.
Strong experience with Google Cloud Platform (GCP) / AWS services (Compute Engine, Kubernetes Engine, Cloud Functions, Cloud Storage, VPC, etc.).
A Google Professional Cloud DevOps Engineer or equivalent AWS DevOps certification is mandatory.
Proficiency with CI/CD tools and version control systems (e.g., Git, GitHub/GitLab, Cloud Build).
Solid scripting skills in Bash, Python, or similar languages.
Experience with Docker and Kubernetes.
Familiarity with monitoring/logging tools such as Prometheus, Grafana, and Cloud Monitoring.
Knowledge of networking, security best practices, and IAM on GCP/AWS.
Preferred Qualifications:
Experience with multi-cloud or hybrid cloud environments.
Familiarity with Agile and DevOps culture and practices.
Experience with serverless architectures and event-driven design patterns.
Knowledge of cost optimization and GCP/AWS billing.
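As a flavour of the infrastructure-as-code work described above, here is a minimal Pulumi (Python) sketch that provisions a private, versioned S3 bucket. It assumes the pulumi and pulumi_aws packages are installed and AWS credentials are already configured; the resource name is a hypothetical placeholder.

```python
# A minimal, illustrative Pulumi (Python) program that provisions a
# private, versioned S3 bucket. Assumes pulumi and pulumi_aws are
# installed and AWS credentials are configured; the resource name is
# a hypothetical placeholder.
import pulumi
import pulumi_aws as aws

artifacts = aws.s3.Bucket(
    "example-artifacts",
    acl="private",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
)

# Expose the generated bucket name as a stack output
pulumi.export("bucket_name", artifacts.id)
```

Running `pulumi up` against such a program previews and applies the change; the same resource definition works identically across stacks (dev, staging, prod).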

Posted 2 days ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Hyderabad, Chennai

Work from Office

Naukri logo

About the Role: We are looking for a highly skilled and experienced Machine Learning / AI Engineer to join our team at Zenardy. The ideal candidate has a proven track record of building, deploying, and optimizing machine learning models in real-world applications. You will be responsible for designing scalable ML systems, collaborating with cross-functional teams, and driving innovation through AI-powered solutions.
Location: Chennai, Hyderabad
Key Responsibilities:
Design, develop, and deploy machine learning models to solve complex business problems
Work across the full ML lifecycle: data collection, preprocessing, model training, evaluation, deployment, and monitoring (see the sketch below)
Collaborate with data engineers, product managers, and software engineers to integrate ML models into production systems
Conduct research and stay up to date with the latest ML/AI advancements, applying them where appropriate
Optimize models for performance, scalability, and robustness
Document methodologies, experiments, and findings clearly for both technical and non-technical audiences
Mentor junior ML engineers or data scientists as needed
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related field (Ph.D. is a plus)
A minimum of 5 hands-on ML/AI projects, preferably in production or with real-world datasets
Proficiency in Python and ML libraries/frameworks such as TensorFlow, PyTorch, Scikit-learn, and XGBoost
Solid understanding of core ML concepts: supervised/unsupervised learning, neural networks, NLP, computer vision, etc.
Experience with model deployment using APIs, containers (Docker), and cloud platforms (AWS/GCP/Azure)
Strong data manipulation and analysis skills using Pandas, NumPy, and SQL
Knowledge of software engineering best practices: version control (Git), CI/CD, unit testing
Preferred Skills:
Experience with MLOps tools (MLflow, Kubeflow, SageMaker, etc.)
Familiarity with big data technologies like Spark, Hadoop, or distributed training frameworks
Experience working in fintech environments is a plus
Strong problem-solving mindset with excellent communication skills
Experience working with vector databases
Understanding of RAG vs. fine-tuning vs. prompt engineering
Why Join Us:
Work on impactful, real-world AI challenges
Collaborate with a passionate and innovative team
Opportunities for career advancement and learning
Flexible work environment (remote/hybrid options)
Competitive compensation and benefits
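For a sense of the core train-and-evaluate step in the ML lifecycle referenced above, here is a minimal scikit-learn sketch on a synthetic dataset; the model choice and data are illustrative only, not a prescribed approach.

```python
# A minimal, illustrative train-and-evaluate loop with scikit-learn on
# a synthetic dataset; the model choice and data are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.3f}")
```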

Posted 2 days ago

Apply

15.0 - 20.0 years

15 - 19 Lacs

Ahmedabad

Work from Office

Naukri logo

Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software and security.
Must-have skills: Amazon Web Services (AWS)
Good-to-have skills: Java Full Stack Development
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education.
Key Responsibilities:
1. Experience designing multiple cloud-native application architectures
2. Experience developing and deploying cloud-native applications, including serverless environments such as Lambda (see the Lambda sketch below)
3. Optimize applications for the AWS environment
4. Design, build and configure applications on the AWS environment to meet business process and application requirements
5. Understanding of security, performance and cost optimizations for AWS
6. Understanding of AWS Well-Architected best practices
Technical Experience:
1. 8-15 years of experience in the industry, with at least 5 years in AWS
2. Strong development background with exposure to the majority of AWS services
3. AWS Certified Developer professional and/or AWS specialty-level certification (DevOps/Security)
4. Application development skills on the AWS platform with the Java SDK, Python SDK, or ReactJS
5. Strong coding skills in any of Python/Node.js/Java/.NET, with an understanding of AWS architectures across containerization, microservices and serverless
6. Preferred knowledge of Cost Explorer, budgeting and tagging in AWS
7. Experience with DevOps tools, including AWS-native DevOps tools like CodeDeploy
Professional Attributes:
a. Ability to harvest solutions and promote reusability across implementations
b. Self-motivated experts who can work under their own direction with the right design thinking expertise
c. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
Additional Info:
1. Application development skills on the AWS platform with the Java SDK, Python SDK, Node.js, or ReactJS
2. AWS services: Lambda, AWS Amplify, AWS App Runner, AWS CodePipeline, AWS Cloud9, EBS, Fargate
Additional comments: Only Bangalore; no location flex and no level flex.
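To illustrate the serverless development this role calls for, here is a minimal AWS Lambda handler sketch in Python for an API Gateway-style proxy event; the event shape and response fields are hypothetical placeholders.

```python
# A minimal, illustrative AWS Lambda handler (Python runtime) for an
# API Gateway-style proxy event; the event shape and response fields
# are hypothetical placeholders.
import json

def handler(event, context):
    # Parse the JSON body, tolerating a missing or empty payload
    body = json.loads(event.get("body") or "{}")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"received": body.get("message", "")}),
    }
```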

Posted 2 days ago

Apply

9.0 - 12.0 years

27 - 42 Lacs

Chennai

Work from Office

Naukri logo

We are looking for a Sr. Software Engineer to analyze large amounts of raw information to find patterns and build data products that extract valuable business insights. The Java developer's responsibilities include managing Java/Java EE application development while providing expertise in the full software development lifecycle, from concept and design to testing.
Mandatory:
• Database knowledge (e.g., MySQL, Oracle, NoSQL)
• Advanced Java (mainly the Spring Boot framework, web development, networking, and some familiarity with specific tools like Maven)
Nice to have:
• Power BI/Tableau
• Python
• Azure/AWS cloud skills and any Azure AI skills
Non-technical skills:
• Analytical mind and business acumen
• Strong math skills (e.g., statistics, algebra)
• Problem-solving aptitude
• Excellent communication and presentation skills

Posted 2 days ago

Apply

2.0 - 5.0 years

18 - 21 Lacs

Hyderabad

Work from Office

Naukri logo

Overview
Annalect is currently seeking a data engineer to join our technology team. In this role you will build Annalect products that sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design and development, and data, and for fusing these disciplines together to build cool things. In this role, you will work on one or more software and data products in the Annalect Engineering Team. You will participate in technical architecture, design, and development of software products, as well as research and evaluation of new technical solutions.
Responsibilities
Design, build, test and deploy scalable and reusable systems that handle large amounts of data.
Collaborate with product owners and data scientists to build new data products.
Ensure data quality and reliability.
Qualifications
Experience designing and managing data flows.
Experience designing systems and APIs to integrate data into applications.
4+ years of Linux, Bash, Python, and SQL experience.
2+ years using Spark and other frameworks to process large volumes of data.
2+ years using Parquet, ORC, or other columnar file formats (see the sketch below).
2+ years using AWS cloud services, especially data processing services (e.g., Glue, EMR, Athena, Redshift) or their counterparts on other clouds (Dataflow, Dataproc, BigQuery on GCP; Data Factory, HDInsight on Azure).
Passion for technology: excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges.
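As an example of the columnar-format experience listed above, here is a minimal PySpark sketch that converts raw CSV to partitioned Parquet so downstream readers can prune both partitions and columns; all paths and column names are hypothetical.

```python
# A minimal, illustrative conversion of raw CSV to partitioned Parquet
# with PySpark; all paths and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

# Write once as partitioned Parquet so scans can prune partitions
events = spark.read.option("header", True).csv("s3://example-raw/events/")
events.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-lake/events/"
)

# Columnar layout lets readers fetch only the columns they need
daily = (
    spark.read.parquet("s3://example-lake/events/")
         .select("user_id", "event_date")
)
```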

Posted 2 days ago

Apply