
1084 S3 Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As an AWS Cloud Architect in our organization based in Ahmedabad, your responsibilities will include:

- Architecting, building, and maintaining cost-efficient, scalable cloud environments for the organization.
- Understanding business objectives and creating cloud-based solutions that support them.
- Migrating legacy systems to the cloud for increased efficiency and digital transformation.
- Keeping cloud environments secure to prevent downtime and security breaches.
- Assessing risks associated with third-party platforms and frameworks.
- Identifying opportunities to improve operations by digitizing common tasks.
- Designing, building, and maintaining internal cloud applications.
- Migrating data and internal processes to cloud architecture.
- Minimizing data leakage and downtime risks.
- Staying current with best practices in cloud computing and enhancing the cloud infrastructure.
- Collaborating with internal teams such as Sales, Operations, and IT.
- Communicating with stakeholders and developing applications to meet project requirements.

Qualifications required for this role:

- 5+ years of experience with AWS Infrastructure as Code and AWS/Azure solution components.
- Proficiency in AWS services such as EC2, CloudFormation, VPC, ALB, AWS Security, S3, RDS, EBS, and ECS.
- 5+ years of experience with AWS ECS and API Gateway.
- Strong hands-on experience with infrastructure automation tools such as Azure DevOps, Jenkins, Terraform, Docker, Kubernetes, and Ansible.
- Experience migrating systems to hybrid or fully cloud-based solutions.

Join us in our journey towards leveraging cloud technology to drive innovation and efficiency within our organization.
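Listings like this one lean heavily on keeping S3-backed environments secure. As a rough illustration only, here is a minimal boto3 sketch of provisioning a locked-down bucket; the bucket name and region are hypothetical, and real deployments would usually express this in Terraform or CloudFormation instead.

```python
import boto3

REGION = "ap-south-1"              # hypothetical region
BUCKET = "example-secure-bucket"   # hypothetical bucket name

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket (LocationConstraint is required outside us-east-1).
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Block all forms of public access.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Enforce default server-side encryption at rest.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)

# Keep object versions so accidental deletes are recoverable.
s3.put_bucket_versioning(Bucket=BUCKET, VersioningConfiguration={"Status": "Enabled"})
```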

Posted 1 day ago


14.0 - 20.0 years

0 Lacs

Maharashtra

On-site

As a Senior Architect - Data & Cloud at our company, you will be responsible for architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets. The role requires more than 15 years of experience in Technical, Solutioning, and Analytical roles, with 5+ years specifically in building and managing Data Lakes, Data Warehouses, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions on cloud platforms such as GCP, AWS, or Azure.

Key Responsibilities:

- Translate business requirements into functional and non-functional areas, defining boundaries in terms of Availability, Scalability, Performance, Security, and Resilience.
- Architect and design scalable data warehouse solutions on cloud platforms such as BigQuery or Redshift.
- Work with various data integration and ETL technologies on the cloud, such as Spark, PySpark/Scala, Dataflow, DataProc, and EMR.
- Apply deep knowledge of cloud and on-premise databases such as Cloud SQL, Cloud Spanner, BigTable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, and SQL Server.
- Draw on exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, and graph databases.
- Use traditional ETL tools such as Informatica, DataStage, OWB, and Talend.
- Collaborate with internal and external stakeholders to design optimized data analytics solutions.
- Mentor young talent within the team and contribute to building assets and accelerators.

Qualifications Required:

- 14-20 years of relevant experience in the field.
- Strong understanding of cloud solutions for IaaS, PaaS, SaaS, Containers, and Microservices architecture and design.
- Experience with BI reporting and dashboarding tools such as Looker, Tableau, Power BI, SAP BO, Cognos, and Superset.
- Knowledge of security features and policies in cloud environments such as GCP, AWS, or Azure.
- Ability to compare products and tools across the Google, AWS, and Azure cloud stacks.

In this role, you will lead multiple data engagements on GCP for data lakes, data engineering, data migration, data warehousing, and business intelligence. You will interface with multiple stakeholders within IT and the business to understand data requirements and take complete responsibility for the successful delivery of projects. You will also have the opportunity to work in a high-growth startup environment, contribute to the digital transformation journey of customers, and collaborate with a diverse and proactive team of techies. Flexible, remote working options are available to foster productivity and work-life balance.

Posted 1 day ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Role Overview: As a member of the team at Capgemini, you will design, implement, and maintain AWS cloud architecture. You will collaborate with software architects and engineering teams to optimize performance using scripting and automation. Your main responsibilities will include ensuring cloud security and compliance, troubleshooting operational issues, and maintaining infrastructure.

Key Responsibilities:

- Proficiency in AWS services such as EC2, ELB, RDS, and S3
- Experience with DevOps tools such as Docker, Jenkins, Kubernetes, GitHub, and Ansible
- Familiarity with monitoring tools including CloudWatch, the ELK Stack, and Prometheus
- Strong programming skills in Python, SQL, and scripting languages
- Managing CI/CD pipelines and using cloud security tools

Qualifications Required:

- Experience designing, implementing, and maintaining AWS cloud architecture
- Proficiency in AWS services: EC2, ELB, RDS, S3
- Familiarity with DevOps tools: Docker, Jenkins, Kubernetes, GitHub, Ansible
- Knowledge of monitoring tools: CloudWatch, ELK Stack, Prometheus
- Strong programming skills in Python, SQL, and scripting languages
- Experience with CI/CD pipeline management and cloud security tools

Note: additional details about the company were not included in the provided job description.
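The monitoring work named above (CloudWatch alongside ELK and Prometheus) is often exercised through small automation scripts. A hedged boto3 sketch that creates a CPU alarm; the instance ID and SNS topic ARN are placeholders, not values from this listing:

```python
import boto3

cw = boto3.client("cloudwatch")

# Alarm when average CPU on one instance stays above 80% for two 5-minute periods.
cw.put_metric_alarm(
    AlarmName="high-cpu-example",                      # hypothetical alarm name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],      # placeholder
)
```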

Posted 1 day ago


5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Role Overview: As a Back-End Developer, you will build and maintain the server-side logic, databases, and APIs that power applications, focusing on the behind-the-scenes functionality that keeps them running smoothly.

Key Responsibilities:

- Apply your 5-6 years of fully backend experience with Node.js, JavaScript, TypeScript, and REST APIs (mandatory).
- Demonstrate proficiency in MongoDB aggregation (mandatory); experience with MongoDB Atlas is a plus.
- Bring hands-on experience with AWS Lambda, microservices, AWS SQS, S3, and CloudWatch (required).
- Familiarity with GraphQL is good to have.

Qualifications Required:

- 5-6 years of backend experience in Node.js, JavaScript, TypeScript, and REST APIs.
- Proficiency in MongoDB aggregation; knowledge of MongoDB Atlas preferred.
- Hands-on experience with AWS Lambda, microservices, AWS SQS, S3, and CloudWatch.
- Familiarity with GraphQL would be beneficial for the role.
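Since MongoDB aggregation is called out as mandatory here, a small hedged sketch of the pipeline style interviewers tend to probe; the `orders` collection, its fields, and the connection string are all hypothetical:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical connection string
orders = client["shop"]["orders"]                  # hypothetical database/collection

# Top five customers by total spend on completed orders.
pipeline = [
    {"$match": {"status": "completed"}},
    {"$group": {"_id": "$customer_id", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
    {"$limit": 5},
]
for doc in orders.aggregate(pipeline):
    print(doc["_id"], doc["total"])
```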

Posted 1 day ago


6.0 - 8.0 years

6 - 16 Lacs

Bangalore Rural, Bengaluru

Hybrid

Primary skills: ETL/ELT pipelines using dbt and AWS Redshift. Secondary skills: proficiency in SQL and scripting languages such as Python or Shell.

Posted 1 day ago


4.0 - 9.0 years

7 - 17 Lacs

Ahmedabad

Remote

- Design and manage CI/CD pipelines for mobile/web apps
- Automate builds and deployments using Jenkins/GitHub Actions
- Manage AWS infrastructure with Terraform/CloudFormation
- Orchestrate containers with Docker/Kubernetes/EKS

Required candidate profile:

- Senior: 4+ yrs (3 roles) | Lead: 6+ yrs (1 role)
- Strong in AWS (EC2, S3, RDS, Lambda, VPC, IAM)
- Skilled in CI/CD, Docker, Kubernetes, Terraform
- Experience in iOS/Android build automation

Posted 1 day ago


5.0 - 10.0 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Candidates should have 5-6 years of fully backend experience with Node.js, JavaScript, TypeScript, and REST APIs (mandatory); MongoDB aggregation (mandatory; MongoDB Atlas good to have); hands-on AWS Lambda, microservices, AWS SQS, S3, and CloudWatch (mandatory); and GraphQL (good to have).

Posted 1 day ago


8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

Role Overview: As a Software Engineering Lead at CGI, you will play a crucial role in the Data and Analytics organization by actively participating in initiatives that align with CGI's strategic goals. Your primary responsibility will be understanding business logic and engineering solutions that support next-generation reporting and analytical capabilities at enterprise scale. Working in an agile environment, you will collaborate with your team to deliver user-oriented products for internal and external stakeholders.

Key Responsibilities:

- Be accountable for the delivery of business functionality.
- Work on the AWS cloud to migrate and re-engineer data and applications.
- Engineer solutions that adhere to enterprise standards and technologies.
- Provide technical expertise through hands-on development of solutions for automated testing.
- Conduct peer code reviews, merge requests, and production releases.
- Implement design and functionality using Agile principles.
- Demonstrate a track record of quality software development and innovation.
- Collaborate effectively in a high-performing team environment.
- Maintain a quality mindset to ensure data quality and monitor for potential issues.
- Be entrepreneurial, ask smart questions, and champion new ideas.
- Take ownership of and accountability for your work.

Qualifications Required:

- 8-11 years of experience in application program development.
- Bachelor's degree in Engineering or Computer Science.
- Proficiency in Python, Databricks, Teradata, SQL, UNIX, ETL, data structures, Looker, Tableau, Git, Jenkins, and RESTful & GraphQL APIs.
- Experience with AWS services such as Glue, EMR, Lambda, Step Functions, CloudTrail, CloudWatch, SNS, SQS, S3, VPC, EC2, RDS, and IAM.

Additional Company Details: At CGI, life is rooted in ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to turn meaningful insights into action from day one. You will contribute to innovative solutions, build relationships, and access global capabilities while shaping your career in a supportive environment that prioritizes your growth and well-being. Join CGI, one of the largest IT and business consulting firms globally, to make a difference in the world of technology and consulting.

Posted 2 days ago


5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Work from Office

We are looking for a skilled Python Architect cum Developer to lead the design of real-time, event-driven microservices and AI-integrated backend systems. Must have strong expertise in Python, FastAPI, Kafka, and cloud-native AWS deployments.

Required candidate profile: expert in Python, FastAPI, Kafka, and AWS, with experience in AI/ML integration and event-driven microservices.

Posted 2 days ago


4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As an AWS Senior Solutions Engineer & Technical Account Manager (TAM) at our company, you will blend deep technical expertise with strategic client engagement to lead complex cloud projects, provide advanced support, and act as a trusted advisor to enterprise customers.

**Key Responsibilities:**

- **Technical Leadership & Project Delivery**
  - Lead the design, implementation, and optimization of AWS-based solutions.
  - Deliver Infrastructure as Code (IaC) using tools like Terraform, CloudFormation, and the Serverless Framework.
  - Conduct AWS Well-Architected Framework Reviews (WAFR) and support Migration Acceleration Program (MAP) engagements.
  - Ensure successful handover to operations through documentation and knowledge transfer.
- **Customer Engagement & Account Management**
  - Serve as the primary technical point of contact for assigned AWS customers.
  - Provide proactive guidance on architecture, cost optimization, security, and operational best practices.
  - Lead customer workshops, technical reviews, and roadmap planning sessions.
- **Advanced Support & Troubleshooting**
  - Provide expert-level support for various AWS services.
  - Troubleshoot complex infrastructure and application issues with minimal downtime.
  - Conduct root cause analysis and implement long-term solutions.
- **Pre-Sales & Solution Scoping**
  - Support pre-sales activities including scoping calls and workshops.
  - Identify opportunities for service expansion and collaborate with sales and delivery teams.
  - Contribute to the development of AWS service offerings.
- **Mentorship & Continuous Improvement**
  - Mentor junior engineers and consultants to foster a culture of learning and technical excellence.
  - Stay current with AWS innovations and certifications.

**Qualifications Required:**

- **Certifications:**
  - AWS Certified Solutions Architect - Professional (required)
  - AWS Certified DevOps Engineer / SysOps Administrator (preferred)
  - Additional AWS Specialty certifications are a plus
- **Technical Skills:**
  - 4+ years in AWS-focused engineering, consulting, or support roles
  - Strong experience across AWS services including compute, networking, storage, databases, monitoring, CI/CD, security, and IAM best practices
- **Soft Skills:**
  - Excellent communication and stakeholder management skills
  - Strong analytical and problem-solving abilities
  - Customer-first mindset with a proactive approach to issue resolution
  - Ability to lead cross-functional teams and manage technical risks

Posted 2 days ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Node.js Developer at Capgemini, you will play a crucial role in developing server-side applications and APIs. Your responsibilities will include:

- Designing and deploying cloud-native solutions using AWS services such as Lambda, API Gateway, S3, DynamoDB, EC2, and CloudFormation.
- Implementing and managing CI/CD pipelines for automated deployments.
- Optimizing application performance to ensure scalability and reliability.
- Collaborating with front-end developers, DevOps, and product teams to deliver end-to-end solutions.
- Monitoring and troubleshooting production issues in cloud environments.
- Writing clean, maintainable, and well-documented code.

To excel in this role, you should have the following qualifications:

- Strong proficiency in Node.js and JavaScript/TypeScript.
- Hands-on experience with AWS services (Lambda, API Gateway, S3, DynamoDB, etc.).
- Experience with serverless architecture and event-driven programming.
- Familiarity with Infrastructure as Code (IaC) tools such as CloudFormation or Terraform.
- Knowledge of RESTful APIs, authentication (OAuth, JWT), and microservices.
- Experience with CI/CD tools (e.g., GitHub Actions, Jenkins, AWS CodePipeline).
- Understanding of logging, monitoring, and alerting tools (e.g., CloudWatch, ELK Stack).

Capgemini offers a range of career paths and internal opportunities, allowing you to shape your career with personalized guidance from leaders. You will also benefit from comprehensive wellness benefits, access to a digital learning platform with 250,000+ courses, and the opportunity to work for a global business and technology transformation partner. Capgemini is a responsible and diverse group with a heritage of over 55 years and a presence in more than 50 countries. Trusted by clients to unlock the value of technology, Capgemini delivers end-to-end services and solutions leveraging strengths in AI, generative AI, cloud, and data, combined with deep industry expertise and a strong partner ecosystem.
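To make the Lambda / API Gateway / DynamoDB pattern above concrete: a minimal handler behind an API Gateway proxy integration that persists a record. The role calls for Node.js, but the event shape is identical across runtimes; this sketch is in Python for consistency with the other examples on this page, and the table name, key schema, and environment variable are hypothetical.

```python
import json
import os

import boto3

# Hypothetical table name injected via environment configuration.
table = boto3.resource("dynamodb").Table(os.environ["TABLE_NAME"])

def handler(event, context):
    """API Gateway proxy event -> validate -> persist -> JSON response."""
    body = json.loads(event.get("body") or "{}")
    if "id" not in body:
        return {"statusCode": 400, "body": json.dumps({"error": "id is required"})}
    table.put_item(Item={"pk": body["id"], "payload": body})
    return {"statusCode": 201, "body": json.dumps({"created": body["id"]})}
```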

Posted 2 days ago


3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Role Overview: YASH Technologies is seeking AWS professionals with expertise in AWS services such as Glue, PySpark, SQL, Databricks, and Python. As an AWS Data Engineer, you will design, develop, test, and support data pipelines and applications. The role requires a degree in computer science, engineering, or a related field, along with strong experience in data integration and pipeline development.

Key Responsibilities:

- Design, develop, test, and support data pipelines and applications using AWS services such as Glue, PySpark, SQL, Databricks, and Python.
- Work with a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda in S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
- Use SQL in the development of data warehouse projects/applications (Oracle & SQL Server).
- Develop in Python, especially PySpark, in an AWS cloud environment.
- Work with SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch.
- Manage workflows using tools such as Airflow.
- Use AWS cloud services such as RDS, AWS Lambda, AWS Glue, AWS Athena, and EMR.
- Familiarity with Snowflake and Palantir Foundry is a plus.

Qualifications Required:

- Bachelor's degree in computer science, engineering, or a related field.
- 3+ years of experience in data integration and pipeline development.
- Proficiency in Python, PySpark, SQL, and AWS.
- Strong experience with data integration using AWS cloud technologies.
- Experience with Apache Spark, Glue, Kafka, Kinesis, Lambda, S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
- Hands-on experience with SQL in data warehouse projects/applications.
- Familiarity with SQL and NoSQL databases.
- Knowledge of workflow management tools such as Airflow.
- Experience with AWS cloud services such as RDS, AWS Lambda, AWS Glue, AWS Athena, and EMR.

Note: YASH Technologies highlights an empowering work environment that promotes career growth, continuous learning, and a positive, inclusive team culture grounded in flexibility, trust, transparency, and support for achieving business goals.
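As a rough illustration of the Glue/PySpark development this role describes, here is a minimal Glue job skeleton that reads raw Parquet from S3, applies a trivial transformation, and writes a curated copy back. The bucket paths, column names, and filter predicate are hypothetical.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard AWS Glue job boilerplate.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical S3 locations and transformation.
raw = spark.read.parquet("s3://example-bucket/raw/orders/")
clean = raw.dropDuplicates(["order_id"]).filter("amount > 0")
clean.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```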

Posted 2 days ago


5.0 - 8.0 years

4 - 8 Lacs

Raipur

Work from Office

Exciting Career Opportunity: AWS Project Manager / Administrator

Position: AWS Project Manager / AWS Administrator
Location: Raipur, Chhattisgarh
Employment Type: Full-Time
Experience: 5-8 Years

We are seeking an experienced AWS Project Manager / AWS Administrator to join our team. This role requires a proactive professional with hands-on expertise in managing and optimizing AWS cloud environments, implementing automation strategies, and ensuring compliance with best practices.

Key Highlights of the Role:

- Manage and configure AWS cloud services (VPC, EB, RDS, EC2, S3, CloudFront, Load Balancers, NAT Gateways, etc.).
- Drive automation to enhance operational efficiency.
- Implement cost optimization strategies (Reserved Instances, Spot Instances, etc.).
- Ensure robust security, compliance, and disaster recovery practices.
- Design AWS architecture aligned with business requirements.

Qualifications:

- Minimum of 5 years' experience in AWS environments.
- Strong knowledge of AWS services and cloud-native architecture.
- Hands-on experience with Docker, Kubernetes, and CloudFormation.
- AWS Certified Solutions Architect certification (preferred).

If this opportunity aligns with your aspirations, we would be delighted to connect. Kindly share your updated resume to harikrishna@treeht.com with the following details:

- Current CTC
- Expected CTC
- Notice period
- Any offers / pipeline
- Current location
- Preferred location

Looking forward to hearing from you.

Posted 2 days ago


6.0 - 11.0 years

18 - 30 Lacs

Bengaluru

Work from Office

Role Overview: We are seeking a highly skilled Senior Lead AWS Expert with strong experience in Big Data Analytics to join our team. The ideal candidate will work independently, design scalable solutions, and support customer initiatives from the front line.

Non-Negotiable Requirements:

- Excellent communication and articulation skills
- Ability to work independently without supervision or guidance
- Proven hands-on experience with:
  - AWS S3 and Parquet files
  - AWS Lambda and Glue jobs
  - AWS EMR / Spark
  - AWS networking (PrivateLink, Route 53, cross-OU/account routing, etc.)
  - API gateways
  - Terraform
- Strong solution design capabilities

Other Key Details:

- Location: Bangalore
- Experience: minimum 6+ years of relevant experience
- Joining: immediate joiner preferred

Posted 2 days ago


10.0 - 16.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Role & Responsibilities

Skills: Data Engineer - Python, AWS, Glue, Lambda, API
Experience: 10+ years
Location: Gurugram
Notice period: Immediate

Preferred Candidate Profile

We are seeking an experienced Lead Data Engineer with strong expertise in Python, AWS cloud services, ETL pipelines, and system integrations. The ideal candidate will lead the design, development, and optimization of scalable data solutions and ensure seamless API and data integrations across systems. You will collaborate with cross-functional teams to implement robust DataOps and CI/CD pipelines.

Key Responsibilities:

- Implement scalable, secure, and high-performance data pipelines.
- Design and develop ETL processes using AWS services (Lambda, S3, Glue, Step Functions, etc.).
- Own and enhance API design and integrations for internal and external data systems.
- Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions.
- Drive DataOps practices for automation, monitoring, logging, testing, and continuous deployment.
- Develop CI/CD pipelines for automated deployment of data solutions.
- Conduct code reviews and mentor junior engineers in best practices for data engineering and cloud development.
- Ensure compliance with data governance, security, and privacy policies.

Required Skills & Experience:

- 10+ years of experience in data engineering, software development, or related fields.
- Strong programming skills in Python for building robust data applications.
- Expert knowledge of AWS services, particularly Lambda, S3, Glue, CloudWatch, and Step Functions.
- Proven experience designing and managing ETL pipelines for large-scale data processing.
- Experience with API design, RESTful services, and API integration workflows.
- Deep understanding of DataOps practices and principles.
- Hands-on experience implementing CI/CD pipelines (e.g., using CodePipeline, Jenkins, GitHub Actions).
- Familiarity with containerization tools like Docker and orchestration on ECS/EKS (optional but preferred).
- Strong understanding of data modeling, data warehousing concepts, and performance optimization.
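The Lambda/S3 ETL pattern this listing describes often starts with an S3-triggered Lambda as the first hop. A hedged sketch of that step; the bucket layout, CSV schema, and `status` filter are hypothetical:

```python
import csv
import io
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Triggered by an s3:ObjectCreated event; filters a CSV and re-writes it as JSON."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["object"]["key"])  # keys arrive URL-encoded

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = [r for r in csv.DictReader(io.StringIO(body)) if r.get("status") == "active"]

    s3.put_object(
        Bucket=bucket,
        Key=f"curated/{key}",
        Body=json.dumps(rows).encode("utf-8"),
    )
    return {"processed": len(rows)}
```

At larger volumes this hand-off is typically delegated to Glue or Step Functions, as the listing notes; the Lambda version is just the simplest end-to-end illustration.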

Posted 2 days ago


3.0 - 5.0 years

5 - 15 Lacs

Bengaluru

Work from Office

"DevOps/Infra Engineer with AWS (EC2, EBS, IAM, EKS, Route53, S3, CloudWatch), Linux admin, Terraform, Ansible, Jenkins CI/CD, Grafana, Kibana, ELK, Kafka, MySQL, MongoDB, Beanstalk, production systems & automation experience."

Posted 2 days ago


14.0 - 20.0 years

0 Lacs

Maharashtra

On-site

Role Overview: As a Principal Architect - Data & Cloud at Quantiphi, you will draw on extensive experience in technical, solutioning, and analytical roles to architect and design end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets. You will play a crucial role in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms such as GCP, AWS, and Azure. Your expertise will be instrumental in designing scalable data warehouse solutions on BigQuery or Redshift and in working with various data integration, storage, and pipeline tools on the cloud. You will also serve as a trusted technical advisor to customers, lead multiple data engagements on GCP, and contribute to the development of assets and accelerators.

Key Responsibilities:

- Bring more than 15 years of experience in technical, solutioning, and analytical roles.
- Bring 5+ years of experience building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms such as GCP, AWS, and Azure.
- Understand business requirements, translate them into functional and non-functional areas, and define boundaries in terms of availability, scalability, performance, security, and resilience.
- Architect, design, and implement end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets.
- Work with distributed computing and enterprise environments such as Hadoop and cloud platforms.
- Apply proficiency in data integration and ETL technologies on the cloud, such as Spark, PySpark/Scala, Dataflow, DataProc, and EMR.
- Apply deep knowledge of cloud and on-premise databases such as Cloud SQL, Cloud Spanner, BigTable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, and SQL Server.
- Draw on exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, and graph databases.
- Design scalable data warehouse solutions on the cloud with tools such as S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, and EMRFS.
- Work with machine learning frameworks such as TensorFlow and PyTorch.
- Understand cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Apply a good understanding of BI reporting and dashboarding tools such as Looker, Tableau, Power BI, SAP BO, Cognos, and Superset.
- Apply knowledge of security features and policies in cloud environments such as GCP, AWS, and Azure.
- Work on business transformation projects that move on-premise data solutions to cloud platforms.
- Serve as a trusted technical advisor to customers on complex cloud and data-related technical challenges.
- Be a thought leader in architecture design and development of cloud data analytics solutions.
- Liaise with internal and external stakeholders to design optimized data analytics solutions.
- Collaborate with SMEs and solutions architects from leading cloud providers to present solutions to customers.
- Support Quantiphi sales and GTM teams from a technical perspective in building proposals and SOWs.
- Lead discovery and design workshops with potential customers globally.
- Design and deliver thought-leadership webinars and tech talks with customers and partners.
- Identify areas for productization and feature enhancement for Quantiphi's product assets.

Qualifications Required:

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 14-20 years of experience in technical, solutioning, and analytical roles.
- Strong expertise in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms such as GCP, AWS, and Azure.
- Proficiency in data integration and ETL technologies on the cloud, plus cloud and on-premise databases.
- Experience with cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Knowledge of BI reporting and dashboarding tools and of security features in cloud environments.

Additional Company Details: While technology is the heart of Quantiphi's business, the company attributes its success to a global and diverse culture built on transparency, diversity, integrity, learning, and growth. Working at Quantiphi means being part of a culture that encourages innovation, excellence, and personal growth, and joining a dynamic team of tech enthusiasts dedicated to translating data into tangible business value for clients. Flexible remote working options are available to promote productivity and work-life balance.

Posted 3 days ago


6.0 - 10.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Senior Applied ML Engineer at Cozeva, you will bridge the gap between research and production, ensuring that Cozeva's AI innovations are successfully deployed, monitored, and scaled within the SaaS platform. Your expertise in machine learning models and enterprise-scale software systems will be instrumental in turning prototypes into high-performing, production-grade AI features that drive value-based care.

Key Responsibilities:

- Model Integration & Deployment: embed ML/AI models (LLMs, NLP, risk models, forecasting) into Cozeva's software workflows and APIs; build and maintain production pipelines for training, evaluation, and deployment on cloud-native infrastructure.
- Scalable Systems Engineering: design and manage distributed data pipelines for claims, clinical, and EHR data; ensure the performance, reliability, and cost efficiency of AI workloads on Aurora MySQL, Redshift, S3, and EC2/K8s.
- MLOps Practices: implement CI/CD for ML, including model versioning, automated retraining, monitoring, and rollback; monitor live model performance, drift, and fairness to comply with Cozeva's AI Governance Framework v1.1.
- Applied Problem Solving: collaborate with AI scientists to deploy models for NLP abstraction of clinical data, risk stratification, hospitalization/ED prediction, and member engagement.
- Collaboration: work closely with data engineers, SDEs, and product teams to deliver AI-driven features in Cozeva's SaaS stack.

Qualifications Required:

- Bachelor's or Master's in Computer Science, AI/ML, or a related field.
- 5-8 years of experience building and deploying ML models into production.
- Strong software engineering skills in Python, SQL, Docker/Kubernetes, and distributed systems.
- Experience with ML frameworks such as PyTorch, TensorFlow, and Hugging Face, and MLOps tools such as MLflow, Kubeflow, SageMaker, or equivalent.
- Proficiency in cloud infrastructure, preferably AWS (S3, EC2, RDS/Aurora, Redshift, IAM).
- Proven track record of delivering production-ready ML/AI features at scale.

Why Join Cozeva? By joining Cozeva, you will have the opportunity to:

- Make a meaningful impact on health equity and outcomes for millions of patients.
- Deploy cutting-edge models directly into a SaaS platform.
- Collaborate closely with the CTO, scientists, and engineering leaders.
- Work with top-tier AI talent globally in a culture of transparency, collaboration, and impact.
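The drift monitoring mentioned above is often approximated with the Population Stability Index (PSI) over a feature or score distribution. A minimal, framework-free sketch follows; the 0.1/0.25 thresholds are a commonly cited rule of thumb, not a Cozeva standard, and the sample data is synthetic:

```python
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between a baseline and a live sample."""
    # Bin edges come from the baseline distribution's quantiles.
    cuts = np.quantile(expected, np.linspace(0, 1, bins + 1))
    cuts[0], cuts[-1] = -np.inf, np.inf
    e = np.histogram(expected, cuts)[0] / len(expected)
    a = np.histogram(actual, cuts)[0] / len(actual)
    # Avoid division by zero / log(0) in sparse bins.
    e, a = np.clip(e, 1e-6, None), np.clip(a, 1e-6, None)
    return float(np.sum((a - e) * np.log(a / e)))

# Rule of thumb: < 0.1 stable, 0.1-0.25 watch, > 0.25 likely drift.
baseline = np.random.normal(0.0, 1.0, 10_000)  # stand-in for training-time scores
live = np.random.normal(0.3, 1.0, 10_000)      # stand-in for production scores
print(f"PSI = {psi(baseline, live):.3f}")
```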

Posted 3 days ago


7.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a highly skilled Software Architect with strong expertise in Node.js and AWS, you will drive the design and development of scalable, secure, and high-performance applications. You will lead a team of developers, guide architectural decisions, and collaborate with stakeholders to deliver cloud-native enterprise solutions.

Key Responsibilities:

- Lead and mentor a team of backend engineers building enterprise-grade applications.
- Architect and implement scalable microservices using Node.js and AWS cloud services.
- Drive technical discussions and code reviews, and enforce best practices.
- Design and manage serverless architectures using AWS Lambda, API Gateway, S3, DynamoDB, and CloudFormation/CDK.
- Collaborate with product managers, DevOps, and QA teams to deliver high-quality releases.
- Ensure application performance, security, reliability, and cost optimization on AWS.
- Troubleshoot production issues and guide root cause analysis.
- Stay current with emerging technologies, tools, and industry trends, and propose adoption where appropriate.

Qualifications Required:

- Bachelor's or Master's degree in Computer Science, Engineering, or equivalent.
- 7-12 years of hands-on development experience, with at least 3+ years in a lead/architect role.
- Strong proficiency in Node.js and JavaScript/TypeScript.
- Proven expertise in AWS services (Lambda, API Gateway, DynamoDB, S3, CloudWatch, ECS/EKS, RDS, IAM).
- Solid understanding of microservices architecture, REST APIs, GraphQL, and event-driven systems (Kafka/SQS).
- Experience with CI/CD pipelines and DevOps practices (Jenkins, GitHub Actions, CodePipeline).
- Knowledge of containerization and orchestration (Docker, Kubernetes).
- Strong understanding of application security, scalability, and performance optimization.
- Excellent problem-solving, communication, and leadership skills.

Good to Have:

- Experience with Terraform / AWS CDK / CloudFormation (Infrastructure as Code).
- Exposure to frontend frameworks (React/Angular) for end-to-end understanding.
- Experience with data pipelines / big data / analytics platforms on AWS.
- Prior experience working in Agile/Scrum environments.

Posted 3 days ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Role Overview: Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, supported and inspired by a collaborative community of colleagues around the world, and able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Key Responsibilities:

- Design and develop Java-based applications using Spring Boot and RESTful APIs.
- Build and deploy microservices on AWS using services such as EC2, Lambda, S3, RDS, DynamoDB, and Elastic Beanstalk.
- Implement CI/CD pipelines using tools such as Jenkins, Maven, Bitbucket, and Bamboo.
- Work with cloud-native architectures and containerization tools such as Docker and Kubernetes.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Write unit and integration tests using test-driven development (TDD) practices.
- Monitor and troubleshoot production issues and optimize application performance.
- Maintain documentation for code, processes, and infrastructure.

Qualifications Required:

- 5+ years of experience in Java development with strong knowledge of Spring/Spring Boot.
- 3+ years of hands-on experience with AWS services.
- Experience with microservices architecture and RESTful API development.
- Proficiency in SQL and NoSQL databases (e.g., PostgreSQL, DynamoDB).
- Familiarity with DevOps tools and practices.
- Strong understanding of cloud security, scalability, and performance tuning.
- Excellent problem-solving and communication skills.

Company Details: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong 55-plus-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.

Posted 3 days ago


5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

We are seeking a Senior Data Engineer to lead the development of a scalable data ingestion framework. You will be responsible for ensuring high data quality and validation, designing and implementing robust APIs for seamless data integration, and building and managing big data pipelines using modern AWS-based technologies, keeping quality and efficiency at the forefront of data processing systems.

Key Responsibilities:

- Data Ingestion Framework:
  - Design & Development: architect, develop, and maintain an end-to-end data ingestion framework that efficiently extracts, transforms, and loads data from diverse sources.
  - Framework Optimization: use AWS services such as AWS Glue, Lambda, EMR, ECS, EC2, and Step Functions to build highly scalable, resilient, and automated data pipelines.
- Data Quality & Validation:
  - Validation Processes: develop and implement automated data quality checks, validation routines, and error-handling mechanisms to ensure the accuracy and integrity of incoming data.
  - Monitoring & Reporting: establish comprehensive monitoring, logging, and alerting systems to proactively identify and resolve data quality issues.
- API Development:
  - Design & Implementation: architect and develop secure, high-performance APIs to enable seamless integration of data services with external applications and internal systems.
  - Documentation & Best Practices: create thorough API documentation and establish standards for API security, versioning, and performance optimization.
- Collaboration & Agile Practices:
  - Cross-Functional Communication: work closely with business stakeholders, data scientists, and operations teams to understand requirements and translate them into technical solutions.
  - Agile Development: participate in sprint planning, code reviews, and agile ceremonies, while contributing to continuous improvement initiatives and CI/CD pipeline development.

Required Qualifications:

- Experience & Technical Skills:
  - Professional Background: at least 5 years of relevant experience in data engineering, with a strong emphasis on analytical platform development.
  - Programming Skills: proficiency in Python and/or PySpark and SQL for developing ETL processes and handling large-scale data manipulation.
  - AWS Expertise: extensive experience using AWS services including AWS Glue, Lambda, Step Functions, and S3 to build and manage data ingestion frameworks.
  - Data Platforms: familiarity with big data systems (e.g., AWS EMR, Apache Spark, Apache Iceberg) and databases such as DynamoDB, Aurora, Postgres, or Redshift.
  - API Development: proven experience designing and implementing RESTful APIs and integrating them with external and internal systems.
  - CI/CD & Agile: hands-on experience with CI/CD pipelines (preferably GitLab) and Agile development methodologies.
- Soft Skills:
  - Strong problem-solving abilities and attention to detail.
  - Excellent communication and interpersonal skills, with the ability to work independently and collaboratively.
  - Capacity to quickly learn and adapt to new technologies and evolving business requirements.

Preferred Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience with additional AWS services such as Kinesis, Firehose, and SQS.
- Familiarity with data lakehouse architectures and modern data quality frameworks.
- Prior experience in a role that required proactive data quality management and API-driven integrations in complex, multi-cluster environments.

Please note that the job is based in Kochi and Thiruvananthapuram, and only local candidates are eligible to apply. This is a full-time position that requires in-person work.

Experience Required:

- AWS: 7 years
- Python: 7 years
- PySpark: 7 years
- ETL: 7 years
- CI/CD: 7 years

Location: Kochi, Kerala
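The automated data quality checks described above often boil down to simple assertions run at the end of each ingestion step. A hedged PySpark sketch; the S3 path, `order_id` column, and fail-fast behavior are hypothetical choices, not requirements from this listing:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/ingested/orders/")  # hypothetical path

# Assertion-style checks; names and rules are illustrative only.
checks = {
    "non_empty": df.count() > 0,
    "no_null_ids": df.filter(F.col("order_id").isNull()).count() == 0,
    "unique_ids": df.count() == df.select("order_id").distinct().count(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    # Raising lets the orchestrator (Step Functions, Airflow, ...) alert and retry.
    raise ValueError(f"Data quality checks failed: {failed}")
```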

Posted 3 days ago


5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Full Stack Software Developer, you will leverage your 5+ years of professional experience to contribute to cutting-edge projects. You should be proficient in front-end technologies such as JavaScript, TypeScript, React, or AngularJS, with strong knowledge of back-end development using Node.js, Python, or Java. Your expertise with databases such as PostgreSQL, MySQL, or MongoDB, and with AWS cloud services such as EC2, S3, Lambda, API Gateway, RDS, and DynamoDB, will be crucial to the successful execution of projects.

Key Responsibilities:

- Develop and maintain front-end and back-end components using a variety of technologies.
- Work with SQL and NoSQL databases to ensure efficient data storage and retrieval.
- Implement cloud services on AWS, focusing on security, scalability, and performance optimization.
- Collaborate with team members to design and implement RESTful APIs, GraphQL, and microservices architecture.
- Use version control systems such as Git and follow development workflows for continuous integration and deployment.
- Apply problem-solving skills and attention to detail in both independent and collaborative settings.

Qualifications:

- 5+ years of professional experience in full stack software development.
- Proficiency in front-end technologies such as JavaScript, TypeScript, React, or AngularJS.
- Strong knowledge of back-end development using Node.js, Python, or Java.
- Familiarity with AWS cloud services, databases, RESTful APIs, and microservices architecture.
- Experience with Git, CI/CD workflows, and infrastructure as code using Terraform or AWS CloudFormation.

About the Company: Grid Dynamics (NASDAQ: GDYN) is a prominent technology consulting and engineering firm specializing in AI, advanced analytics, and platform services. With a focus on solving technical challenges and driving positive business outcomes for enterprise clients, Grid Dynamics stands out for its expertise in enterprise AI and continuous investment in data, analytics, cloud & DevOps, application modernization, and customer experience. Founded in 2006, Grid Dynamics has a global presence, with headquarters in Silicon Valley and offices across the Americas, Europe, and India. Join a highly motivated team working on bleeding-edge projects in a flexible and supportive environment with competitive compensation and benefits, professional development opportunities, and a well-equipped office.

Posted 3 days ago


5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

Role Overview: You will apply your technical expertise in data engineering to support cybersecurity initiatives. Your role will involve integrating data science workflows, deploying ML models, and ensuring data quality and governance. You will also be involved in data analytics and visualization, CI/CD automation, and working in Agile/Scrum environments.

Key Responsibilities:

- Apply strong skills in Python, PySpark, SQL, Databricks, and big data frameworks to data engineering tasks.
- Integrate data science workflows and deploy ML models within a cybersecurity context.
- Demonstrate advanced proficiency in AWS services such as EC2, S3, Lambda, and ELB, and in container orchestration (Docker, Kubernetes).
- Implement automated data quality checks, data lineage, and governance standards to ensure data integrity and compliance.
- Use analytics and visualization tools such as Databricks, Power BI, and Tableau to generate actionable insights into cybersecurity risks.
- Build CI/CD pipelines to automate testing, security scans, and deployment processes.
- Work in Agile/Scrum environments to deliver complex projects through cross-functional collaboration.

Qualifications Required:

- B.E. in Computer Science, Data Science, Information Systems, or a related field, or equivalent professional experience.
- 5+ years of experience in data engineering with expertise in Python, PySpark, SQL, Databricks, and big data frameworks.
- Proficiency in AWS services, data quality checks, analytics tools, CI/CD pipelines, and Agile/Scrum methodologies.

About Us: Oracle, a world leader in cloud solutions, uses cutting-edge technology to address current challenges. With a commitment to inclusivity and innovation, Oracle offers global opportunities for career growth while promoting work-life balance. Competitive benefits and flexible medical, life insurance, and retirement options support employees, and Oracle encourages community engagement through volunteer programs. Oracle is dedicated to including people with disabilities in the workforce; if you require accessibility assistance, email accommodation-request_mb@oracle.com or call +1 888 404 2494 in the United States.

Posted 3 days ago


15.0 - 19.0 years

0 Lacs

Karnataka

On-site

Role Overview: You will serve as a GCP (Google Cloud Platform) Architect, bringing 15+ years of experience to the team in Bengaluru (Brigade Road). You will play a crucial role in designing and implementing cloud solutions using GCP services.

Key Responsibilities:

- Hold a Bachelor's degree or higher in computer science, information systems, engineering, mathematics, or a related technical discipline.
- Demonstrate a detailed understanding of key AWS services such as EC2, EBS, Monitoring, AWS Backup, EFS, FSx, S3, KMS, Lambda, the AWS CLI, IAM, and VPC.
- Apply hands-on experience creating EC2, EBS, and related infrastructure with tools such as Terraform, Ansible, or the AWS CLI.
- Review AWS security features; audit CloudTrail, IAM security settings, environment security features, VPC security groups, inbound/outbound traffic, NACLs, and encryption.
- Establish and maintain backups of EC2 instances, with daily validation.
- Maintain a comprehensive understanding of EBS, S3, and Glacier, and hold at least two professional certifications related to cloud architecture.

Qualifications Required:

- Bachelor's degree or higher in computer science, information systems, engineering, mathematics, or a related technical discipline.
- Detailed knowledge of key AWS services such as EC2, EBS, Monitoring, AWS Backup, EFS, FSx, S3, KMS, Lambda, the AWS CLI, IAM, and VPC.
- Hands-on experience creating EC2, EBS, and related infrastructure using tools such as Terraform, Ansible, or the AWS CLI.
- Familiarity with AWS security features, including CloudTrail, IAM security settings, VPC security groups, NACLs, and encryption.
- Proficiency in managing backups of EC2 instances and a thorough understanding of EBS, S3, and Glacier.
- At least two professional certifications related to cloud architecture.

Additional Details of the Company: Join Resolve Tech Solutions on a growth journey where you can expand your expertise and contribute to our clients' success through traditional and emerging technologies. As a leading technology solutions provider, we focus on advanced data analytics, smart technology solutions, and solution-specific accelerators. Our collaborative culture emphasizes teamwork and mentorship to support personal and professional growth.
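Security reviews of the kind listed here are frequently scripted. A minimal boto3 audit sketch that flags buckets without a default-encryption configuration; note that buckets created since early 2023 are encrypted by default, so this is illustrative rather than a complete audit:

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        s3.get_bucket_encryption(Bucket=name)  # raises if no config exists
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            print(f"{name}: no default encryption configured")
        else:
            raise  # surface permission errors and other failures
```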

Posted 3 days ago


6.0 - 10.0 years

0 Lacs

Delhi

On-site

As a Senior Full Stack Developer & AWS Team Lead, you will lead the company's technology foundation, leveraging strong AWS expertise. You will own end-to-end design, development, and deployment of full stack applications, architect scalable AWS cloud infrastructure, and develop frontend and backend applications using technologies such as React.js/Next.js, Node.js/Nest.js, MongoDB, PostgreSQL/MySQL, and Redis. You will also establish coding standards, CI/CD pipelines, and DevOps practices while mentoring a growing team of developers, collaborating with product, design, and business teams to translate vision into technology solutions, and driving innovation with AI/ML integrations.

Key Responsibilities:

- Lead end-to-end design, development, and deployment of full stack applications.
- Architect scalable AWS cloud infrastructure following best practices.
- Develop and optimize frontend (React.js/Next.js) and backend (Node.js/Nest.js) applications.
- Set up and manage databases (MongoDB, PostgreSQL/MySQL) and caching (Redis).
- Establish coding standards, CI/CD pipelines, and DevOps practices for the team.
- Mentor and guide a growing team of developers.
- Collaborate with product, design, and business teams to translate vision into technology.
- Ensure compliance with data privacy and security standards.
- Drive innovation with AI/ML integrations and modular product evolution.

Qualifications Required:

- Full stack development experience for web and mobile applications.
- Proficiency in React Native, React.js/Next.js, JavaScript, TypeScript, and CSS frameworks.
- Strong knowledge of Node.js, REST APIs, GraphQL, and microservices architecture design.
- Experience with MongoDB, PostgreSQL/MySQL, and Redis for database management.
- Hands-on experience with AWS cloud services and DevOps practices.
- Leadership experience managing development teams.
- Familiarity with Agile/Scrum practices and strong problem-solving skills.

Additional Details:

- The company offers an ownership-driven culture with growth opportunities into CTO-level leadership.
- You will work on impactful health & wellness tech solutions shaping India's future.
- A flexible work environment and an innovation-driven team are part of the company culture.

Preferred / Bonus Skills:

- Experience in Health Tech, EdTech, or aggregator platforms.
- Knowledge of AI/ML integration, such as chatbots and recommendation engines.
- Prior startup experience with an ownership mindset.

Experience Required:

- Minimum 6-10 years in full stack development.
- At least 3 years of hands-on AWS cloud architecture experience.
- Proven leadership skills in managing development teams.

Posted 3 days ago


Exploring S3 Jobs in India

The job market for S3 professionals in India is growing rapidly with the increasing demand for cloud computing services. Companies are looking for skilled individuals who can effectively manage and optimize their S3 storage solutions. If you are considering a career working with S3, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for S3 professionals in India varies by experience level:

- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum

Career Path

A typical career path for S3 professionals may include the following progression:

- Junior S3 Engineer
- S3 Developer
- S3 Architect
- S3 Specialist
- S3 Consultant

Related Skills

In addition to expertise in S3, employers often look for professionals with the following skills:

- Cloud computing knowledge
- AWS certification
- Data management skills
- Programming skills (e.g., Python, Java)
- Problem-solving abilities

Interview Questions

  • What is Amazon S3 and how does it work? (basic)
  • How do you secure S3 buckets? (medium)
  • Explain the difference between S3 and EBS. (medium)
  • How can you improve the performance of S3? (medium)
  • What are the different storage classes in S3? (basic)
  • What is the maximum size of an S3 object? (basic)
  • How would you troubleshoot slow S3 performance? (medium)
  • What is versioning in S3 and why is it useful? (basic)
  • Explain the significance of object lifecycle policies in S3. (advanced; see the sketch after this list)
  • How do you monitor S3 storage usage and performance? (medium)
  • Describe the process of transferring data to and from S3. (basic)
  • What is cross-region replication in S3? (medium)
  • How do you handle encryption in S3? (medium)
  • What are the limitations of S3? (medium)
  • How do you handle data consistency in S3? (advanced)
  • Explain the concept of event notifications in S3. (medium)
  • How do you manage permissions in S3? (basic)
  • What is the difference between S3 and EFS? (medium)
  • How can you optimize costs in S3 storage? (medium)
  • Describe the process of hosting a static website on S3. (medium)
  • What is the significance of multipart uploads in S3? (medium)
  • How do you handle versioning conflicts in S3? (advanced)
  • Explain the concept of pre-signed URLs in S3. (advanced; see the sketch after this list)
  • How do you handle data archiving in S3? (medium)
  • What are the best practices for S3 bucket naming conventions? (basic)
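For the lifecycle-policy and pre-signed-URL questions above, a minimal boto3 sketch of each; the bucket name, prefix, object key, and retention periods are hypothetical, and the calls assume credentials with the relevant S3 permissions:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-bucket"  # hypothetical

# Object lifecycle policy: transition logs to Glacier after 90 days,
# then expire them after a year.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-then-expire",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }]
    },
)

# Pre-signed URL: time-limited download access to a private object
# without changing bucket permissions.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": "reports/q1.pdf"},  # hypothetical key
    ExpiresIn=3600,  # one hour
)
print(url)
```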

Closing Remark

As you explore opportunities in the S3 job market in India, remember to showcase your expertise, skills, and knowledge confidently during interviews. With the right preparation and a positive attitude, you can excel in S3 roles and contribute effectively to the growing field of cloud computing. Good luck in your job search!
