
17 Boto3 Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

9 - 14 Lacs

Bengaluru

Work from Office


8+ years of combined experience across backend and data platform engineering roles, having worked on large-scale distributed systems. 5+ years of experience building data platforms with (one of) Apache Spark, Flink, or similar frameworks. 7+ years of programming experience in Java. Experience building large-scale data/event pipelines. Experience with relational SQL and NoSQL databases, including Postgres/MySQL, Cassandra, and MongoDB. Demonstrated experience with EKS, EMR, S3, IAM, KDA, Athena, Lambda, networking, ElastiCache, and other AWS services.
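Most of the AWS services named in listings like this one are driven from Python through Boto3. As a minimal, hypothetical sketch (the bucket name and client wiring are illustrative, and the helper is duck-typed so any object exposing `list_objects_v2` works), here is the standard pagination pattern for listing S3 objects:

```python
def iter_s3_keys(s3_client, bucket, prefix=""):
    """Yield every object key under a prefix, following S3 pagination.

    `s3_client` is anything exposing list_objects_v2 with the boto3
    response shape; in real use you would pass boto3.client("s3").
    """
    token = None
    while True:
        kwargs = {"Bucket": bucket, "Prefix": prefix}
        if token:
            kwargs["ContinuationToken"] = token
        page = s3_client.list_objects_v2(**kwargs)
        for obj in page.get("Contents", []):
            yield obj["Key"]
        # IsTruncated signals another page; follow the continuation token.
        if not page.get("IsTruncated"):
            break
        token = page["NextContinuationToken"]
```

In production code you would typically use a boto3 paginator (`client.get_paginator("list_objects_v2")`) instead; the loop above just makes the mechanism explicit.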

Posted 1 week ago


8.0 - 11.0 years

7 - 11 Lacs

Hyderabad

Work from Office


HIH - Software Engineering Associate Advisor

Position Overview: The successful candidate will be a member of our US Medical Integration Solutions ETL team. They will play a major role in the design and development of the ETL application in support of various portfolio projects.

Responsibilities: Analyze business requirements and translate them into ETL architecture and data rules. Serve as advisor and subject matter expert on project teams. Manage both employees and consultants on multiple ETL projects. Oversee and review all design and coding from developers to ensure they follow company standards and best practices, as well as architectural direction. Assist in data analysis and metadata management. Plan and execute testing. Operate effectively within a team of technical and business professionals. Assess new talent and mentor direct reports on best practices. Review all designs and code from developers.

Qualifications - Desired Skills & Experience: 8-11 years of experience in Java and Python/PySpark to support new development as well as existing systems. 7+ years of experience with cloud technologies, specifically AWS. Experience with AWS services such as Lambda, Glue, S3, MWAA, API Gateway, Route53, DynamoDB, RDS MySQL, SQS, CloudWatch, Secrets Manager, KMS, IAM, EC2 with Auto Scaling Groups, VPCs, and Security Groups. Experience with Boto3, Pandas, and Terraform for building Infrastructure as Code. Experience with the IBM DataStage ETL tool. Experience with CI/CD methodologies and the development of these processes. DevOps experience. Knowledge of writing SQL. Data mapping: source to target, and target to multiple formats. Experience developing data extraction and load processes in a parallel framework. Understanding of normalized and de-normalized data repositories. Ability to define ETL standards and processes. SQL standards/processes/tools: mapping of data sources; ETL development, monitoring, reporting, and metrics; focus on data quality. Experience with DB2/z/OS, Oracle, SQL Server, Teradata, and other database environments. Unix experience. Excellent problem-solving and organizational skills. Strong teamwork and interpersonal skills and the ability to communicate with all management levels. Leads others toward technical accomplishments and collaborative project team efforts. Very strong communication skills, both verbal and written, including technical writing. Strong analytical and conceptual skills.

Location & Hours of Work: (Specify whether the position is remote, hybrid, or in-office, where the role is located, and the required hours of work.)

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 1 week ago


4.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office


The Cigna International Health unit uses Amazon Web Services (AWS) services, along with custom proprietary solutions implemented in AWS, to pre-process paper health care claims from around the world. Daily volumes are expected to grow significantly, as expansion of the initiative is one of the top priorities for International Health. The engineer is expected to build and support solutions that pre-process paper image claims to extract data, build pipelines using serverless solutions, and invoke AI/ML processes to populate claim data from the submitted claims. The engineer will also work on building metrics, monitoring, and operational dashboards. Required Skills: Strong hands-on experience with Python, Boto3, and test-driven development techniques such as unit testing and gameday testing. Hands-on experience writing unit tests in Python. Hands-on experience with common AWS services such as Lambda, Step Functions, DynamoDB, S3, and CloudWatch. Experience deploying applications to development and test environments. Joins an existing team and learns rapidly about the overall goals of the solution. Collaborates with the rest of the team to explore paths toward those goals. Participates in peer reviews and deployments, and understands that work is not complete until it is implemented: when analysis is complete and decisions have been made, the work has only just begun. Embraces an agile mindset, adjusting to best achieve the overall goals rather than staying locked into initial decisions, while developing plans in advance to strike a healthy balance of preparedness and flexibility for each situation's needs. Rapidly raises defects and reflects on where prior judgment was incorrect in the spirit of growth: good news travels fast, bad news faster. Addresses the mistakes of others in the spirit of learning and growth, and models these behaviors in the team retrospective.
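The combination of Python unit testing with Lambda and DynamoDB described above usually hinges on dependency injection, so handlers can be exercised without touching AWS. A minimal sketch under stated assumptions (the `claims` table name, field names, and factory shape are all hypothetical; in real use `table` would be `boto3.resource("dynamodb").Table("claims")`):

```python
import json

def make_handler(table):
    """Build a Lambda-style handler with its DynamoDB table injected,
    so a unit test can pass a fake table instead of a real one."""
    def handler(event, context=None):
        record = {
            "claim_id": event["claim_id"],  # hypothetical field name
            "status": "RECEIVED",
        }
        table.put_item(Item=record)  # same call shape as a boto3 Table
        return {"statusCode": 200, "body": json.dumps(record)}
    return handler
```

A test then substitutes a fake object that records `put_item` calls and asserts on the handler's response, which is the unit-testing style the listing asks for.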
About The Cigna Group Cigna Healthcare, a division of The Cigna Group, is an advocate for better health through every stage of life. We guide our customers through the health care system, empowering them with the information and insight they need to make the best choices for improving their health and vitality. Join us in driving growth and improving lives.

Posted 2 weeks ago


1.0 - 5.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Minimum Qualifications:
- BA/BSc/B.E./B.Tech degree from a Tier I/II college in Computer Science, Statistics, Mathematics, Economics, or a related field
- 1 to 4 years of experience working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations
- Ability to write SQL code and familiarity with R/Python and Linux shell commands
- Willingness and ability to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication

Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or similar
- Experience in geo-spatial analysis with PostGIS, QGIS

Posted 3 weeks ago


4.0 - 9.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Minimum Qualifications:
- BA/BSc/B.E./B.Tech degree from a Tier I/II college in Computer Science, Statistics, Mathematics, Economics, or a related field
- 1 to 4 years of experience working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations
- Ability to write SQL code and familiarity with R/Python and Linux shell commands
- Willingness and ability to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication

Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or similar
- Experience in geo-spatial analysis with PostGIS, QGIS

Posted 3 weeks ago


3.0 - 8.0 years

9 - 18 Lacs

Hyderabad

Hybrid


Data Engineer with Python development experience. Experience: 3+ years. Mode: Hybrid (2-3 days/week). Location: Hyderabad.

Key Responsibilities: Develop, test, and deploy data processing pipelines using AWS serverless technologies such as AWS Lambda, Step Functions, DynamoDB, and S3. Implement ETL processes to transform and process structured and unstructured data efficiently. Collaborate with business analysts and other developers to understand requirements and deliver solutions that meet business needs. Write clean, maintainable, and well-documented code following best practices. Monitor and optimize the performance and cost of serverless applications. Ensure high availability and reliability of the pipeline through proper design and error-handling mechanisms. Troubleshoot and debug issues in serverless applications and data workflows. Stay up to date with emerging technologies in the AWS and serverless ecosystem to recommend improvements.

Required Skills and Experience: 3-5 years of hands-on Python development experience, including experience with libraries like Boto3, Pandas, or similar tools for data processing. Strong knowledge of AWS services, especially Lambda, S3, DynamoDB, Step Functions, SNS, SQS, and API Gateway. Experience building data pipelines or workflows to process and transform large datasets. Familiarity with serverless architecture and event-driven programming. Knowledge of best practices for designing secure and scalable serverless applications. Proficiency in version control systems (e.g., Git) and collaboration tools. Understanding of CI/CD pipelines and DevOps practices. Strong debugging and problem-solving skills. Familiarity with database systems, both SQL (e.g., RDS) and NoSQL (e.g., DynamoDB).

Preferred Qualifications: AWS certifications (e.g., AWS Certified Developer Associate or AWS Certified Solutions Architect Associate). Familiarity with testing frameworks (e.g., pytest) and ensuring test coverage for Python applications. Experience with Infrastructure as Code (IaC) tools such as AWS CDK and CloudFormation. Knowledge of monitoring and logging tools.
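The ETL transformation step in a serverless pipeline like the one described is usually a pure function, which also keeps it trivially unit-testable. A minimal sketch, assuming entirely hypothetical field names (`id`, `amount`, `country`) since the listing does not specify a schema:

```python
def transform_claims(rows):
    """One illustrative ETL step: normalise raw rows into clean records,
    dropping rows without an amount. In a Lambda-based pipeline this
    would run over each batch delivered by an S3 or SQS event."""
    out = []
    for row in rows:
        amount = row.get("amount")
        if amount in (None, ""):
            continue  # skip incomplete rows rather than failing the batch
        out.append({
            "id": str(row["id"]).strip(),
            "amount": round(float(amount), 2),
            "country": row.get("country", "unknown").lower(),
        })
    return out
```

Keeping the transform free of AWS calls means the same function can be exercised by pytest locally and wired into Lambda, Glue, or Step Functions unchanged.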

Posted 3 weeks ago


5 - 8 years

8 - 10 Lacs

Bengaluru

Work from Office


Mandatory Skill Set: DevOps, AWS, Python scripting, Kubernetes, CI/CD pipelines, automation, ad-hoc & post-production support, Boto3

Posted 1 month ago


4 - 9 years

12 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Role & responsibilities: Urgent hiring for one of the reputed MNCs. Immediate joiners only. Female candidates. Experience: 4-9 years. Bangalore / Hyderabad / Pune. As a Python Developer with AWS, you will be responsible for developing cloud-based applications, building data pipelines, and integrating with various AWS services. You will work closely with DevOps, Data Engineering, and Product teams to design and deploy solutions that are scalable, resilient, and efficient in an AWS cloud environment.

Key Responsibilities: Python Development: Design, develop, and maintain applications and services using Python in a cloud environment. AWS Cloud Services: Leverage AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway to build scalable solutions. Data Pipelines: Develop and maintain data pipelines, including integrating data from various sources into AWS-based storage solutions. API Integration: Design and integrate RESTful APIs for application communication and data exchange. Cloud Optimization: Monitor and optimize cloud resources for cost efficiency, performance, and security. Automation: Automate workflows and deployment processes using AWS Lambda, CloudFormation, and other automation tools. Security & Compliance: Implement security best practices (e.g., IAM roles, encryption) to protect data and maintain compliance within the cloud environment. Collaboration: Work with DevOps, Cloud Engineers, and other developers to ensure seamless deployment and integration of applications. Continuous Improvement: Participate in the continuous improvement of development processes and deployment practices.

Required Qualifications: Python Expertise: Strong experience in Python programming, including using libraries like Pandas, NumPy, and Boto3 (the AWS SDK for Python), and frameworks like Flask or Django. AWS Knowledge: Hands-on experience with AWS services such as S3, EC2, Lambda, RDS, DynamoDB, CloudFormation, and API Gateway. Cloud Infrastructure: Experience in designing, deploying, and maintaining cloud-based applications using AWS. API Development: Experience in designing and developing RESTful APIs, integrating with external services, and managing data exchanges. Automation & Scripting: Experience with automation tools and scripts (e.g., using AWS Lambda, Boto3, CloudFormation). Version Control: Proficiency with version control tools such as Git. CI/CD Pipelines: Experience building and maintaining CI/CD pipelines for cloud-based applications.

Preferred candidate profile: Familiarity with serverless architectures using AWS Lambda and other AWS serverless services. AWS certification (e.g., AWS Certified Developer Associate, AWS Certified Solutions Architect Associate) is a plus. Knowledge of containerization tools like Docker and orchestration platforms such as Kubernetes. Experience with Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation.

Posted 1 month ago


4 - 9 years

7 - 12 Lacs

Pune

Work from Office


About The Role: As a Python Developer with AWS, you will be responsible for developing cloud-based applications, building data pipelines, and integrating with various AWS services. You will work closely with DevOps, Data Engineering, and Product teams to design and deploy solutions that are scalable, resilient, and efficient in an AWS cloud environment. Python Development: Design, develop, and maintain applications and services using Python in a cloud environment. AWS Cloud Services: Leverage AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway to build scalable solutions. Data Pipelines: Develop and maintain data pipelines, including integrating data from various sources into AWS-based storage solutions. API Integration: Design and integrate RESTful APIs for application communication and data exchange. Cloud Optimization: Monitor and optimize cloud resources for cost efficiency, performance, and security. Automation: Automate workflows and deployment processes using AWS Lambda, CloudFormation, and other automation tools. Security & Compliance: Implement security best practices (e.g., IAM roles, encryption) to protect data and maintain compliance within the cloud environment. Collaboration: Work with DevOps, Cloud Engineers, and other developers to ensure seamless deployment and integration of applications. Continuous Improvement: Participate in the continuous improvement of development processes and deployment practices.

Primary Skills: Python Expertise: Strong experience in Python programming, including using libraries like Pandas, NumPy, and Boto3 (the AWS SDK for Python), and frameworks like Flask or Django. AWS Knowledge: Hands-on experience with AWS services such as S3, EC2, Lambda, RDS, DynamoDB, CloudFormation, and API Gateway. Cloud Infrastructure: Experience in designing, deploying, and maintaining cloud-based applications using AWS. API Development: Experience in designing and developing RESTful APIs, integrating with external services, and managing data exchanges. Automation & Scripting: Experience with automation tools and scripts (e.g., using AWS Lambda, Boto3, CloudFormation). Version Control: Proficiency with version control tools such as Git. CI/CD Pipelines: Experience building and maintaining CI/CD pipelines for cloud-based applications.

Secondary Skills

Posted 2 months ago


2 - 4 years

5 - 14 Lacs

Greater Noida, Noida

Work from Office


Dear All, we have openings for a Python developer with Angular/Boto3 for our Noida location (work from office).

Posted 2 months ago


5 - 10 years

12 - 22 Lacs

Bengaluru

Remote


Hi Candidates, we have job openings in one of our MNC clients.

Role & responsibilities (Data Engineer JD). Requirements for the candidate: Data Engineer with a minimum of 5+ years of data engineering experience. The role will require deep knowledge of data engineering techniques to create data pipelines and build data assets. At least 4+ years of strong hands-on programming experience with PySpark / Python / Boto3, including Python frameworks and libraries, following Python best practices. Strong experience in code optimisation using Spark SQL and PySpark. Understanding of code versioning, Git repositories, and JFrog Artifactory. AWS architecture knowledge, especially S3, EC2, Lambda, Redshift, and CloudFormation, with the ability to explain the benefits of each. Code refactoring of legacy codebases: clean, modernize, and improve readability and maintainability. Unit tests / TDD: write tests before code, ensure functionality, and catch bugs early. Fixing difficult bugs: debug complex code, isolate issues, and resolve performance, concurrency, or logic flaws. Preferred candidate profile. Perks and benefits.
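One recurring "difficult bug" in the AWS work this listing describes is transient throttling of API calls, conventionally handled with exponential backoff. A minimal sketch, assuming an injectable `sleep` (so tests run instantly) and a generic callable rather than a specific boto3 API; boto3 itself also offers built-in retry modes via `botocore` config, which would be the first choice in practice:

```python
import time

def call_with_retries(fn, *, retries=4, base_delay=0.5, sleep=time.sleep):
    """Retry a flaky zero-argument call (e.g. a throttled AWS API hit)
    with exponential backoff: 0.5s, 1s, 2s, ... between attempts.
    Re-raises the last exception once retries are exhausted."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```

Injecting `sleep` is the same testability trick the listing's TDD bullet points at: the backoff schedule can be asserted on without waiting real seconds.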

Posted 2 months ago


3 - 5 years

1 - 5 Lacs

Hyderabad

Work from Office


What you will do Let’s do this. Let’s change the world. Role Description: Amgen’s Virtual Desktop Infrastructure (VDI) service maintains a growing fleet of more than eight thousand desktop instances supporting a broad range of requirements. The Software Engineer will work closely with the Virtual Desktop Service Owner and System Owner to design and develop automation solutions to facilitate VDI service management and optimize the end-user experience. Roles & Responsibilities: Maintain existing C# and Python code hosted in Azure and AWS. Independently develop, test, and deploy code based on designs generated in collaboration with the Virtual Desktop team. Generate code documentation. Deploy and maintain a robust CI/CD solution. Conduct code reviews to ensure code quality and adherence to best practices. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience. Fluency in C#. Direct experience with cloud platform services including Azure or AWS (Azure preferred). Strong knowledge of information systems and related technologies. Strong rapid prototyping skills and the ability to translate concepts into working code. Preferred Qualifications: Working knowledge of Python. Familiarity with Microsoft Graph API, Azure Functions, Azure Logic Apps, and Cosmos DB. Familiarity with Python Boto3, AWS Lambda, DynamoDB, S3, SQS. Understanding of REST APIs and database services. Experience with API integration, serverless, and microservices architecture. Strong problem-solving and analytical skills. Willingness and ability to learn new technologies at a rapid pace. Understanding of desktop systems lifecycle and management. Experience with data visualization platforms (Power BI or
Tableau) Soft Skills: Strong verbal and written communication skills Ability to work independently or in a team environment High degree of initiative and self-motivation Ability to manage multiple priorities successfully Work Hours This position operates on the second shift, from 2:00 PM to 10:00 PM IST. Candidates must be willing and able to work during these hours, including weekends and holidays as required. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 2 months ago


6 - 11 years

9 - 19 Lacs

Bengaluru

Work from Office


Dear Candidate, We are hiring a Python Tech Lead for Otomeyt AI. PFB the job description. Mandatory Skills - Python + AWS (Lambda, SES, SQS) + Boto3 + REST APIs (FastAPI & Flask) Job Summary: We are seeking a Python Technical Lead with strong expertise in software architecture, AI model integration, and cloud technologies. The ideal candidate will have 8+ years of experience in Python development and will play a crucial role in architecting scalable solutions, leading AI/ML integrations, and driving strategic technical decisions. This role requires a strong mix of hands-on development, technical leadership, and system design to ensure high performance, security, and scalability of AI-powered applications. Key Responsibilities: Technical Leadership & Architectural Decisions: Define and implement high-level software architecture for AI-driven applications. Establish best practices and coding standards to ensure code quality and maintainability. Make critical technical decisions around system design, cloud infrastructure, and microservices architecture. AI/ML Model Integration & Deployment: Collaborate with data scientists to integrate and optimize AI/ML models into production environments. Develop robust AI pipelines for model training, inference, and continuous learning. Implement MLOps best practices to automate model deployment and monitoring. Backend & API Development: Design and develop high-performance, scalable RESTful APIs using Python frameworks (FastAPI, Flask). Implement event-driven architectures (Kafka, RabbitMQ, Redis Pub/Sub) to optimize AI workflows. Manage database design and performance tuning for structured (PostgreSQL) and unstructured (NoSQL, DynamoDB) data. Cloud & DevOps: Lead cloud strategy and architecture decisions using AWS. Oversee CI/CD pipelines, Docker, Kubernetes, and serverless deployment strategies. Ensure security and compliance best practices for data protection and API security.
Team Mentorship & Collaboration Act as a technical mentor, guiding junior and mid-level engineers. Work closely with product managers, DevOps, and AI researchers to align business and technical goals. Conduct code reviews, technical training, and architectural discussions. Required Skills & Qualifications: 8+ years of Python development experience with a focus on scalability, architecture, and AI integration. Proven experience in technical leadership, mentoring, and decision-making. Strong background in system design, microservices, and event-driven architectures. Hands-on experience with AWS, including Lambda, ECS, S3, API Gateway, DynamoDB, SQS, SES. Experience with Docker, Kubernetes, Terraform, and CI/CD pipelines. Knowledge of security best practices, API authentication, and role-based access controls (RBAC). Strong problem-solving skills and ability to lead projects end-to-end. Preferred Qualifications: Familiarity with Generative AI applications. Exposure to GraphQL, WebSockets, and real-time data processing. Previous experience as a Technical Lead or Principal Engineer. Why Join Us? Opportunity to lead AI-driven innovation in a cutting-edge product. Work with a talented team of engineers, AI scientists, and cloud architects. A fast-paced environment where your technical decisions will have a direct impact.
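The event-driven architectures this role centers on (Kafka, RabbitMQ, Redis Pub/Sub, or SQS/SNS via Boto3) all reduce to the same publish/subscribe contract. A deliberately minimal, in-process sketch of that contract; the class name and topic strings are illustrative, and in production the transport would be one of the brokers named above rather than an in-memory dict:

```python
from collections import defaultdict

class EventBus:
    """Toy in-process publish/subscribe bus illustrating the
    event-driven pattern: producers publish to a topic, all
    subscribed handlers run, and their results are collected."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        # A topic with no subscribers simply yields an empty result list.
        return [handler(payload) for handler in self._handlers[topic]]
```

Swapping the dict for an SQS queue or Kafka topic changes delivery guarantees (durability, ordering, at-least-once) but not the shape of the producer and consumer code.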

Posted 2 months ago


8 - 13 years

15 - 30 Lacs

Bengaluru

Work from Office


Dear Candidate, We are hiring a Python Tech Lead for Otomeyt AI. PFB the job description. Mandatory Skills - Python + AWS (Lambda, SES, SQS) + Boto3 + REST APIs (FastAPI & Flask) Job Summary: We are seeking a Python Technical Lead with strong expertise in software architecture, AI model integration, and cloud technologies. The ideal candidate will have 8+ years of experience in Python development and will play a crucial role in architecting scalable solutions, leading AI/ML integrations, and driving strategic technical decisions. This role requires a strong mix of hands-on development, technical leadership, and system design to ensure high performance, security, and scalability of AI-powered applications. Key Responsibilities: 1. Technical Leadership & Architectural Decisions: Define and implement high-level software architecture for AI-driven applications. Establish best practices and coding standards to ensure code quality and maintainability. Make critical technical decisions around system design, cloud infrastructure, and microservices architecture. 2. AI/ML Model Integration & Deployment: Collaborate with data scientists to integrate and optimize AI/ML models into production environments. Develop robust AI pipelines for model training, inference, and continuous learning. Implement MLOps best practices to automate model deployment and monitoring. 3. Backend & API Development: Design and develop high-performance, scalable RESTful APIs using Python frameworks (FastAPI, Flask). Implement event-driven architectures (Kafka, RabbitMQ, Redis Pub/Sub) to optimize AI workflows. Manage database design and performance tuning for structured (PostgreSQL) and unstructured (NoSQL, DynamoDB) data. 4. Cloud & DevOps: Lead cloud strategy and architecture decisions using AWS. Oversee CI/CD pipelines, Docker, Kubernetes, and serverless deployment strategies. Ensure security and compliance best practices for data protection and API security. 5. Team Mentorship & Collaboration: Act as a technical mentor, guiding junior and mid-level engineers. Work closely with product managers, DevOps, and AI researchers to align business and technical goals. Conduct code reviews, technical training, and architectural discussions. Required Skills & Qualifications: 8+ years of Python development experience with a focus on scalability, architecture, and AI integration. Proven experience in technical leadership, mentoring, and decision-making. Strong background in system design, microservices, and event-driven architectures. Hands-on experience with AWS, including Lambda, ECS, S3, API Gateway, DynamoDB, SQS, SES. Experience with Docker, Kubernetes, Terraform, and CI/CD pipelines. Knowledge of security best practices, API authentication, and role-based access control (RBAC). Strong problem-solving skills and ability to lead projects end-to-end. Preferred Qualifications: Familiarity with Generative AI applications. Exposure to GraphQL, WebSockets, and real-time data processing. Previous experience as a Technical Lead or Principal Engineer. Why Join Us? Opportunity to lead AI-driven innovation in a cutting-edge product. Work with a talented team of engineers, AI scientists, and cloud architects. A fast-paced environment where your technical decisions will have a direct impact. Thanks, Garima

Posted 2 months ago


6 - 10 years

14 - 20 Lacs

Bengaluru

Work from Office


Location: Bengaluru Experience: 8+ years Job Type: Full-time Job Summary: We are seeking a Python Technical Lead with strong expertise in software architecture, AI model integration, and cloud technologies. The ideal candidate will have 8+ years of experience in Python development and will play a crucial role in architecting scalable solutions, leading AI/ML integrations, and driving strategic technical decisions. This role requires a strong mix of hands-on development, technical leadership, and system design to ensure high performance, security, and scalability of AI-powered applications. Key Responsibilities: 1. Technical Leadership & Architectural Decisions Define and implement high-level software architecture for AI-driven applications. Establish best practices and coding standards to ensure code quality and maintainability. Make critical technical decisions around system design, cloud infrastructure, and microservices architecture. 2. AI/ML Model Integration & Deployment Collaborate with data scientists to integrate and optimize AI/ML models into production environments. Develop robust AI pipelines for model training, inference, and continuous learning. Implement MLOps best practices to automate model deployment and monitoring. 3. Backend & API Development Design and develop high-performance, scalable RESTful APIs using Python frameworks (FastAPI, Flask). Implement event-driven architectures (Kafka, RabbitMQ, Redis Pub/Sub) to optimize AI workflows. Manage database design and performance tuning for structured (PostgreSQL) and unstructured (NoSQL, DynamoDB) data. 4. Cloud & DevOps Lead cloud strategy and architecture decisions using AWS. Oversee CI/CD pipelines, Docker, Kubernetes, and serverless deployment strategies. Ensure security and compliance best practices for data protection and API security. 5. Team Mentorship & Collaboration Act as a technical mentor, guiding junior and mid-level engineers. 
Work closely with product managers, DevOps, and AI researchers to align business and technical goals. Conduct code reviews, technical training, and architectural discussions. Required Skills & Qualifications: 8+ years of Python development experience with a focus on scalability, architecture, and AI integration. Proven experience in technical leadership, mentoring, and decision-making. Strong background in system design, microservices, and event-driven architectures. Hands-on experience with AWS, including Lambda, ECS, S3, API Gateway, DynamoDB, SQS, SES. Experience with Docker, Kubernetes, Terraform, and CI/CD pipelines. Knowledge of security best practices, API authentication, and role-based access controls (RBAC). Strong problem-solving skills and ability to lead projects end-to-end. Preferred Qualifications: Familiarity with Generative AI applications. Exposure to GraphQL, WebSockets, and real-time data processing. Previous experience as a Technical Lead or Principal Engineer.

Posted 3 months ago


6 - 11 years

10 - 20 Lacs

Bengaluru, Gurgaon

Hybrid


Hi, good day. We have one urgent requirement from one of our clients. Please go through the requirement and let us know if you are interested in this position. Please send your UPDATED RESUME and fill in the details below; an early response is appreciated. Share your resume with Sudheer@shellinfotech.com or call +91 9063691228.

DETAILS: Interested in Shell Infotech payroll / Current CTC / Expected CTC (in INR) / Official notice period / Minimum time required to join / Total experience / Relevant experience / Current location / Preferred location.

Job Description: Role: AWS Cloud Engineer. Mode of work: WFO (Hybrid). Job type: Full-time. Location: Bangalore / Gurgaon. Experience level: 6+ years. JD: Proficiency in Python scripting for AWS Lambda, Boto3, and automation tasks. Hands-on experience with API Gateway (REST/HTTP APIs) and Lambda function design. Expertise in authorization frameworks (OAuth, JWT, IAM policies, custom Lambda authorizers). Strong knowledge of networking (VPCs, subnets, DNS, load balancing) and security best practices. Experience with CI/CD tools (AWS Code*, Jenkins, GitHub Actions) and GitOps principles. Familiarity with observability tools (CloudWatch, Datadog, OpenTelemetry).
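The custom Lambda authorizers this JD mentions follow a fixed contract: API Gateway passes the caller's token in the event, and the function returns an IAM policy document allowing or denying `execute-api:Invoke`. A minimal sketch of a TOKEN-type authorizer; the static `valid_tokens` set is purely illustrative (a real implementation would verify a JWT signature against OAuth/OIDC keys instead):

```python
def authorizer(event, context=None, valid_tokens=frozenset({"secret-token"})):
    """Sketch of an API Gateway custom (TOKEN) Lambda authorizer:
    inspect the bearer token and return an Allow/Deny IAM policy."""
    token = event.get("authorizationToken", "").removeprefix("Bearer ").strip()
    effect = "Allow" if token in valid_tokens else "Deny"
    return {
        "principalId": "user",  # would normally identify the caller
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                # Scope the policy to the method being invoked.
                "Resource": event.get("methodArn", "*"),
            }],
        },
    }
```

API Gateway caches the returned policy per token for a configurable TTL, so the resource field is often widened beyond a single `methodArn` in practice.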

Posted 3 months ago


8 - 12 years

10 - 14 Lacs

Pune

Work from Office


At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow, informed and validated by science and data, superpowered by creativity and design, and all underpinned by technology created with purpose.

About The Role: 5+ years of experience in creating data strategy frameworks/roadmaps. Relevant experience in data exploration and profiling, and involvement in data literacy activities for all stakeholders. 5+ years in analytics and data maturity evaluation based on a current as-is vs. to-be framework. 5+ years of relevant experience in creating functional requirements documents and enterprise to-be data architecture. Relevant experience in identifying and prioritizing use cases for the business; identification of important KPIs and opex/capex for CXOs. 2+ years of working knowledge in data strategy: data governance, MDM, etc. 4+ years of experience in a Data Analytics operating model with a vision spanning prescriptive, descriptive, predictive, and cognitive analytics.

Primary Skills: 8+ years of experience in a Data Strategy role, with a graduate degree in Computer Science, Informatics, Information Systems, or another quantitative field, plus experience with the following software/tools: big data tools such as Hadoop, Spark, and Kafka; relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB; and data pipeline and workflow management tools such as Luigi and Airflow. Good to have cloud skill sets (Azure/AWS/GCP). 5+ years of advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases: Postgres/SQL/Mongo. 2+ years of working knowledge in data strategy: data governance, MDM, etc.

Posted 3 months ago
