
9 AWS SNS Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

4 - 9 Lacs

Pune

Hybrid

Job Title: Backend Developer (Python + Flask/FastAPI + Cloud)
Duration: Full-time role
Location: Pune (Hybrid)
Note: Need Python + Flask/FastAPI + AWS/Azure

Job Description: Backend Developer (Python + Flask/FastAPI + Cloud) | 4-6 Years

Backend Developer with strong expertise in Python, Flask/FastAPI, and REST API development, along with experience in cloud platforms (AWS/Azure).

Responsibilities:
- Build and maintain backend services and REST APIs using Flask/FastAPI.
- Work with message queues/topics (Kafka, RabbitMQ, Azure Service Bus, AWS SQS/SNS).
- Orchestrate payloads from multiple endpoints and deliver them to downstream systems.
- Deploy and scale applications on AWS/Azure.
- Follow SDLC best practices (Agile, CI/CD, testing, documentation).

Required Skills:
- 4-6 years of backend development experience with Python.
- Strong expertise in Flask or FastAPI.
- Proven experience in REST API design and development.
- Hands-on experience with cloud services (AWS/Azure).
- Experience with message queues/event-driven systems.

Nice to Have:
- Hands-on experience with any UI technology.
- SQL/NoSQL databases.
- Monitoring/logging tools (ELK, Grafana, CloudWatch, Azure Monitor).
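For a sense of the core pattern this posting describes, here is a minimal sketch of a FastAPI endpoint publishing a payload to downstream systems via AWS SNS. The route, payload model, and topic ARN environment variable are illustrative placeholders, not details from the posting.

```python
# Minimal sketch: FastAPI endpoint that publishes an event to an SNS topic.
# Topic ARN, route, and model are hypothetical placeholders.
import json
import os

import boto3
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
sns = boto3.client("sns")
TOPIC_ARN = os.environ["ORDER_EVENTS_TOPIC_ARN"]  # assumed configuration

class OrderEvent(BaseModel):
    order_id: str
    status: str

@app.post("/orders/events")
def publish_order_event(event: OrderEvent):
    # SNS fans the message out to any subscribed SQS queues or Lambdas.
    resp = sns.publish(
        TopicArn=TOPIC_ARN,
        Message=json.dumps(event.model_dump()),
        Subject="order-event",
    )
    return {"message_id": resp["MessageId"]}
```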

Posted 4 days ago

Apply

0.0 - 3.0 years

0 Lacs

Hyderabad, Telangana

On-site

About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Role Title: Software Engineering Analyst

Position Summary: As a Software Engineering Analyst at Evernorth, you will demonstrate expertise in data engineering technologies with a focus on engineering, innovation, strategic influence, and a product mindset. You will play a key role in designing, building, testing, and delivering large-scale software applications, systems, platforms, services, or technologies in the data engineering space. Collaborating with partner IT and business teams, you will own and drive major deliverables across all aspects of software delivery. Your responsibilities will include automating processes on Databricks and AWS, and collaborating with business and technology partners to gather requirements and develop and implement solutions. Strong analytical and technical skills are essential, along with the ability to positively influence the delivery of data engineering products. Working in a team that values innovation, a cloud-first approach, self-service orientation, and automation, you will engage with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering. Strong technical and communication skills are crucial for success in this role.

Job Description & Responsibilities:
- Delivering business needs end-to-end, from requirements through development into production.
- Using a hands-on engineering approach in the Databricks environment to deliver data engineering toolchains, platform capabilities, and reusable patterns.
- Following software engineering best practices with an automation-first approach and a continuous learning and improvement mindset.
- Ensuring adherence to enterprise architecture direction and architectural standards.
- Collaborating in a high-performing team environment, with the ability to influence and be influenced by others.

Experience Required:
- More than 6 months to 1.5 years of experience in software engineering, building data engineering pipelines, middleware and API development, and automation.

Experience Desired:
- Expertise in Agile software development principles and patterns.
- Expertise in building streaming, batch, and event-driven architectures and data pipelines.

Primary Skills:
- Experience with messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, and AWS SNS.
- Proficiency in self-testing of applications, unit testing, use of mock frameworks, and test-driven development (TDD).
- Knowledge of the Behavior-Driven Development (BDD) approach.
- Familiarity with Spark or Scala with AWS, and Python.

Additional Skills:
- Ability to perform detailed analysis of business problems and technical environments.
- Strong oral and written communication skills.
- Strategic thinking, iterative implementation, and estimation of the financial impact of design/architecture alternatives.
- Continuous focus on ongoing learning and development.
- AWS certifications (nice to have).
- Experience with DevOps, CI/CD, and Databricks.
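Since the posting emphasizes unit testing with mock frameworks and TDD around messaging systems, here is a minimal sketch of what that might look like in Python: a small SNS publisher function exercised with unittest.mock so no real AWS call is made. The function, topic ARN, and payload are hypothetical.

```python
# Minimal TDD-style sketch: testing an SNS publisher with a mocked boto3 client.
import json
import unittest
from unittest.mock import MagicMock

def publish_claim_event(sns_client, topic_arn: str, claim_id: str) -> str:
    """Publish a claim event and return the SNS message id (hypothetical helper)."""
    resp = sns_client.publish(TopicArn=topic_arn, Message=json.dumps({"claim_id": claim_id}))
    return resp["MessageId"]

class TestPublishClaimEvent(unittest.TestCase):
    def test_publishes_serialized_claim(self):
        fake_sns = MagicMock()
        fake_sns.publish.return_value = {"MessageId": "abc-123"}

        message_id = publish_claim_event(
            fake_sns, "arn:aws:sns:us-east-1:111122223333:claims", "C-42"
        )

        self.assertEqual(message_id, "abc-123")
        fake_sns.publish.assert_called_once_with(
            TopicArn="arn:aws:sns:us-east-1:111122223333:claims",
            Message=json.dumps({"claim_id": "C-42"}),
        )

if __name__ == "__main__":
    unittest.main()
```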

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Haryana

On-site

You will be responsible for designing, building, and maintaining scalable and efficient data pipelines to move data between cloud-native databases (e.g., Snowflake) and SaaS providers using AWS Glue and Python. Your role will involve implementing and managing ETL/ELT processes to ensure seamless data integration and transformation while adhering to information security and data governance standards. Additionally, you will maintain and enhance data environments, including data lakes, warehouses, and distributed processing systems. Effective use of version control systems (e.g., GitHub) to manage code and collaborate with the team is crucial.

Primary Skills: You should have expertise in enhancements, new development, defect resolution, and production support of ETL development using AWS native services. Your responsibilities will include integrating data sets using AWS services such as Glue and Lambda functions, using AWS SNS to send emails and alerts, authoring ETL processes in Python and PySpark, monitoring ETL processes with CloudWatch events, connecting to data sources such as S3, and validating data using Athena. Experience in CI/CD using GitHub Actions, proficiency in Agile methodology, and extensive working experience with advanced SQL are essential for this role.

Secondary Skills: Familiarity with Snowflake and an understanding of its architecture, including concepts such as internal and external tables, stages, and masking policies.

Competencies and Experience:
- Deep technical skills in AWS Glue (Crawler, Data Catalog): 10+ years
- Hands-on experience with Python and PySpark: 5+ years
- PL/SQL: 5+ years
- CloudFormation and Terraform: 5+ years
- CI/CD with GitHub Actions: 5+ years
- BI systems (Power BI, Tableau): 5+ years
- Good understanding of AWS services such as S3, SNS, Secrets Manager, Athena, and Lambda: 5+ years
- Familiarity with Jira and Git is highly desirable.

This position requires a high level of technical expertise in AWS Glue, Python, PySpark, PL/SQL, CloudFormation, Terraform, GitHub Actions, BI systems, and AWS services, along with a solid understanding of data integration, transformation, and data governance standards. Your ability to collaborate effectively with the team, manage data environments efficiently, and ensure the security and compliance of data will be critical to success in this role.
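As a rough illustration of the "SNS for alerts around PySpark ETL" pattern this posting describes, here is a minimal sketch. The job name, topic ARN, and S3 paths are placeholders, and a real AWS Glue job would add Glue-specific job context on top of this.

```python
# Minimal sketch: PySpark ETL step that raises an SNS alert on failure.
# Paths, topic ARN, and job name are hypothetical placeholders.
import traceback

import boto3
from pyspark.sql import SparkSession

ALERT_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:etl-alerts"

def run_job():
    spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()
    df = spark.read.parquet("s3://example-bucket/raw/orders/")
    df.filter(df.status == "COMPLETE").write.mode("overwrite") \
      .parquet("s3://example-bucket/curated/orders/")

if __name__ == "__main__":
    try:
        run_job()
    except Exception:
        # Forward the stack trace to the alerting topic (email/SMS subscribers).
        boto3.client("sns").publish(
            TopicArn=ALERT_TOPIC_ARN,
            Subject="daily-orders-etl FAILED",
            Message=traceback.format_exc()[:1000],  # keep well under SNS limits
        )
        raise
```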

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Madurai, Tamil Nadu

On-site

You should have at least 3 years of experience as a React Native developer, with a passion for creating high-performance mobile applications for both iOS and Android. Your responsibilities will include designing and developing user interface components for JavaScript-based web and mobile applications using React Native. You should be able to leverage native APIs for seamless integration across all platforms and be able to transition a React web app into a React Native application.

Knowledge of Docker, Kubernetes, Node.js, and Express, along with experience with popular React workflows such as Flux or Redux, is required. Experience implementing push notifications using Firebase/AWS SNS and familiarity with code versioning tools such as DevOps/GitHub are also expected.

The ideal candidate will possess strong knowledge of object-oriented programming and JavaScript, with the ability to write optimized and efficient scripts. You should be well versed in offline storage, REST APIs, and document request models. Writing well-documented, easily readable JavaScript code is essential, as is managing third-party dependencies and resolving any debugging conflicts that arise.
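The client side of this role is React Native, but the Firebase/AWS SNS push-notification requirement has a server-side half that is easy to sketch; here it is in Python with boto3 for consistency with the other examples on this page. The platform application ARN and device token are hypothetical, and the React Native app would supply the token.

```python
# Minimal sketch: registering a device with SNS and sending it a push notification.
# ARNs and tokens are placeholders; the mobile app provides the real device token.
import json

import boto3

sns = boto3.client("sns")
PLATFORM_APP_ARN = "arn:aws:sns:us-east-1:111122223333:app/GCM/example-app"

def register_device(device_token: str) -> str:
    """Create (or look up) an SNS platform endpoint for a device token."""
    resp = sns.create_platform_endpoint(
        PlatformApplicationArn=PLATFORM_APP_ARN,
        Token=device_token,
    )
    return resp["EndpointArn"]

def send_push(endpoint_arn: str, title: str, body: str) -> None:
    # FCM payload wrapped in SNS's per-platform JSON message structure.
    payload = {"notification": {"title": title, "body": body}}
    sns.publish(
        TargetArn=endpoint_arn,
        MessageStructure="json",
        Message=json.dumps({"default": body, "GCM": json.dumps(payload)}),
    )
```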

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Haryana

On-site

As an ETL Developer on our team, you will be responsible for a range of tasks including enhancements, new development, defect resolution, and production support of ETL development using AWS native services. Your expertise will be crucial in integrating data sets through AWS services such as Glue and Lambda functions. You will also use AWS SNS to send emails and alerts, author ETL processes in Python and PySpark, and monitor ETL processes using CloudWatch events.

Your role will also involve connecting to various data sources such as S3, validating data using Athena, and implementing CI/CD processes using GitHub Actions. Proficiency in Agile methodology is essential for effective collaboration within our dynamic team environment.

To excel in this position, you should possess deep technical skills in AWS Glue (Crawler, Data Catalog) with at least 5 years of experience. Hands-on experience with Python, PySpark, and PL/SQL is required, with a minimum of 3 years in each. Familiarity with CloudFormation, Terraform, and CI/CD with GitHub Actions is advantageous. Additionally, experience with BI systems such as Power BI and Tableau, along with a good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda, will be beneficial in this role.

If you are a detail-oriented professional with a strong background in ETL development and a passion for leveraging AWS services to drive data integration, we encourage you to apply for this exciting opportunity.
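To make the "Lambda + SNS alerts + CloudWatch events" combination concrete, here is a minimal sketch of a Lambda handler that reacts to a Glue job state-change event and forwards failures to an SNS topic. The event shape follows AWS's documented "Glue Job State Change" EventBridge event; the topic ARN is a placeholder.

```python
# Minimal sketch: Lambda that forwards failed Glue job events to an SNS topic.
# Assumes an EventBridge rule routes "Glue Job State Change" events here.
import json
import os

import boto3

sns = boto3.client("sns")
ALERT_TOPIC_ARN = os.environ["ALERT_TOPIC_ARN"]  # assumed configuration

def handler(event, context):
    detail = event.get("detail", {})
    if detail.get("state") == "FAILED":
        sns.publish(
            TopicArn=ALERT_TOPIC_ARN,
            Subject=f"Glue job failed: {detail.get('jobName', 'unknown')}",
            Message=json.dumps(detail, indent=2),
        )
    return {"handled": True}
```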

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You are an experienced MERN stack developer with over 4 years of experience building scalable web applications. Your expertise spans backend and frontend development, with a strong command of Node.js, Next.js with TypeScript, and serverless architectures built on AWS Lambda. Experience working with Large Language Models (LLMs) and AWS services such as SQS, SNS, and MongoDB is essential.

As a MERN stack developer, your responsibilities will include developing, maintaining, and optimizing full-stack applications using MongoDB, Express.js, React, and Node.js. You will build and deploy high-performance, SEO-friendly web applications using Next.js with TypeScript. Additionally, you will design and implement serverless architectures using AWS Lambda, work with AWS SQS and SNS for event-driven architectures, and integrate Large Language Models to enhance application features.

Collaboration with cross-functional teams to design and develop scalable solutions will be a key part of your role. You will ensure application security, performance, and scalability; optimize database queries for efficient use of MongoDB; and write clean, maintainable, well-documented code following industry best practices. Troubleshooting application issues in both production and development environments will also be within your purview.

In terms of qualifications, you must have at least 4 years of experience in MERN stack development. Must-have skills include proficiency in Node.js with Express.js, Next.js with TypeScript, AWS Lambda for serverless architecture, MongoDB for database design and optimization, AWS SQS and SNS for queueing and notification services, and experience with Large Language Models. Additionally, skills such as AWS SES, AWS EC2, React application deployment using Vercel, and AWS Route 53 are considered good to have.

Join a team that values work-life balance, treats employees as valuable assets, fosters a positive environment with supportive seniors and friendly colleagues, and provides opportunities for skill development and career growth. Internal and external events are organized so employees can experience something new and grow both personally and professionally.
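The "SQS and SNS for event-driven architectures" requirement usually means a fan-out: one SNS topic feeding one or more SQS queues that workers drain. Below is a minimal wiring sketch, in Python with boto3 for consistency with the other examples on this page even though this role is Node.js; all resource names are hypothetical.

```python
# Minimal sketch: SNS -> SQS fan-out wiring (hypothetical names).
import json

import boto3

sns = boto3.client("sns")
sqs = boto3.client("sqs")

topic_arn = sns.create_topic(Name="user-signups")["TopicArn"]
queue_url = sqs.create_queue(QueueName="signup-email-worker")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Allow the topic to send messages to the queue.
sqs.set_queue_attributes(
    QueueUrl=queue_url,
    Attributes={"Policy": json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    })},
)

# Subscribe the queue; RawMessageDelivery strips the SNS envelope.
sns.subscribe(
    TopicArn=topic_arn,
    Protocol="sqs",
    Endpoint=queue_arn,
    Attributes={"RawMessageDelivery": "true"},
)
```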

Posted 1 month ago

Apply

10.0 - 18.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You should possess a BTech degree in computer science, engineering, or a related field, or have 12+ years of related work experience, with at least 7 years of design and implementation experience on large-scale, data-centric distributed applications. Professional experience architecting and operating cloud-based solutions is essential, with a good understanding of core disciplines such as compute, networking, storage, security, and databases. A strong grasp of data engineering concepts (storage, governance, cataloging, data quality, and data modeling) is required, along with familiarity with architecture patterns such as data lake, data lakehouse, and data mesh.

You should understand data warehousing concepts and have hands-on experience with tools such as Hive, Redshift, Snowflake, and Teradata. Experience migrating or transforming legacy customer solutions to the cloud is highly valued, as is experience with services such as AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, and DataZone. A thorough understanding of Big Data ecosystem technologies such as Hadoop, Spark, Hive, and HBase is expected, and knowledge of designing analytical solutions using AWS cognitive services such as Textract, Comprehend, Rekognition, and SageMaker is advantageous. You should also have experience with modern development workflows: git, continuous integration/continuous deployment pipelines, static code analysis tooling, and infrastructure-as-code. Proficiency in a programming or scripting language such as Python, Java, or Scala is required, and an AWS Professional/Specialty certification or equivalent cloud expertise is a plus.

In this role, you will drive innovation within the data engineering domain by designing reusable and reliable accelerators, blueprints, and libraries. You should be capable of leading a technology team, fostering an innovative mindset, and enabling fast-paced delivery. Adapting to new technologies, learning quickly, and managing high ambiguity are essential skills for this position. You will collaborate with business stakeholders, participate in architectural, design, and status calls, and present effectively to executives, IT management, and developers. Furthermore, you will drive technology/software sales or pre-sales consulting discussions, take end-to-end ownership of tasks, and maintain high-quality software development with complete documentation and traceability. Fulfilling organizational responsibilities, sharing knowledge and experience with other teams, conducting technical training sessions, and producing whitepapers, case studies, and blogs are also part of this role.

The ideal candidate has 10 to 18 years of experience; reference job number 12895 when applying.
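One small, concrete slice of the data-lake architecture work this role describes is wiring S3 object-created events to an SNS topic so downstream catalog and quality jobs can react. A minimal sketch follows; the bucket, prefix, and topic names are hypothetical, and it assumes the topic policy already allows S3 to publish.

```python
# Minimal sketch: notify an SNS topic whenever raw files land in a data-lake bucket.
# Assumes the topic policy already grants s3.amazonaws.com permission to publish.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_notification_configuration(
    Bucket="example-data-lake-raw",
    NotificationConfiguration={
        "TopicConfigurations": [{
            "TopicArn": "arn:aws:sns:us-east-1:111122223333:raw-landing-events",
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [
                {"Name": "prefix", "Value": "incoming/"},
            ]}},
        }]
    },
)
```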

Posted 1 month ago

Apply

6.0 - 8.0 years

11 - 13 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

iSource Services is hiring for one of their clients for the position of Java Developer.

About the role: We are looking for a skilled Java Developer with strong expertise in Spring Boot, Microservices, and AWS to join our growing team. The ideal candidate must have a proven track record of delivering scalable backend solutions and a minimum of 4 years of hands-on experience with AWS services.

Key Responsibilities:
- Develop and maintain high-performance Java applications using Spring Boot and microservices architecture
- Integrate with AWS services including Lambda, DynamoDB, SQS, SNS, S3, ECS, and EC2
- Work with event-driven architecture using Kafka
- Collaborate with cross-functional teams to define, design, and ship new features
- Ensure the performance, quality, and responsiveness of applications

Required Skills:
- Strong proficiency in Java (8+), Spring Boot, and Microservices
- Minimum 4 years of hands-on experience with AWS (Lambda, DynamoDB, SQS, SNS, S3, ECS, EC2)
- Experience with Kafka for real-time data streaming
- Solid understanding of system design, data structures, and algorithms
- Excellent problem-solving and communication skills
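A recurring pattern behind the AWS list in this posting is a worker that long-polls SQS and processes messages fanned out from SNS. Sketched below in Python with boto3 to stay consistent with the other examples on this page (the role itself is Java/Spring Boot); the queue URL and handling logic are placeholders.

```python
# Minimal sketch: long-polling SQS consumer (hypothetical queue; role uses Java/Spring).
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111122223333/order-events"

def process(body: str) -> None:
    print("processing:", body)  # placeholder business logic

while True:
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling reduces empty receives
    )
    for msg in resp.get("Messages", []):
        process(msg["Body"])
        # Delete only after successful processing (at-least-once delivery).
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```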

Posted 1 month ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

Requirements:
- 6+ years of hands-on experience with AWS services (Lambda, DynamoDB, SQS, SNS, S3, ECS, EC2), mandatory in each service
- Must have created Lambda functions and done scripting; deployment-only experience will not suffice
- Hands-on with Java, Spring Boot, Microservices, and Kafka
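As an example of the hands-on Lambda authoring this posting insists on (as opposed to deployment-only experience), here is a minimal sketch of a Lambda handler triggered by an SNS subscription, using the standard SNS event record shape; the DynamoDB table name is a placeholder, and the sketch assumes messages were published as JSON.

```python
# Minimal sketch: Lambda triggered by SNS, persisting each message to DynamoDB.
# Table name is a hypothetical placeholder; messages are assumed to be JSON.
import json

import boto3

table = boto3.resource("dynamodb").Table("order-events")

def handler(event, context):
    # SNS delivers one or more records, each wrapping the published message.
    for record in event["Records"]:
        message = json.loads(record["Sns"]["Message"])
        table.put_item(Item={
            "message_id": record["Sns"]["MessageId"],
            "payload": json.dumps(message),
        })
    return {"processed": len(event["Records"])}
```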

Posted 4 months ago

Apply