
5 AWS SQS Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 7.0 years

8 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

Role & Responsibilities:
- Design and develop integrations and microservices with hands-on coding.
- Build real-time and asynchronous systems integrations.
- Create API endpoints for internal and partner cloud systems.
- Document design and runbooks.
- Take full ownership of the integration lifecycle for multiple integrations.

Requirements:
- Bachelor of Science in Computer Science or Engineering.
- Strong background in software engineering and integration.
- 5+ years of overall industry experience.
- 3+ years of hands-on experience in MuleSoft architecture and full lifecycle implementation, from requirements gathering/analysis to go-live and post-production support.
- Mandatory 2+ years of experience in RTF (Runtime Fabric).
- Expertise in using REST and SOAP APIs.
- Proficiency in building MuleSoft integrations and APIs using Mule v4.
- Experience integrating a portfolio of SaaS applications.
- Strong coding skills in Java.
- Familiarity with messaging infrastructure, preferably AWS SQS, and storage solutions like AWS S3.
- Experience with relational databases and solid SQL knowledge.
- Proficiency in using the Anypoint Platform: API Manager, Runtime Manager, Exchange, etc.
- Experience with Runtime Fabric or Kubernetes.
- Familiarity with GitHub version control, Jenkins, and Maven.
- MuleSoft developer certification.
- Knowledge of securing data; understanding of PGP, SSH, OAuth, HTTPS, SFTP.
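The listing above pairs messaging infrastructure (preferably AWS SQS) with hands-on coding. As a rough illustration only, not part of the posting, here is a minimal sketch of the long-poll, process, delete loop that SQS consumers typically use; `client` is assumed to be a boto3-style SQS client (anything duck-typed with `receive_message`/`delete_message` works), and `handle` is a hypothetical callback.

```python
def drain_queue(client, queue_url, handle, max_batches=1):
    """Long-poll an SQS-style queue, process each message, delete on success.

    `client` is assumed to expose boto3-style receive_message/delete_message;
    `handle` is a hypothetical callback that raises on failure.
    """
    processed = []
    for _ in range(max_batches):
        resp = client.receive_message(
            QueueUrl=queue_url,
            MaxNumberOfMessages=10,   # SQS returns at most 10 per receive
            WaitTimeSeconds=20,       # long polling reduces empty responses
        )
        for msg in resp.get("Messages", []):
            handle(msg["Body"])
            # Delete only after successful handling; otherwise the message
            # becomes visible again after the visibility timeout and retries.
            client.delete_message(
                QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"]
            )
            processed.append(msg["Body"])
    return processed
```

Deleting after handling (rather than before) is what gives SQS its at-least-once delivery guarantee, so handlers should be idempotent.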

Posted 3 weeks ago

Apply

5 - 8 years

7 - 10 Lacs

Hyderabad, Ahmedabad

Work from Office


The Team: S&P Global is a global market leader in providing information, analytics, and solutions for industries and markets that drive economies worldwide. The Market Intelligence (MI) division is the largest division within the company. This is an opportunity to join the MI Data and Research's Data Science Team, which is dedicated to developing cutting-edge Data Science and Generative AI solutions. We are a dynamic group that thrives on innovation and collaboration, working together to push the boundaries of technology and deliver impactful solutions. Our team values inclusivity, continuous learning, and the sharing of knowledge to enhance our collective expertise.

Responsibilities and Impact:
- Develop and productionize cloud-based services and full-stack applications utilizing NLP solutions, including GenAI models.
- Implement and manage CI/CD pipelines to ensure efficient and reliable software delivery.
- Automate cloud infrastructure using Terraform.
- Write unit tests, integration tests, and performance tests.
- Work in a team environment using agile practices.
- Support administration of the Data Science experimentation environment, including AWS SageMaker and Nvidia GPU servers.
- Monitor and optimize application performance and infrastructure costs.
- Collaborate with data scientists and other developers to integrate and deploy data science models into production environments.
- Educate others to improve coding standards, code quality, test coverage, and documentation.
- Work closely with cross-functional teams to ensure seamless integration and operation of services.

What We're Looking For:

Basic Required Qualifications:
- 5-8 years of experience in software engineering.
- Proficiency in Python and JavaScript for full-stack development.
- Experience writing and maintaining high-quality code utilizing techniques like unit testing and code reviews.
- Strong understanding of object-oriented design and programming concepts.
- Strong experience with AWS cloud services, including EKS, Lambda, and S3.
- Knowledge of Docker containers and orchestration tools, including Kubernetes.
- Experience with monitoring, logging, and tracing tools (e.g., Datadog, Kibana, Grafana).
- Knowledge of message queues and event-driven architectures (e.g., AWS SQS, Kafka).
- Experience with CI/CD pipelines in Azure DevOps and GitHub Actions.

Additional Preferred Qualifications:
- Experience writing front-end web applications using JavaScript and React.
- Familiarity with infrastructure as code (IaC) using Terraform.
- Experience with Azure or GCP cloud services.
- Proficiency in C# or Java.
- Experience with SQL and NoSQL databases.
- Knowledge of Machine Learning concepts.
- Experience with Large Language Models.
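The qualifications above call out event-driven architectures over queues such as AWS SQS or Kafka. As an illustrative sketch only (the event names and handlers below are hypothetical, not from the posting), one common pattern is a dispatcher that routes each incoming message to a handler keyed by an event-type field:

```python
import json

# Hypothetical handler registry: event type -> handler function.
HANDLERS = {}

def on(event_type):
    """Register a handler for one event type (a minimal decorator sketch)."""
    def register(fn):
        HANDLERS[event_type] = fn
        return fn
    return register

@on("user.created")
def send_welcome(payload):
    # Stand-in for real side effects (email, downstream publish, etc.).
    return f"welcome {payload['name']}"

def dispatch(raw_message):
    """Route a JSON message (e.g., an SQS message body) to its handler."""
    event = json.loads(raw_message)
    handler = HANDLERS.get(event["type"])
    if handler is None:
        raise ValueError(f"no handler for {event['type']}")
    return handler(event["payload"])
```

The same dispatch shape works whether messages arrive from an SQS poll loop or a Kafka consumer; only the transport layer changes.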

Posted 1 month ago

Apply

2 - 3 years

4 - 5 Lacs

Bengaluru

Work from Office


We're looking for engineers who love to create elegant, easy-to-use interfaces and enjoy new JavaScript technologies as they show up every day, particularly ReactJS. You will help drive our technology selection and will coach your team on how to use these new technologies effectively in a production platform development environment. We need our engineers to be versatile, display leadership qualities, and be enthusiastic to tackle new problems across the full stack as we continue to push our technology forward.

Responsibilities:
- Design, develop, test, deploy, maintain, and improve software.
- Manage individual project priorities, deadlines, and deliverables.
- Keep software components loosely coupled as we grow.
- Contribute improvements to our continuous delivery infrastructure.
- Participate in recruiting and mentoring of top engineering talent.
- Drive roadmap execution and incorporate customer feedback into the product.
- Develop, collaborate on, and execute Agile development and product scenarios in order to release high-quality software on a regular cadence.
- Proactively assist your team to find and solve development and production software issues through effective collaboration.
- Work with company stakeholders, including PM, PO, customer-facing teams, DevOps, and Support, to communicate and collaborate on execution.

Desirable:
- Contribute to framework selection, microservice extraction, and deployment in on-premise and SaaS scenarios.
- Experience with troubleshooting, profiling, and debugging applications.
- Familiarity with web debugging tools (Chrome developer tools, Fiddler, etc.) is a plus.
- Experience with different databases (Elasticsearch, Impala, HDFS, Mongo, etc.) is a plus.
- Basic Git command knowledge is a plus.
- Messaging systems (e.g., RabbitMQ, Apache Kafka, ActiveMQ, AWS SQS, Azure Service Bus, Google Pub/Sub).
- Cloud solutions (e.g., AWS, Google Cloud Platform, Microsoft Azure).

Personal Skills:
- Strong written and verbal communication skills to collaborate with developers, testers, product owners, scrum masters, directors, and executives.
- Experience taking part in the decision-making process in application code design, solution development, and code review.
- Strong work ethic and emotional intelligence, including being on time for meetings.
- Ability to work in a fast-changing environment and embrace change while still following a greater plan.

Qualifications / Requirements:
- BS or MS degree in Computer Science or a related field, or equivalent job experience.
- 2-3 years of experience in web application development; any experience building web IDEs and ETL-driven web apps is a plus.
- Strong knowledge and experience in C# (2+ years).
- Experience with ReactJS and microservices (2+ years).
- Experience with CI/CD pipelines.
- Experience with relational databases; hands-on experience with SQL queries.
- Strong experience with several JavaScript frameworks and tools, such as React and Node.
- Strong knowledge of REST APIs.
- Experience with Atlassian suite products such as JIRA, Bitbucket, and Confluence.
- Strong knowledge of Computer Science and computing theory: paradigms and principles (OOP, SOLID), database theory (RDBMS), code testing practices, algorithms, data structures, and design patterns.
- Understanding of network interactions: protocol conventions (e.g., REST, RPC); authentication and authorization flows, standards, and practices (e.g., OAuth, JWT).
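The last requirement mentions JWT-based authentication flows. As a small illustrative sketch (inspection only, with no signature verification; a real service must verify the signature with a JOSE library before trusting any claim), a JWT is three base64url-encoded segments, and its claims can be read like this:

```python
import base64
import json

def decode_jwt_claims(token):
    """Read the claims (payload) segment of a JWT without verifying it.

    A JWT is header.payload.signature, each segment base64url-encoded.
    NOTE: for inspection/debugging only; production code must verify the
    signature before trusting any claim.
    """
    header_b64, payload_b64, _signature = token.split(".")
    # base64url decoding requires padding the string to a multiple of 4.
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

This also shows why JWTs are convenient for stateless authorization: the claims travel inside the token itself, so only the signature check needs key material.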

Posted 1 month ago

Apply

6 - 8 years

18 - 20 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office


- 6+ years of hands-on experience with AWS services (Lambda, DynamoDB, SQS, SNS, S3, ECS, EC2); experience with each service is mandatory.
- Must have created Lambda functions and written scripts; deployment-only experience is not sufficient.
- Hands-on experience with Java, Spring Boot, microservices, and Kafka.
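Since the role combines Lambda with SQS, here is a hedged illustration (not from the posting) of the shape of an SQS-triggered Lambda handler, using the partial-batch-response format Lambda supports for SQS event sources; the business logic `process` is hypothetical.

```python
import json

def process(payload):
    """Hypothetical business logic; raises on bad input."""
    if "order_id" not in payload:
        raise ValueError("missing order_id")
    return payload["order_id"]

def handler(event, context=None):
    """SQS-triggered Lambda entry point.

    Returning batchItemFailures tells Lambda to make only the failed
    messages visible again for redelivery (requires partial batch
    responses to be enabled on the event source mapping).
    """
    failures = []
    for record in event.get("Records", []):
        try:
            process(json.loads(record["body"]))
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```

Without partial batch responses, one bad message would force redelivery of the whole batch, which is why the listing's emphasis on having actually written Lambda functions (not just deployed them) matters.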

Posted 1 month ago

Apply

8 - 13 years

10 - 15 Lacs

Hyderabad

Work from Office


Responsibilities:
- Technical Leadership: Provide technical direction and mentorship to a team of data engineers, ensuring best practices in coding, architecture, and data operations.
- End-to-End Ownership: Architect, implement, and optimize end-to-end data pipelines that process and transform large-scale datasets efficiently and reliably.
- Orchestration and Automation: Design scalable workflows using orchestration tools such as Apache Airflow, ensuring high availability and fault tolerance.
- Data Warehouse and Lake Optimization: Lead the implementation and optimization of Snowflake and data lake technologies like Apache Iceberg for storage, query performance, and scalability.
- Real-Time and Batch Processing: Build robust systems leveraging Kafka, SQS, or similar messaging technologies for real-time and batch data processing.
- Cross-Functional Collaboration: Work closely with Data Science, Product, and Engineering teams to define data requirements and deliver actionable insights.
- Data Governance and Security: Establish and enforce data governance frameworks, ensuring compliance with regulatory standards and maintaining data integrity.
- Scalability and Performance: Develop strategies to optimize performance for systems processing terabytes of data daily while ensuring scalability.
- Team Building: Foster a collaborative team environment, driving skill development, career growth, and continuous learning within the team.
- Innovation and Continuous Improvement: Stay ahead of industry trends to evaluate and incorporate new tools, technologies, and methodologies into the organization.

Qualifications

Required Skills:
- 8+ years of experience in data engineering with a proven track record of leading data projects or teams.
- Strong programming skills in Python, with expertise in building and optimizing ETL pipelines.
- Extensive experience with Snowflake or equivalent data warehouses for designing schemas, optimizing queries, and managing large datasets.
- Expertise in orchestration tools like Apache Airflow, with experience in building and managing complex workflows.
- Deep understanding of messaging queues such as Kafka, AWS SQS, or similar technologies for real-time data ingestion and processing.
- Demonstrated ability to architect and implement scalable data solutions handling terabytes of data.
- Hands-on experience with Apache Iceberg for managing and optimizing data lakes.
- Proficiency in containerization and orchestration tools like Docker and Kubernetes for deploying and managing distributed systems.
- Strong understanding of CI/CD pipelines, including version control, deployment strategies, and automated testing.
- Proven experience working in an Agile development environment and managing cross-functional team interactions.
- Strong background in data modeling, data governance, and ensuring compliance with data security standards.
- Experience working with cloud platforms like AWS, Azure, or GCP.

Preferred Skills:
- Proficiency in stream processing frameworks such as Apache Flink for real-time analytics.
- Familiarity with programming languages like Scala or Java for additional engineering tasks.
- Exposure to integrating data pipelines with machine learning workflows.
- Strong analytical skills to evaluate new technologies and tools for scalability and performance.
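The role involves SQS-based ingestion pipelines, and one concrete constraint worth knowing there: SQS's SendMessageBatch accepts at most 10 entries per call, so producers typically chunk records into batch entries before sending. A minimal, hedged sketch (the function name is illustrative, not from the posting):

```python
def to_batch_entries(bodies, batch_size=10):
    """Chunk message bodies into SendMessageBatch-shaped entry lists.

    SQS allows at most 10 entries per SendMessageBatch call, and each
    entry needs a batch-unique Id, so entries are numbered within each
    batch.
    """
    if not 1 <= batch_size <= 10:
        raise ValueError("SQS batches hold between 1 and 10 entries")
    batches = []
    for start in range(0, len(bodies), batch_size):
        chunk = bodies[start:start + batch_size]
        batches.append(
            [{"Id": str(i), "MessageBody": body} for i, body in enumerate(chunk)]
        )
    return batches
```

Each resulting list could then be passed as the `Entries` argument of a boto3 `send_message_batch` call; batching this way cuts per-request overhead roughly tenfold for high-volume ingestion.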

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
