
290 AWS IAM Jobs - Page 2

JobPe aggregates results for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Role: Python Django Developer
Experience: 4 to 7 years
Work Mode: Hybrid
Work Timings: 1:30 pm IST to 10:30 pm IST
Location: Chennai & Hyderabad
Primary Skills: Python, Communication, Django, AWS services

JD: Experience with Django REST Framework (DRF). Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with Docker and containerization. Experience with automated testing frameworks. Familiarity with CI/CD tools and pipelines.

Responsibilities: Assist in developing, testing, and maintaining web applications using Django and Python deployed on the AWS cloud platform. Take KT (knowledge transfer) from the application team on the code and understand BMO coding standards. Work on application vulnerability fixes based on the vulnerability report from the BMO team. Work with RESTful APIs and integrate third-party services. Develop and manage database models, migrations, and queries using the Django ORM. Collaborate with front-end developers to integrate user-facing elements with server-side logic. Write clean, efficient, and reusable code. Participate in code reviews and follow best practices for software development. Troubleshoot, debug, and optimize applications for performance and security. Write unit and integration tests to ensure code reliability. Contribute to deployment processes and continuous integration pipelines. Stay updated with the latest trends and best practices in Django and web development.

Posted recently

Apply

8.0 - 13.0 years

5 - 10 Lacs

Bengaluru

Work from Office


6+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements. Ensure performance tuning, fault tolerance, and reliability of distributed data processing systems.

Posted recently

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office


8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Ensure compliance; implement monitoring and automation. Guide developers on schema design and query optimization. Conduct DB health audits and capacity planning. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or master's degree in computer science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.

Posted recently

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office


10+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements.

Posted recently

Apply

6.0 - 11.0 years

3 - 7 Lacs

Pune

Work from Office


Experience: 7-9 years. Must have experience in AWS services such as S3, Lambda, Airflow, Glue, Athena, Lake Formation, Step Functions, etc. Experience in programming in Java and Python. Experience performing data analysis (not data science) on AWS platforms.
Nice to have: Experience in Big Data technologies (Teradata, Snowflake, Spark, Redshift, Kafka, etc.). Experience with data management processes on AWS is a huge plus. Experience in implementing complex ETL transformations on AWS using Glue. Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
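The Glue ETL requirement above is about transformation logic of this general shape. As an illustration only: the field names and rules below are hypothetical, and a real Glue job would apply this kind of logic to a DynamicFrame or PySpark DataFrame rather than plain Python dicts.

```python
# Hypothetical sketch of the row-level logic a Glue-style ETL transform applies:
# reject incomplete rows, normalise types, and derive a partition key.
from datetime import datetime
from typing import Optional

def transform_order(record: dict) -> Optional[dict]:
    """Drop incomplete rows, normalise types, derive a partition key."""
    if not record.get("order_id") or record.get("amount") is None:
        return None  # incomplete row; a real job would route this to an error path
    ts = datetime.fromisoformat(record["created_at"])
    return {
        "order_id": str(record["order_id"]),
        "amount": round(float(record["amount"]), 2),
        "year_month": ts.strftime("%Y-%m"),  # e.g. used as an S3 partition key
    }

rows = [
    {"order_id": 1, "amount": "19.999", "created_at": "2024-05-01T10:00:00"},
    {"order_id": None, "amount": 5, "created_at": "2024-05-02T11:00:00"},
]
# Keep only rows that survive the transform.
clean = [r for r in (transform_order(x) for x in rows) if r is not None]
```

In an actual Glue script the same per-row function would typically be handed to `Map.apply` on a DynamicFrame or expressed as DataFrame operations.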

Posted recently

Apply

4.0 - 9.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Minimum 6 years of hands-on experience in data engineering or big data development roles. Strong programming skills in Python and experience with Apache Spark (PySpark preferred). Proficient in writing and optimizing complex SQL queries. Hands-on experience with Apache Airflow for orchestration of data workflows. Deep understanding and practical experience with AWS services:
Data storage & processing: S3, Glue, EMR, Athena
Compute & execution: Lambda, Step Functions
Databases: RDS, DynamoDB
Monitoring: CloudWatch
Experience with distributed data processing, parallel computing, and performance tuning. Strong analytical and problem-solving skills. Familiarity with CI/CD pipelines and DevOps practices is a plus.

Posted recently

Apply

14.0 - 19.0 years

13 - 18 Lacs

Hyderabad

Work from Office


10 years of hands-on experience in Java full stack (Java Spring Boot): Java 11, Spring Boot, Angular/React, REST APIs, Docker, Kubernetes, microservices. Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Should be a Java full stack developer. Bachelor's or master's degree in computer science or a related field. Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.

Posted recently

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office


Strong knowledge of AWS services, including but not limited to:
Hands-on AWS networking skills (e.g., VPC, subnets, NACLs, Transit Gateway, route tables, load balancers, Direct Connect gateway, Route 53, etc.). Thorough understanding of networking concepts, especially TCP/IP, IP addressing, and subnet calculation.
Solid experience with AWS security services: IAM (identity, resource, and service control policies, permission boundaries, roles, federation, etc.), security groups, KMS, ACM/ACM-PCA, Network Firewall, Config, GuardDuty, CloudTrail, Secrets Manager, Systems Manager (SSM), etc.
Good knowledge of various AWS integration patterns: Lambda with Amazon EventBridge, and SNS. Any workload-related experience is a bonus, e.g., EKS, ECS, Auto Scaling, etc. Containerisation experience with Docker and EKS (preferred).
Infrastructure as Code and scripting: Solid hands-on experience with declarative languages, Terraform (and Terragrunt preferred) and their capabilities. Comfortable with bash scripting and at least one programming language (Python or Golang preferred). Sound knowledge of secure coding practices and configuration/secrets management. Knowledge of writing unit and integration tests. Experience in writing infrastructure unit tests; Terratest preferred.
CI/CD: Solid understanding of CI/CD and zero-downtime deployment patterns. Experience with automated continuous integration testing, including security testing using SAST tools. Experience with automated CI/CD pipeline tooling; Codefresh preferred. Experience in creating runners and Docker images.
Version control: Experience using version control systems such as Git. Exposed to, and comfortable working on, large source code repositories in a team environment. Solid expertise with Git and Git workflows, working within mid-to-large (infra) product development teams.
General / infrastructure experience: Experience with cloud ops (DNS, backups, cost optimisation, capacity management, monitoring/alerting, patch management, etc.). Exposure to complex application environments, including containerised as well as serverless applications. Windows and/or Linux systems administration experience (preferred). Experience with Active Directory (preferred). Exposure to multi-cloud and hybrid infrastructure. Exposure to large-scale on-premise-to-cloud infrastructure migrations. Solid experience working with mission-critical production systems.

Posted recently

Apply

10.0 - 15.0 years

12 - 16 Lacs

Hyderabad

Work from Office


JD for Data Engineering Lead - Python: Data Engineering Lead with at least 7 to 10 years of experience in Python with the following AWS services: AWS SQS, AWS MSK, AWS RDS (Aurora DB), Boto3, API Gateway, and CloudWatch. Provide architectural guidance to the offshore team, review code, and troubleshoot errors. Very strong SQL knowledge is a must; should be able to understand and build complex queries. Familiar with GitLab (repos and CI/CD pipelines). Should work closely with the Virtusa onshore team as well as the enterprise architect and other client teams onsite as needed. Experience in API development using Python is a plus. Experience in building MDM solutions is a plus.
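Several roles above pair Python with AWS SQS via Boto3. As a minimal, hedged sketch of the consume-and-delete polling pattern such work involves: the client is injected so the sketch runs against a stub, and the queue URL and message shape are invented for illustration.

```python
import json

def drain_queue(sqs, queue_url: str, handle) -> int:
    """Poll an SQS queue until empty, handing each decoded message body to
    `handle` and deleting the message on success. `sqs` is anything with the
    boto3 SQS client's receive_message/delete_message interface."""
    processed = 0
    while True:
        resp = sqs.receive_message(QueueUrl=queue_url,
                                   MaxNumberOfMessages=10, WaitTimeSeconds=2)
        messages = resp.get("Messages", [])
        if not messages:
            return processed
        for msg in messages:
            handle(json.loads(msg["Body"]))
            sqs.delete_message(QueueUrl=queue_url,
                               ReceiptHandle=msg["ReceiptHandle"])
            processed += 1

class StubSQS:
    """In-memory stand-in so the sketch runs without AWS credentials."""
    def __init__(self, bodies):
        self._msgs = [{"Body": json.dumps(b), "ReceiptHandle": str(i)}
                      for i, b in enumerate(bodies)]
    def receive_message(self, **kw):
        batch, self._msgs = self._msgs[:10], self._msgs[10:]
        return {"Messages": batch} if batch else {}
    def delete_message(self, **kw):
        pass

seen = []
count = drain_queue(StubSQS([{"id": 1}, {"id": 2}]), "https://example/queue", seen.append)
```

Against real AWS, the stub would be replaced with `boto3.client("sqs")` and the real queue URL; the polling loop itself is unchanged.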

Posted recently

Apply

4.0 - 9.0 years

4 - 8 Lacs

Gurugram

Work from Office


Data Engineer
Location: PAN India
Work Mode: Hybrid
Work Timing: 2 pm to 11 pm
Primary Skill: Data Engineer

Responsibilities: Lead the architectural design and development of a scalable, reliable, and flexible metadata-driven data ingestion and extraction framework on AWS using Python/PySpark. Design and implement a customizable data processing framework using Python/PySpark; this framework should be capable of handling diverse scenarios and evolving data processing requirements. Implement data pipelines for data ingestion, transformation, and extraction leveraging AWS cloud services. Seamlessly integrate a variety of AWS services, including S3, Glue, Kafka, Lambda, SQS, SNS, Athena, EC2, RDS (Oracle, Postgres, MySQL), and AWS Crawler, to construct a highly scalable and reliable data ingestion and extraction pipeline. Facilitate configuration and extensibility of the framework to adapt to evolving data needs and processing scenarios. Develop and maintain rigorous data quality checks and validation processes to safeguard the integrity of ingested data. Implement robust error handling, logging, monitoring, and alerting mechanisms to ensure the reliability of the entire data pipeline.

Qualifications (must have): Over 6 years of hands-on experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark. Extensive AWS experience is mandatory, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres). 4+ years of experience working with both relational and non-relational/NoSQL databases is required. Strong SQL experience is necessary, demonstrating the ability to write complex queries from scratch. Strong working experience in Redshift is required, along with other SQL DB experience. Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture. Complete understanding of building an end-to-end data pipeline.

Nice to have: Strong understanding of Kinesis, Kafka, and CDK. A strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling. Experience in Node.js and CDK. Experience with Kafka and ECS.
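The "metadata-driven" framework this role describes usually means that each source is described by a config record and one generic pipeline function interprets it, instead of writing a bespoke job per source. A minimal plain-Python sketch of that idea; the names (`SourceConfig`, `target_prefix`) and the S3-style prefix are invented for illustration:

```python
# Table-driven ingestion sketch: the config record, not the code, decides how
# a payload is parsed and where it lands.
import csv
import io
import json
from dataclasses import dataclass

@dataclass
class SourceConfig:
    name: str           # logical source name
    fmt: str            # declared payload format: "csv" or "json"
    target_prefix: str  # illustrative stand-in for an S3 prefix

READERS = {
    "csv": lambda text: list(csv.DictReader(io.StringIO(text))),
    "json": lambda text: json.loads(text),
}

def ingest(config: SourceConfig, payload: str) -> dict:
    """Parse the payload according to the source's declared format and report
    where the rows would land; a real framework would write to S3/Glue here."""
    rows = READERS[config.fmt](payload)
    return {"target": f"{config.target_prefix}/{config.name}", "rows": rows}

result = ingest(SourceConfig("orders", "csv", "s3://lake/raw"),
                "id,amount\n1,10\n2,20")
```

Adding a new source then becomes a new `SourceConfig` row (and, at most, a new reader), which is the extensibility property the JD asks for.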

Posted recently

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office


Developer P3 C3 TSTS, Hybrid, US Shift
Primary Skills: DevOps and infrastructure engineering; CI/CD tools; AWS networking services, storage services, certificate management, secrets management, and database setup (RDS); Terraform/CloudFormation/AWS CDK; Python and Bash
Secondary Skills: Expertise in AWS CDK and CDK Pipelines for IaC. Understanding of logging and monitoring services like AWS CloudTrail, CloudWatch, GuardDuty, and other AWS security services. Communication and collaboration skills to work effectively in a team-oriented environment.
JD: Design, implement, and maintain cloud infrastructure using the AWS Cloud Development Kit (CDK). Develop and evolve Infrastructure as Code (IaC) to ensure efficient provisioning and management of AWS resources. Develop and automate Continuous Integration/Continuous Deployment (CI/CD) pipelines for infrastructure provisioning and application deployment. Configure and manage various AWS services, including but not limited to EC2, VPC, security groups, NACLs, S3, CloudFormation, CloudWatch, AWS Cognito, IAM, Transit Gateway, ELB, CloudFront, Route 53, and more. Collaborate with development and operations teams, bridging the gap between infrastructure and application development. Monitor and troubleshoot infrastructure performance issues, ensuring high availability and reliability. Implement proactive measures to optimize resource utilization and identify potential bottlenecks. Implement security best practices, including data encryption and adherence to security protocols. Ensure compliance with industry standards and regulations.
Must have: 5+ years of hands-on experience in DevOps and infrastructure engineering. Solid understanding of AWS services and technologies, including EC2, VPC, S3, Lambda, Route 53, and CloudWatch. Experience with CI/CD tools, DevOps implementation, and HA/DR setup. In-depth experience with AWS networking services, storage services, certificate management, secrets management, and database setup (RDS). Proven expertise in Terraform/CloudFormation/AWS CDK. Strong scripting and programming skills, with proficiency in languages such as Python and Bash.
Nice to have: Proven expertise in AWS CDK and CDK Pipelines for IaC. Familiarity with logging and monitoring services like AWS CloudTrail, CloudWatch, GuardDuty, and other AWS security services. Excellent communication and collaboration skills to work effectively in a team-oriented environment.

Posted recently

Apply

8.0 - 13.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Experience: 8 years of experience in data engineering, specifically in cloud environments like AWS. Proficiency in PySpark for distributed data processing and transformation. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.
Technical Skills: Proficiency in Python and PySpark for data processing and transformation tasks. Deep understanding of ETL concepts and best practices. Familiarity with AWS Glue (ETL jobs, Data Catalog, and Crawlers). Experience building and maintaining data pipelines with AWS Data Pipeline or similar orchestration tools. Familiarity with AWS S3 for data storage and management, including file formats (CSV, Parquet, Avro). Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with Data Warehousing and Big Data technologies, specifically within AWS.
Additional Skills: Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications.
Responsibilities:
Design and build data pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
ETL development: Develop and maintain extract, transform, and load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
Data workflow automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
Data integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
Optimization and scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.

Posted recently

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office


JD for API Developer: Python API developer with strong experience in building RESTful APIs and hands-on experience in AWS services, GitLab, and Terraform. Key skills required: Proficient in Python (Flask/FastAPI). Experience with AWS Lambda, API Gateway, and CloudWatch. Knowledge of AWS queues. Strong in error handling, exception management, and API response standardization. Familiar with GitLab (repos and CI/CD pipelines). Experience using Terraform for infrastructure as code. Understanding of CI/CD, cloud-native development, and DevOps best practices.
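"API response standardization" in postings like this usually means wrapping every handler result and exception in one consistent envelope. A framework-agnostic sketch, with an envelope shape and error codes chosen purely for illustration; in Flask or FastAPI the same wrapper would live in an error handler or middleware:

```python
import functools

def standardized(fn):
    """Wrap a handler so callers always get the same envelope shape,
    whether the call succeeded or raised."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return {"status": "ok", "data": fn(*args, **kwargs), "error": None}
        except KeyError as exc:   # domain-specific: missing resource
            return {"status": "error", "data": None,
                    "error": {"code": 404, "message": f"not found: {exc}"}}
        except Exception as exc:  # fallback: never leak a stack trace to clients
            return {"status": "error", "data": None,
                    "error": {"code": 500, "message": str(exc)}}
    return wrapper

USERS = {"u1": {"name": "Asha"}}  # stand-in data store

@standardized
def get_user(user_id: str) -> dict:
    return USERS[user_id]
```

The point of the decorator is that clients can branch on `status`/`error` without caring which handler they called.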

Posted recently

Apply

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office


8 years of hands-on experience in AWS, Kubernetes, Prometheus, CloudWatch, Splunk, Datadog, Terraform, scripting (Python/Go), and incident management. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Design and manage scalable CI/CD pipelines for cloud-native apps. Automate infrastructure using Terraform/CloudFormation. Implement container orchestration using Kubernetes and ECS. Ensure cloud security, compliance, and cost optimization. Monitor performance and implement high-availability setups. Collaborate with dev, QA, and security teams; drive architecture decisions. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or master's degree in computer science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.

Posted recently

Apply

8.0 - 13.0 years

9 - 13 Lacs

Hyderabad

Work from Office


Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Should be a Java full stack developer. Bachelor's or master's degree in computer science or a related field. 6+ years of hands-on experience in Java full stack (Angular + Java Spring Boot). Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills.

Posted recently

Apply

4.0 - 8.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Lead and manage a team of software developers. Work with AWS services and cloud technologies. Work closely with product management, helping define new features. Work closely with other technical leaders on the product. Work closely with the DevOps team to deliver monthly releases of the product. Experience in Java, building web services. Experience building RESTful APIs. Some experience with one or more of the following technologies: Spring, Spring Boot, Hibernate, Redis, Docker, AWS services. Must have a strong teamwork orientation and the ability to foster collaboration within and across teams. Thorough understanding of, and experience with, structured software development methodologies, including design, development, and testing in an Agile environment. Excellent work ethic and a strong sense of ownership of the end result.

Posted 1 hour ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Immediate job openings for AWS DevOps Engineer - Bangalore/Mumbai (Contract)
Experience: 5+ years
Skill: AWS DevOps Engineer
Location: Bangalore/Mumbai
Notice Period: Immediate
Employment Type: Contract
Job Description - AWS DevOps Engineer:
1. AWS/Ansible/YAML, a good understanding of any one programming language, and the ability to provision infrastructure on AWS using automated scripts.
2. Desired skills: Elasticsearch, Filebeat, Linux.
3. Experience with multiple programming languages (Groovy, YAML, Python) and experience with Amazon Web Services.
4. Strong problem-solving and analytical skills.
5. Manage the cloud reliability teams to provide strong managed-services support to end customers.
6. Responsible for automation using Ansible.
7. Handle code deployments in all environments and work towards CI/CD.
8. CI/CD engines: Jenkins, GitLab.
9. Configuration management: Ansible.
10. Logging and monitoring: Grafana.
Primary Skills: CI/CD pipelines; GitHub; Ansible; AWS services (EC2, RDS, VPC, S3, etc.)
Secondary Skills: Elasticsearch, Filebeat, Logstash

Posted 2 hours ago

Apply

9.0 - 12.0 years

4 - 8 Lacs

Hyderabad

Work from Office


We have immediate openings for an AWS IAM engineer.
JD:
Primary Skills: Extensive experience with AWS services: IAM, S3, Glue, CloudFormation, and CloudWatch. In-depth understanding of AWS IAM policy evaluation for permissions and access control. Proficient in using Bitbucket, Confluence, GitHub, and Visual Studio Code. Proficient in policy languages, particularly Rego scripting.
Good to Have Skills: Experience with the Wiz tool for security and compliance. Good programming skills in Python. Advanced knowledge of additional AWS services: ECS, EKS, Lambda, SNS, and SQS.
Roles & Responsibilities: Senior Developer on the Wiz team specializing in Rego and AWS. Project Manager - one to three years; AWS CloudFormation - four to six years; AWS IAM - four to six years. PSP-defined SCU in Data Engineering.
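The "IAM policy evaluation" skill called out above boils down to one rule: an explicit Deny beats any Allow, and the default is an implicit deny. A greatly simplified Python sketch of that decision logic, using a toy policy; real IAM evaluation also involves principals, conditions, permission boundaries, SCPs, and resource policies, none of which are modeled here:

```python
import fnmatch

def evaluate(policies, action: str, resource: str) -> str:
    """Toy IAM-style evaluation: explicit Deny wins, then Allow,
    else implicit deny. Wildcards use IAM-like '*' matching."""
    decision = "ImplicitDeny"
    for policy in policies:
        for stmt in policy["Statement"]:
            acts = stmt["Action"] if isinstance(stmt["Action"], list) else [stmt["Action"]]
            ress = stmt["Resource"] if isinstance(stmt["Resource"], list) else [stmt["Resource"]]
            if any(fnmatch.fnmatchcase(action, a) for a in acts) and \
               any(fnmatch.fnmatchcase(resource, r) for r in ress):
                if stmt["Effect"] == "Deny":
                    return "ExplicitDeny"  # deny overrides everything
                decision = "Allow"
    return decision

policy = {"Version": "2012-10-17", "Statement": [
    {"Effect": "Allow", "Action": "s3:*", "Resource": "arn:aws:s3:::app-bucket/*"},
    {"Effect": "Deny", "Action": "s3:DeleteObject", "Resource": "*"},
]}

decision = evaluate([policy], "s3:GetObject", "arn:aws:s3:::app-bucket/report.csv")
```

This is the same deny-overrides ordering a Rego policy check for IAM would typically encode.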

Posted 2 hours ago

Apply

6.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Candidate will be responsible for data preparation jobs, data analysis, and data reporting. Responsible for performing technical troubleshooting and providing technical expertise for ETL jobs. Designing and developing the ETL mapping in Talend, SQL, PL/SQL, etc. Preparing documentation like the mapping document, design document, and deployment document. Coordination with the support team, client, and third-party vendor on various aspects of the SDLC cycle, including requirements gathering. Attend the status / stand-up calls.
Primary Skills: Talend DI, SQL, PL/SQL, AWS services
Secondary Skill: Python
Primary skill is Talend; please suggest more suitable Talend/SQL profiles.

Posted 2 hours ago

Apply

6.0 - 11.0 years

3 - 6 Lacs

Hyderabad

Hybrid


Bachelor's degree in Computer Science, Engineering, or a related field. 4-6 years of professional experience with AWS Lambda and serverless architecture. Proficiency in Python programming. Strong experience with shell scripting and SQL. Experience working in a production environment and well-versed in ITIL processes. Excellent communication and interpersonal skills. Experience with Oracle BRM is an advantage but not mandatory. Familiarity with other AWS services (e.g., S3, DynamoDB, API Gateway) is desirable. Ability to work independently and in a team environment.
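A Python Lambda role like the one above centres on handler functions of this shape. A minimal sketch: the event follows the API Gateway proxy-integration convention (`body`, `statusCode`), while the `"name"` field is invented for illustration. Because the handler is just a function, it runs locally exactly as Lambda would invoke it:

```python
import json

def handler(event, context):
    """Validate the request body and return an API Gateway-style response."""
    try:
        body = json.loads(event.get("body") or "{}")
        name = body["name"]
    except (json.JSONDecodeError, KeyError):
        # Bad input gets a 400 rather than an unhandled exception.
        return {"statusCode": 400,
                "body": json.dumps({"error": "name is required"})}
    return {"statusCode": 200,
            "body": json.dumps({"message": f"hello {name}"})}

# Local invocation with a fake event, the same call Lambda makes.
resp = handler({"body": json.dumps({"name": "lambda"})}, None)
```

Keeping the handler free of AWS SDK calls at the edges like this is what makes such functions easy to unit-test outside the Lambda runtime.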

Posted 2 hours ago

Apply

7.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Urgent requirement for a DevOps Engineer.
Location: Bangalore
Mandatory Skills: AWS services, Ansible, Grafana, Terraform, Jenkins, Kubernetes, GitHub, Vault, JFrog, Linux services
Shift Timings: General shift
Position general duties and tasks: Below is the list of tools (but not limited to) for which knowledge and exposure are preferred:
1. AWS services (RDS, EC2, S3, Kubernetes, SNS, IAM, AWS CLI, Route 53, Lambda, IoT, Greengrass, Kafka, VPC, security groups)
2. Ansible
3. Grafana
4. Terraform
5. Terragrunt
6. Jenkins

Posted 3 hours ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Noida, Gurugram

Work from Office


5+ years of overall technical experience. Minimum 2+ years of experience in Node.js. Minimum 1 year of relevant experience in team management. Experience with Angular/React.js and Lambda services; PHP is an added advantage. Experience working with MySQL, MongoDB, DynamoDB, and AWS services. Hands-on experience in object-oriented JavaScript, ES6, and TypeScript.
Pratibha Tanwar

Posted 19 hours ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Gurugram

Work from Office


Backend Developer - Node.js + MongoDB + MySQL, 5+ years' experience, product development
Description:
- Design, implement, and support the technical solution. Participate actively in all phases of the application development lifecycle.
- Analyze and improve the software architecture with a focus on maintainability and scalability.
- Mentor and guide the team, including performing code reviews and pair programming.
- Work with user experience designers to ensure all user interactions are implemented correctly and optimized for performance so that we can build an amazing user experience.
- Must have experience in effort estimation and project estimation.
- Drive the team on the test-driven development approach.
- Ability to manage multiple priorities and projects, and able to clearly define different delivery options.
Preferred Skills:
- 5+ years of overall technical experience
- Minimum 2+ years of experience in Node.js
- Minimum 1 year of relevant experience in team management
- Experience with Angular/React.js and Lambda services; PHP is an added advantage
- Experience working with MySQL, MongoDB, and DynamoDB
- Experience working with AWS services
- Hands-on experience in object-oriented JavaScript, ES6, and TypeScript

Posted 19 hours ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Cloud Contact Center Implementation Good to have skills : Java, AWS Architecture, React.jsMinimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time educationAWS Connect DeveloperLevel:- AMExperience:- Above 9 Years (AM) Summary :As an AWS Connect Developer you should Design, develop, and deploy contact center solutions using AWS Connect. You should be aware of cloud contact center operations and customer service processes.Desired Responsibilities:Design, develop, and deploy contact center solutions using AWS Connect.Customize and configure AWS Connect to meet the specific requirements of various projects.Integrate AWS Connect with other AWS services such as Lambda, DynamoDB, and S3.Develop and maintain Interactive Voice Response (IVR) systems and other customer service workflows.Implement security best practices to ensure data protection and compliance.Troubleshoot and resolve issues related to AWS Connect deployments.Collaborate with business stakeholders to gather requirements and translate them into technical specifications.Monitor and optimize the performance of AWS Connect solutions.Create and maintain documentation for system configurations, processes, and procedures.Stay updated with the latest AWS services and best practices. 
Technical Experience:
- 3+ years of experience developing solutions with AWS services.
- Hands-on experience with AWS Connect, including setup, configuration, and troubleshooting.
- Proficiency in programming languages such as Python, JavaScript, or Node.js.
- Experience with AWS Lambda, DynamoDB, S3, and other AWS services.
- Knowledge of contact center operations and customer service processes.

Professional Attributes:
- Minimum 9 years of hands-on experience with one or more of the products mentioned above.
- Team handling / people management, with excellent communication skills.
- Previous experience working with cross-geography teams.
- Ready to work in shifts, including nights.

Education: 15 years of full-time education; Bachelor's degree or higher in Science/Computer Science/Electronics or any other relevant field. Bachelor's degree in engineering preferred.

Qualification: 15 years of full-time education

Posted 20 hours ago


5.0 - 10.0 years

9 - 13 Lacs

Noida, Gurugram

Work from Office


About the Role: Grade Level (for internal use): 12

Key Responsibilities:
- Lead the implementation and maintenance of security best practices to protect sensitive data across our systems.
- Collaborate with development teams to integrate security into the software development lifecycle (SDLC) and drive security initiatives.
- Automate deployment processes and manage infrastructure as code using tools such as Terraform or CloudFormation.
- Monitor system performance and security, responding to incidents and vulnerabilities promptly and effectively.
- Develop and maintain CI/CD pipelines to ensure efficient and secure software delivery.
- Conduct security assessments, audits, and penetration testing to identify and mitigate risks.
- Mentor and guide junior engineers, fostering a culture of continuous learning and best practices in DevSecOps.
- Collaborate with cross-functional teams to ensure compliance with industry standards and regulations.

Qualifications:
- Proficient in scripting languages and familiar with Java.
- Strong understanding of cloud security practices and AWS services.
- Extensive experience with DevOps tools.
- Knowledge of data security frameworks and compliance standards.
- Strong leadership, problem-solving skills, and the ability to communicate effectively with team members.

Why Join Us:
- Opportunity to lead and work on dynamic and innovative projects.
- Collaborate with a talented team of engineers and industry experts.
- Contribute to the growth and security of our organization.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter.
It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People and Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law.
Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)

Posted 21 hours ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
