AWS Cloud Engineer

3 - 12 years

5 - 10 Lacs


Work Mode: Work from Office

Job Type: Full Time

Job Description

Client of Alp Consulting Ltd, a market-leading content and data technology company providing data services, subject-matter expertise, and technology solutions across multiple domains.
Position: AWS Cloud Engineer

Job location: Bangalore, Chennai, Hyderabad, Mumbai, Kolkata, Gurgaon, Noida
Mode of work: Remote

Experience: 5 to 12 years

Qualification: BE/BTech/ME/MTech/MCA (Full-time & regular)



A Cloud Engineer is responsible for designing, building, and maintaining cloud-based infrastructure and processes to support data and application solutions. The role emphasizes implementing governance, best practices, and security measures while optimizing cloud costs. This individual will leverage Infrastructure as Code (IaC) and Continuous Integration/Continuous Deployment (CI/CD) pipelines to ensure scalability, efficiency, and security. They will work as part of the platform team in close collaboration with various groups, including data governance and data engineering.
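For illustration only, and not part of the job description itself: a minimal sketch of what "IaC plus CI/CD pipelines" can look like with the AWS CDK in Python, where the pipeline is itself defined as code. The repository name and CodeStar connection ARN below are placeholders, and a real setup would add deployment stages and tests.

import aws_cdk as cdk
from aws_cdk import pipelines
from constructs import Construct

class PlatformPipelineStack(cdk.Stack):
    """Self-mutating CI/CD pipeline (CodePipeline + CodeBuild) defined as code."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        pipelines.CodePipeline(
            self, "Pipeline",
            synth=pipelines.ShellStep(
                "Synth",
                # Placeholder repository and connection ARN
                input=pipelines.CodePipelineSource.connection(
                    "example-org/platform-infra", "main",
                    connection_arn="arn:aws:codestar-connections:us-east-1:111111111111:connection/example",
                ),
                commands=[
                    "npm install -g aws-cdk",
                    "pip install -r requirements.txt",
                    "cdk synth",
                ],
            ),
        )

app = cdk.App()
PlatformPipelineStack(app, "PlatformPipelineStack")
app.synth()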

Roles and Responsibilities:

Collaborate closely with data teams to support the development and deployment of innovative and efficient data solutions.

Respond to and fulfill platform requests from various NFL data teams, including internal business stakeholders, data analytics professionals, data engineers, and quality assurance teams.

Required Skill Sets:

Hands-on experience with AWS, including familiarity with the following services:

Analytics: Athena, Glue, Redshift

Application Integration: EventBridge, MWAA, SNS, SQS

Compute: EC2, Lambda

Containers: ECR, ECS

Database: DynamoDB, RDS

Developer Tools: CDK, CloudFormation, CodeBuild, CodeCommit, CodePipeline

Management & Governance: CloudTrail, CloudWatch

Network: API Gateway, VPC

Security, Identity & Compliance: IAM, KMS, Secrets Manager

Storage: S3

Well-versed in the core principles of the AWS Well-Architected Framework, encompassing its five foundational pillars: Operational Excellence, Security, Reliability, Performance Efficiency, and Cost Optimization.

Strong problem-solving abilities and a passion for tackling complex challenges collaboratively and independently.

Proactive, detail-oriented self-starter with excellent organizational skills.

Exceptional communication skills, with the ability to present findings effectively to both technical and non-technical audiences.

Previous experience in AWS infrastructure operations, including monitoring, troubleshooting, and supporting cross-functional teams.

Proficiency in optimizing AWS resources to enhance performance and scalability while reducing costs.

Working knowledge of CI/CD pipelines and build/test/deploy automation tools.

Experience in environments leveraging IaC tools (e.g., CDK and AWS CloudFormation).

Proficient in Python, with intermediate-level expertise in developing scripts, automating workflows, and implementing solutions using the AWS Cloud Development Kit (CDK) and boto3 (see the sketch after this list).

Extensive expertise in implementing security best practices, including the design and management of AWS IAM policies, encryption using AWS Key Management Service (KMS), and robust data protection mechanisms. Adept at adhering to the principle of least privilege to ensure secure and efficient access control.

Active participation in defining cloud strategies and evaluating emerging technologies and AWS services.

Proficient in working within agile environments, with a strong understanding of agile methodologies.

Experienced in utilizing JIRA to manage backlogs and sprints, as well as leveraging JIRA Service Management to support ticketing and operational workflows.
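As referenced above, a minimal illustrative sketch (placeholder names, not a prescribed design) of the kind of Python CDK and security work described in the Required Skill Sets: a KMS-encrypted S3 bucket and a least-privilege IAM role for a hypothetical ingestion Lambda.

import aws_cdk as cdk
from aws_cdk import (
    Stack,
    aws_iam as iam,
    aws_kms as kms,
    aws_s3 as s3,
)
from constructs import Construct

class DataPlatformStack(Stack):
    """KMS-encrypted bucket plus a least-privilege role for a hypothetical ingest Lambda."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Customer-managed key with automatic rotation enabled
        key = kms.Key(self, "DataKey", enable_key_rotation=True)

        # Bucket encrypted with that key, public access blocked, TLS enforced
        bucket = s3.Bucket(
            self, "DataBucket",
            encryption=s3.BucketEncryption.KMS,
            encryption_key=key,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            enforce_ssl=True,
        )

        # Role assumed by a (hypothetical) ingestion Lambda; granted only the
        # S3 and KMS permissions it needs rather than wildcard access
        ingest_role = iam.Role(
            self, "IngestRole",
            assumed_by=iam.ServicePrincipal("lambda.amazonaws.com"),
        )
        bucket.grant_read_write(ingest_role)

app = cdk.App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()

The same resources could then be scripted operationally with boto3 (for example, a compliance check calling boto3.client("s3").get_bucket_encryption(Bucket=...)), which is left out of the sketch.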

Preferred Qualifications:

A university degree in Computer Science, Engineering, or a related field.

A minimum of 3 years of experience in data engineering or production support.

Further hands-on experience with AWS, including familiarity with the following additional services (nice to have, not required):

Analytics: Clean Rooms; Kinesis (Analytics/Flink, Data Streams & Firehose); MSK; QuickSight; SageMaker

Application Integration: Step Functions

Compute: App Runner, Batch, EC2 Image Builder

Containers: EKS

Database: Aurora, ElastiCache

Developer Tools: CodeArtifact, CodeDeploy, X-Ray

Management & Governance: AWS Organizations, Grafana, Lake Formation, Prometheus, Systems Manager

Network & Content Delivery: CloudFront, Route 53

Security, Identity & Compliance: Certificate Manager, Cognito, Identity Center, WAF & Shield

Storage: AWS Backup; S3 (specifically with Hudi, Iceberg, or other open table data lake formats); Glacier

Experienced in building data platforms, with a strong understanding of Data Lake, Data Warehouse, and Lakehouse architectures.

Possess intermediate proficiency in SQL, with the ability to write and optimize queries for data retrieval, transformation, and analysis.

Demonstrated expertise in resiliency planning and the development and implementation of disaster recovery strategies.

Proven ability to implement integration services with secure authentication and authorization protocols, such as OAuth 2.0, OpenID Connect (OIDC), and SAML.
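As an illustrative sketch of the OAuth 2.0 / OIDC integration point above (not a prescribed implementation): verifying an OIDC ID token issued by a hypothetical Amazon Cognito user pool, using PyJWT's JWKS client. The region, user pool ID, and app client ID are placeholders.

import jwt
from jwt import PyJWKClient

# Placeholder identifiers for a hypothetical Cognito user pool
REGION = "us-east-1"
USER_POOL_ID = "us-east-1_example"
APP_CLIENT_ID = "example-app-client-id"

ISSUER = f"https://cognito-idp.{REGION}.amazonaws.com/{USER_POOL_ID}"
jwks_client = PyJWKClient(f"{ISSUER}/.well-known/jwks.json")

def verify_id_token(token: str) -> dict:
    """Check the token's signature, expiry, issuer, and audience; return its claims."""
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=APP_CLIENT_ID,
        issuer=ISSUER,
    )

In practice, an API Gateway Cognito or JWT authorizer could perform this check before a request reaches application code.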
