AWS Data Engineer

3 years

0 Lacs

Posted: 3 hours ago | Platform: GlassDoor


Work Mode

On-site

Job Type

Part Time

Job Description

AWS Data Engineer (ETL Specialist)
Experience: 3-5 years (minimum 2+ years hands-on experience with AWS-native data engineering tools)

Location: Gigaplex, Airoli West
Role Overview:
The AWS Data Engineer will design, develop, and manage scalable data pipelines and analytics infrastructure in a cloud-native environment. This engineer will be responsible for architecting complex ETL processes using AWS-managed services, optimizing data performance, and ensuring data quality, security, and observability across multiple systems. The ideal candidate has deep AWS knowledge, strong ETL design experience, and a solid grasp of modern data engineering practices.
Key Responsibilities:
  • Design and implement end-to-end ETL workflows leveraging AWS services such as Glue, Lambda, Step Functions, EMR, Redshift, Kinesis, and S3.
  • Develop and maintain data ingestion pipelines from structured, semi-structured, and streaming data sources.
  • Design and maintain data lake and data warehouse solutions (S3, Redshift, Lake Formation).
  • Build transformation logic with PySpark, SQL, or Python, ensuring performance and integrity.
  • Orchestrate workflows using AWS Glue Workflows, Apache Airflow, or Step Functions.
  • Implement data quality validations, monitoring frameworks, and automated alerts for pipeline health.
  • Collaborate with data scientists, analysts, and application engineering teams to ensure data accessibility and alignment with analytics use cases.
  • Ensure compliance with data governance and security frameworks (IAM, encryption, GDPR/HIPAA as applicable).
  • Participate in data architecture reviews, contributing to design best practices for reliability and scalability.
  • Document all data flows, transformations, and pipeline specifications for reproducibility and audits.
Required Technical Skills:
  • Strong development background in Python and SQL.
  • Expertise with AWS data services: Glue, Redshift, EMR, S3, RDS, Lambda, Kinesis, CloudWatch, and CloudFormation.
  • Deep understanding of ETL/ELT design patterns, including incremental loads and change data capture (CDC).
  • Familiarity with data modelling (Star/Snowflake schemas) and data lakehouse architectures.
  • Experience working with large-scale or real-time datasets.
  • Knowledge of data quality frameworks and data observability tools.
  • Comfort with DevOps and CI/CD workflows using Git, CodePipeline, or Terraform.
  • Advanced understanding of data security practices in AWS (IAM roles, encryption, network isolation).
Desired Skills:
  • Hands-on experience with Snowflake, Databricks, or Athena.
  • Familiarity with BI/analytics tools (QuickSight, Power BI, Tableau).
  • AWS certifications such as AWS Certified Data Engineer – Associate or Data Analytics Specialty.
  • Strong analytical and communication skills to translate business data needs into engineering solutions.
Educational Requirements:
  • Master's or Bachelor's degree in Computer Science, Data Engineering, or a related technical field.
  • AWS Data Engineering or Data Analytics certification preferred.

Company

CrossAsyst

Information Technology & Services
