AWS Data Engineer

Experience: 5 - 8 years

Salary: 10 - 20 Lacs


Work Mode: Remote

Job Type: Full Time

Job Description

AWS Data Engineer

Key Responsibilities

  • Design and implement scalable, secure, and high-performance data architectures to support enterprise data warehousing and analytics.
  • Develop and maintain ETL/ELT pipelines leveraging AWS services such as Glue, Redshift, Lambda, DMS, and S3.
  • Utilize Python, PySpark, and SQL to build data transformation and ingestion workflows (a minimal PySpark sketch follows this list).
  • Optimize data flows and query performance for high-volume, real-time, and batch data processing.
  • Implement and maintain data quality, validation, and reconciliation frameworks to ensure accuracy and reliability of data.
  • Work closely with Data Architects, Analysts, and Product Teams to understand business needs and translate them into data-driven solutions.
  • Apply data warehousing best practices, including star/snowflake schema design, partitioning, indexing, and performance tuning.
  • Participate in CI/CD workflows and use version control systems (Git, CodeCommit) to ensure smooth data pipeline deployment and versioning.
  • Monitor, troubleshoot, and enhance existing pipelines for performance, scalability, and cost optimization.
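
For illustration only, a minimal sketch of the kind of PySpark ingestion workflow described above: read semi-structured JSON from S3, apply basic cleansing, and write partitioned Parquet back to S3. Bucket names, paths, and column names are hypothetical placeholders, not this employer's actual pipeline.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

    # Read raw semi-structured JSON landed by DMS or an upstream producer
    # (placeholder bucket and prefix).
    raw = spark.read.json("s3://example-raw-bucket/orders/")

    # Basic typing, cleansing, and de-duplication before the warehouse layer.
    clean = (
        raw.withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("order_id").isNotNull())
           .dropDuplicates(["order_id"])
    )

    # Partition by date so downstream Glue / Redshift Spectrum queries can prune files.
    (clean.write
          .mode("overwrite")
          .partitionBy("order_date")
          .parquet("s3://example-curated-bucket/orders/"))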

Required Qualifications

  • 5+ years of experience as a Data Engineer, with a strong focus on AWS-based data platforms.
  • Deep understanding of the AWS data ecosystem, including Redshift, Glue, S3, DMS, and Lambda.
  • Proficiency in Python, PySpark, and SQL for data processing and transformation.
  • Strong foundation in data warehousing concepts and design techniques (e.g., dimensional modeling, normalization, denormalization).
  • Experience with workflow orchestration tools such as Apache Airflow or Amazon Managed Workflows for Apache Airflow (MWAA); see the DAG sketch after this list.
  • Familiarity with Terraform and CI/CD pipelines for automating data infrastructure deployments.
  • Hands-on experience with Git-based version control and collaborative development practices.
  • Proven experience working with JSON and other semi-structured data formats.
  • Excellent analytical thinking, problem-solving, and debugging skills.
  • Strong written and verbal communication skills to collaborate with cross-functional stakeholders.
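
As a further illustration, a minimal Airflow 2.x DAG sketch for the orchestration pattern mentioned above: trigger a Glue job, then run a reconciliation step. The job name, schedule, and region are hypothetical placeholders, and it assumes the apache-airflow-providers-amazon package is installed for GlueJobOperator.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.amazon.aws.operators.glue import GlueJobOperator


    def validate_row_counts(**context):
        # Placeholder reconciliation step, e.g. compare source vs. target row counts.
        print("row-count reconciliation would run here")


    with DAG(
        dag_id="orders_daily_load",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",   # Airflow 2.4+ style schedule argument
        catchup=False,
    ) as dag:
        run_glue_job = GlueJobOperator(
            task_id="run_orders_glue_job",
            job_name="orders-curation-job",   # hypothetical Glue job name
            region_name="us-east-1",
        )

        reconcile = PythonOperator(
            task_id="reconcile_counts",
            python_callable=validate_row_counts,
        )

        run_glue_job >> reconcile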

Soft Skills

  • Strong ownership mindset and ability to work independently in fast-paced environments.
  • Collaborative attitude and willingness to mentor junior team members.
  • Attention to detail with a focus on delivering high-quality, production-ready data solutions.

Company: Vrize (IT Services and IT Consulting)
