AWS Data Engineer - Integration & Modelling

5-8 years


Posted: 1 day ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

We are seeking an experienced AWS Data Engineer to design, build, and maintain scalable data pipelines and cloud-based data platforms on Amazon Web Services (AWS). The ideal candidate will have strong expertise in data integration, data modeling, ETL development, and cloud data architecture, with a focus on performance, scalability, and security.

Key Responsibilities

  • Design and implement data ingestion, transformation, and storage pipelines using AWS services such as Glue, Lambda, EMR, Redshift, and S3 (a minimal sketch follows this list).
  • Develop and optimize ETL/ELT workflows to support analytics, data science, and reporting requirements.
  • Collaborate with data scientists, analysts, and business teams to understand data needs and ensure reliable data delivery.
  • Build and maintain data lake and data warehouse architectures on AWS.
  • Work with both structured and unstructured data, ensuring high quality, consistency, and availability.
  • Manage data security, governance, and compliance according to organizational standards.
  • Implement data validation, quality checks, and monitoring frameworks for pipelines.
  • Optimize performance and cost across storage, compute, and data processing layers.
  • Leverage Infrastructure as Code (IaC) tools like Terraform or CloudFormation for environment setup and automation.
  • Support DevOps and CI/CD practices for automated data pipeline deployments.
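As a rough illustration of the pipeline and data-quality responsibilities above, here is a minimal PySpark sketch of the kind of script an AWS Glue or EMR job would run. The bucket names, paths, and the orders dataset with its columns are hypothetical placeholders, not part of this role's actual platform.

```python
# Minimal ETL sketch in PySpark, as a Glue/EMR job might run it.
# All bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Ingest: read raw JSON landed in an S3 data lake zone (placeholder path).
raw = spark.read.json("s3://example-raw-bucket/orders/2024/")

# Transform: basic cleansing and typing.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .dropDuplicates(["order_id"])
)

# Data-quality check: fail fast if required fields are missing.
bad_rows = orders.filter(F.col("order_id").isNull() | F.col("amount").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed validation; aborting load")

# Load: write partitioned Parquet to the curated zone for Athena/Redshift Spectrum.
(orders.withColumn("order_date", F.to_date("order_ts"))
       .write.mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://example-curated-bucket/orders/"))
```

In an actual Glue job the same flow would typically use the GlueContext, DynamicFrames, and the Glue Data Catalog rather than plain Spark reads and writes, but the ingest-validate-transform-load pattern is the same.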

Required Skills & Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
  • 5-8 years of professional experience in data engineering, ETL development, or cloud data solutions.
  • Hands-on expertise in AWS data services such as AWS Glue, S3, Lambda, Redshift, EMR, Athena, Step Functions, and Kinesis.
  • Strong proficiency in SQL and Python for data processing and automation.
  • Solid understanding of data modeling (OLTP and OLAP), data warehousing concepts, and performance tuning.
  • Experience with ETL tools (AWS Glue, Talend, Informatica, dbt, or similar).
  • Familiarity with big data technologies such as Spark, Hadoop, or PySpark.
  • Knowledge of version control (Git) and CI/CD pipelines for data projects.
  • Strong understanding of data security, encryption, and IAM policies in AWS (see the sketch after this list).
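As one concrete example of the security expectation above, the following boto3 sketch writes an object to S3 with SSE-KMS server-side encryption requested on the put. The bucket name and KMS key alias are placeholders for illustration only.

```python
# Sketch: uploading data to S3 with SSE-KMS encryption requested explicitly.
# Bucket name and KMS key alias are placeholders.
import json
import boto3

s3 = boto3.client("s3")

payload = json.dumps({"order_id": 123, "amount": "49.99"})

s3.put_object(
    Bucket="example-curated-bucket",
    Key="orders/sample.json",
    Body=payload.encode("utf-8"),
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="alias/example-data-key",
)
```

In practice this is usually paired with a bucket policy or IAM condition that denies unencrypted uploads, so encryption is enforced rather than left to each client.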

Preferred Skills

  • Experience with streaming data solutions (Kafka, Kinesis Data Streams, or AWS MSK).
  • Familiarity with modern data stack tools like dbt, Airflow, Snowflake, or Databricks.
  • Exposure to MLOps or data science pipeline integration.
  • Knowledge of API-based data integration and RESTful services.
  • AWS certification such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect.

Skills

  • Strong problem-solving and analytical abilities.
  • Excellent communication and collaboration skills.
  • Ability to work independently and deliver in agile environments.
  • Detail-oriented with a focus on data quality and reliability.
(ref:hirist.tech)
