Data Engineer - AWS

5 - 8 years

Posted: 5 days ago | Platform: LinkedIn

Work Mode: Remote

Job Type: Full Time

Job Description

Position: Data Engineer

Experience: 5-8 Years

Location: Work From Home

Job Summary

We are seeking a skilled Data Engineer with 5-8 years of experience to join our remote team. The ideal candidate will have extensive experience with AWS Glue and a strong background in building and maintaining robust data pipelines. You will be responsible for designing, developing, and optimizing our data infrastructure to support business intelligence and analytics. This role requires proficiency in Python, SQL, Kafka, and Apache Airflow, along with a commitment to implementing DataOps best practices for continuous delivery.

Key Responsibilities

ETL & Data Integration

  • Design, build, and maintain scalable data pipelines using AWS Glue for data integration and transformation.
  • Utilize Python for scripting, automation, and custom data processing tasks.
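
For illustration only, a minimal sketch of the kind of AWS Glue job this pipeline work involves; the catalog database ("raw"), table ("orders"), column mappings, and S3 path are hypothetical placeholders, not details of the role.

import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job boilerplate: resolve job arguments and initialise the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table from the Glue Data Catalog (hypothetical names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw", table_name="orders"
)

# Simple transformation: rename/retype columns before loading.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "string", "amount", "double"),
    ],
)

# Write the curated output to S3 as Parquet (hypothetical bucket/path).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()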

Workflow & Orchestration

  • Orchestrate complex data workflows using Apache Airflow to ensure efficient and reliable data delivery.
  • Implement and manage real-time data streaming and processing using Kafka.
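
As a sketch of the orchestration side, a minimal daily Airflow DAG; the DAG id, task names, and callables are hypothetical placeholders, and in practice the Kafka streaming work would be wired in alongside batch DAGs like this one.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders():
    # Placeholder: pull data from a source system; real logic depends on the pipeline.
    pass

def load_warehouse():
    # Placeholder: load the transformed data into the warehouse.
    pass

with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)

    # Extraction must finish before loading starts.
    extract >> load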

Data Management & Quality

  • Use advanced SQL skills to query, manage, and optimize relational databases.
  • Ensure data quality and accuracy across all data pipelines and systems.
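
As a sketch of how such SQL-based quality checks might be automated, assuming a warehouse reachable via psycopg2; the schema, table, and rule definitions are illustrative only.

import psycopg2

# Each check counts rows that violate a rule; any non-zero count fails the run.
QUALITY_CHECKS = {
    "no_null_order_ids": "SELECT COUNT(*) FROM curated.orders WHERE order_id IS NULL",
    "no_negative_amounts": "SELECT COUNT(*) FROM curated.orders WHERE amount < 0",
}

def run_checks(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for name, sql in QUALITY_CHECKS.items():
            cur.execute(sql)
            bad_rows = cur.fetchone()[0]
            if bad_rows:
                raise ValueError(f"Quality check '{name}' failed: {bad_rows} offending rows")
            print(f"{name}: OK")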

DataOps & Collaboration

  • Apply DataOps tools and methodologies for continuous integration and delivery (CI/CD) in data engineering.
  • Collaborate with cross-functional teams to understand data requirements and deliver on business objectives.

Required Skills

Core Experience

  • 5-8 years of experience as a Data Engineer.

Technical Proficiency

  • Extensive hands-on experience with AWS Glue.
  • Strong proficiency in Apache Airflow for workflow orchestration.
  • In-depth knowledge of Kafka for real-time data streaming.
  • Advanced SQL skills.
  • Proficiency in Python for scripting and automation.

Databases

  • Familiarity with SAP HANA for data storage and management.

Preferred Skills

  • Knowledge of Snowflake for cloud-based data warehousing.
  • Experience with other AWS data services like Redshift, S3, and Athena.
  • Familiarity with big data technologies such as Hadoop, Spark, and Hive.
  • Experience with DataOps tools and methodologies.
(ref:hirist.tech)
