AWS Data Engineer

6 - 7 years

Posted: 2 weeks ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

Job Title :

AWS Data Engineer

Location :

Hyderabad (Hybrid)

Experience Required :

6 to 7 Years

Employment Type :

Permanent

We are looking for a highly skilled and experienced AWS Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building scalable data pipelines and data integration, along with hands-on experience with AWS cloud services. You will play a critical role in designing, developing, and optimizing data solutions to support analytics and business intelligence efforts across the organization.

Responsibilities :
  • Design, develop, and maintain scalable and robust data pipelines using AWS cloud-native tools.
  • Build and manage ETL/ELT processes to ingest and transform structured and unstructured data.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver reliable datasets.
  • Implement best practices for data engineering, including data quality, governance, testing, and security.
  • Monitor data workflows and troubleshoot issues across the pipeline lifecycle.
  • Work on performance tuning, scalability, and cost optimization of data processes on AWS.
  • Create and maintain technical documentation related to data pipelines and infrastructure.
  • Contribute to continuous integration and deployment (CI/CD) automation for data pipelines.

Skills :
  • 6 to 7 years of overall experience in data engineering.
  • Strong expertise in AWS data services such as AWS Glue, Redshift, S3, Lambda, Step Functions, and Athena.
  • Proficiency in Python or Scala for ETL development and data transformation.
  • Solid experience with SQL for data manipulation and querying.
  • Experience with data lake and data warehouse architecture.
  • Good understanding of data modeling concepts and performance tuning.
  • Hands-on experience with version control tools like Git and CI/CD pipelines.
  • Familiarity with tools like Airflow, DBT, or similar workflow orchestration frameworks is a plus.
  • Excellent problem-solving, analytical, and communication skills.

Nice to Have :
  • Experience with big data technologies like Spark, Kafka, or Hadoop.
  • Exposure to DevOps practices and infrastructure-as-code tools like Terraform or CloudFormation.
  • Knowledge of data security, GDPR, and compliance requirements.

Qualification :
  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

Why Join Us?

  • Work with a dynamic and collaborative team focused on cutting-edge data solutions.
  • Opportunity to contribute to high-impact projects in a cloud-first environment.
  • Flexible hybrid working model with a long-term career growth path.
(ref:hirist.tech)
