Senior Data Engineer – Data Lakehouse

Experience: 8 years


Posted: 3 weeks ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Job Title: Senior Data Engineer – AWS & Data Lakehouse Specialist

Job Location:

Working Mode: Remote

Experience: 8+ years overall (4–5 years relevant AWS experience)

Working Hours:

Interview Process: 2-


Job Title: Senior Data Engineer – AWS & Data Lakehouse Specialist

Requirement Summary:

We are seeking a highly skilled and motivated Senior Data Engineer with 8+ years of overall experience, including 4–5 years of hands-on experience building scalable data solutions on AWS. The ideal candidate will be proficient in Python, PySpark, and AWS Glue, with a strong understanding of Data Lakehouse architecture, especially the medallion model. You will play a key role in designing, developing, and optimizing data pipelines, ensuring data quality, and implementing infrastructure as code using Terraform.


Key Responsibilities

  • Design and implement scalable ETL/ELT pipelines using AWS Glue and PySpark (see the sketch after this list)
  • Develop and maintain data workflows using AWS Glue DataBrew and AWS Data Quality services
  • Architect and manage Data Lakehouse solutions following the medallion architecture (Bronze, Silver, Gold layers)
  • Optimize data lake performance (Parquet formats, partitioning strategies, DPU tuning)
  • Implement S3 data encryption and security best practices
  • Automate infrastructure provisioning using Terraform (IaC)
  • Collaborate with data analysts, scientists, and business stakeholders to deliver clean, reliable data
  • Integrate and manage DBT workflows for data transformation and modeling (preferred)
  • Monitor, troubleshoot, and enhance data pipeline reliability and performance
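
For context on the pipeline work above, here is a minimal PySpark sketch of a Bronze-to-Silver medallion transform that writes partitioned Parquet. It assumes a plain Spark session (in this role the same logic would typically run as an AWS Glue job), and the bucket, paths, and column names are illustrative placeholders rather than details from this posting.

```python
# Illustrative sketch only -- bucket, paths, and columns are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze-to-silver-orders").getOrCreate()

# Bronze layer: raw landings, read as-is
bronze = spark.read.json("s3://example-lakehouse/bronze/orders/")

# Silver layer: de-duplicated, typed, quality-filtered records
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

# Partitioned Parquet keeps downstream scans (and Glue DPU usage) cheap
(
    silver.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-lakehouse/silver/orders/")
)
```

Gold-layer aggregates would follow the same pattern, reading from the Silver path and writing business-level tables.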


Required Skills & Qualifications:

  • 4–5 years of relevant experience in data engineering with a focus on AWS
  • Strong proficiency in SQL, Python, and PySpark
  • Hands-on experience with AWS Glue ETL, Glue DataBrew, and AWS Data Quality
  • Proven expertise in building and managing Data Lakehouse architectures using medallion layering
  • Deep understanding of Parquet file formats, partitioning, and DPU configuration for performance tuning
  • Experience with S3 encryption and data security protocols (see the sketch after this list)
  • Solid grasp of Infrastructure as Code using Terraform
  • Familiarity with DBT for data transformation and modeling (preferred)
  • Strong problem-solving skills and ability to work independently and collaboratively
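
As a rough illustration of the S3 encryption point above, the following boto3 sketch turns on default SSE-KMS encryption for a bucket. The bucket name and KMS alias are placeholder assumptions, and in practice this setting would more likely be codified in Terraform as part of the IaC workflow this role owns.

```python
# Illustrative sketch only -- the bucket name and KMS alias are assumptions.
import boto3

s3 = boto3.client("s3")

# Enforce default SSE-KMS so every new object in the bucket is encrypted at rest
s3.put_bucket_encryption(
    Bucket="example-lakehouse",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-lakehouse-key",
                },
                "BucketKeyEnabled": True,  # cuts KMS request costs on busy buckets
            }
        ]
    },
)
```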


Preferred Qualifications:

  • AWS certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect)
  • Experience with CI/CD pipelines and DevOps practices
  • Exposure to data governance and cataloging tools (e.g., AWS Glue Catalog, Lake Formation); a small catalog-browsing sketch follows below
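
On the cataloging point, here is a short boto3 sketch that lists the tables registered in one AWS Glue Data Catalog database; the database name is a placeholder assumption.

```python
# Illustrative sketch only -- the database name is an assumption.
import boto3

glue = boto3.client("glue")

# Page through every table registered in one Glue Data Catalog database
paginator = glue.get_paginator("get_tables")
for page in paginator.paginate(DatabaseName="silver_orders_db"):
    for table in page["TableList"]:
        location = table.get("StorageDescriptor", {}).get("Location", "n/a")
        print(f"{table['Name']}: {location}")
```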


Please share the following details:

Total Experience:

Relevant Experience in Data Engineering:

Exp in Data Lakehouse:

Exp in AWS:

Exp in AWS Glue ETL:

Exp in Python & PySpark:

Reason for the Job change:

Current Company:

Current Location:

Preferred Location:

CCTC (Current CTC):

ECTC (Expected CTC):

Notice Period:

Are you able to join us within 15–20 days (Yes/No):

If you are serving notice, please mention your last working date:

Marital Status & Native Location:

If you are holding any offer, please mention the CTC & job location:

Available for a virtual interview on weekdays (Yes/No):

