Senior Data Engineer – AWS & Data Lakehouse Specialist

Experience: 7 - 8 years

Salary: 25 - 35 Lacs

Posted: 2 months ago | Platform: GlassDoor

Work Mode: On-site

Job Type: Full Time

Job Description

Dear Candidate,

Greetings of the day!

I am Amutha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/amutha-valli-32611b289/) or reach me by email at amutha.m@techmango.net.

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions that help its business partners achieve their goals.

We are a leading full-scale software and mobile app development company. Techmango is driven by the mantra “Client's Vision is our Mission”.

We stay true to this mantra, aiming to be a technologically advanced and well-loved organization that provides high-quality, cost-efficient services built on long-term client relationships. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy). Website: https://www.techmango.net/

Job Title: Senior Data Engineer – AWS & Data Lakehouse Specialist

Requirement Summary:
We are seeking a highly skilled and motivated Senior Data Engineer with 7-8 years of overall experience and 4-5 years of hands-on experience in building scalable data solutions on AWS. The ideal candidate will be proficient in Python, PySpark, and AWS Glue, with a strong understanding of Data Lakehouse architecture, especially the medallion model. You will play a key role in designing, developing, and optimizing data pipelines, ensuring data quality, and implementing infrastructure as code using Terraform.

Key Responsibilities:

  • Design and implement scalable ETL/ELT pipelines using AWS Glue and PySpark
  • Develop and maintain data workflows using AWS Glue DataBrew and AWS Data Quality services
  • Architect and manage Data Lakehouse solutions following the medallion architecture (Bronze, Silver, Gold layers); a minimal pipeline sketch follows this list
  • Optimize data lake performance (Parquet formats, partitioning strategies, DPU tuning)
  • Implement S3 data encryption and security best practices
  • Automate infrastructure provisioning using Terraform (IaC)
  • Collaborate with data analysts, scientists, and business stakeholders to deliver clean, reliable data
  • Integrate and manage DBT workflows for data transformation and modeling (preferred)
  • Monitor, troubleshoot, and enhance data pipeline reliability and performance
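
The sketch below is a minimal, illustrative example and not part of the role requirements: a Bronze-to-Silver step of a medallion-style AWS Glue job written in PySpark. The S3 paths and the order_id/order_date columns are hypothetical placeholders.

  # Bronze -> Silver step of a medallion-style pipeline (runs inside an AWS Glue job).
  from awsglue.context import GlueContext
  from pyspark.context import SparkContext

  sc = SparkContext()
  glue_context = GlueContext(sc)
  spark = glue_context.spark_session

  # Bronze layer: raw ingested records landed as Parquet (hypothetical path).
  bronze_df = spark.read.parquet("s3://example-lake/bronze/orders/")

  # Silver layer: de-duplicated, validated records.
  silver_df = (
      bronze_df
      .dropDuplicates(["order_id"])        # hypothetical business key
      .filter("order_date IS NOT NULL")    # basic data quality filter
  )

  # Write Parquet partitioned by date, in line with the partitioning strategies noted above.
  (
      silver_df.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-lake/silver/orders/")
  )

A Gold layer would typically follow the same pattern, aggregating Silver tables into business-level marts.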

Required Skills & Qualifications:

  • 4–5 years of relevant experience in data engineering with a focus on AWS
  • Strong proficiency in SQL, Python and PySpark
  • Hands-on experience with AWS Glue ETL, Glue DataBrew, and AWS Data Quality
  • Proven expertise in building and managing Data Lakehouse architectures using medallion layering
  • Deep understanding of Parquet file formats, partitioning, and DPU configuration for performance tuning
  • Experience with S3 encryption and data security protocols (a brief sketch follows this list)
  • Solid grasp of Infrastructure as Code using Terraform
  • Familiarity with DBT for data transformation and modeling (preferred)
  • Strong problem-solving skills and ability to work independently and collaboratively
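
As an illustration of the S3 encryption point above, the following is a brief boto3 sketch, assuming a hypothetical bucket name and KMS key alias, that enables default SSE-KMS encryption on a data lake bucket; in practice such settings would more likely be declared in Terraform as part of the IaC setup.

  # Enable default SSE-KMS encryption on an S3 bucket (hypothetical bucket and key alias).
  import boto3

  s3 = boto3.client("s3")
  s3.put_bucket_encryption(
      Bucket="example-lake-bucket",
      ServerSideEncryptionConfiguration={
          "Rules": [
              {
                  "ApplyServerSideEncryptionByDefault": {
                      "SSEAlgorithm": "aws:kms",
                      "KMSMasterKeyID": "alias/example-lake-key",
                  },
                  "BucketKeyEnabled": True,  # reduces KMS request costs for high-volume lake traffic
              }
          ]
      },
  )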

Preferred Qualifications:

  • AWS certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect)
  • Experience with CI/CD pipelines and DevOps practices
  • Exposure to data governance and cataloging tools (e.g., AWS Glue Catalog, Lake Formation)

Job Type: Full-time

Pay: ₹2,500,000.00 - ₹3,500,000.00 per year

Application Question(s):

  • Overall work experience:
  • How many years of experience do you have in AWS?
  • How many years of experience do you have in SQL, Python and PySpark?
  • How many years of experience do you have in AWS Glue ETL, Glue DataBrew, and AWS Data Quality?
  • How many years of experience do you have in Data Lakehouse?
  • May I know your last working date:
  • Current CTC:
  • Expected CTC:

Work Location: In person
