Posted: 2 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Job Title: Data Engineer

Role: AWS Data Engineer

Key Skills: Data Engineering, Python, AWS services, SQL, ETL, PySpark

Experience: 3 to 8 years

Location: Madurai

Mode: Hybrid


About the Role:

AWS Data Engineer


What You Will Do:

  • Architect, build, and maintain robust and scalable data pipelines using AWS services.

  • Implement performant and reliable ETL/ELT processes that handle large volumes of structured and unstructured data (a minimal illustrative job is sketched after this list).

  • Enforce and monitor data SLAs to ensure freshness, reliability, and availability of datasets across environments.

  • Collaborate with engineering, product, and analytics teams to transform business requirements into robust data models and pipelines.

  • Proactively identify and resolve bottlenecks, data quality issues, and system inefficiencies.

  • Implement schema versioning, data lineage tracking, and database change management practices.

  • Define and enforce best practices for data governance, access control, observability, and compliance.

  • Contribute to CI/CD workflows and infrastructure as code practices using tools like CloudFormation or Terraform.
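
To make the day-to-day concrete, below is a minimal, illustrative PySpark sketch of the kind of S3-to-S3 ETL job described above. All bucket paths, table names, and column names are placeholders invented for this example, not details taken from this posting.

    # Hypothetical PySpark ETL job: reads raw order events from S3, cleans them,
    # and writes a partitioned, analytics-ready dataset back to S3.
    # All paths and column names are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

    # Extract: raw JSON events landed by an upstream ingestion process.
    raw = spark.read.json("s3://example-raw-bucket/orders/2024/")

    # Transform: drop malformed rows, normalise types, derive a partition column.
    clean = (
        raw.filter(F.col("order_id").isNotNull())
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .dropDuplicates(["order_id"])
    )

    # Load: write partitioned Parquet that Athena or Redshift Spectrum can query.
    (
        clean.write.mode("overwrite")
             .partitionBy("order_date")
             .parquet("s3://example-curated-bucket/orders/")
    )

    spark.stop()

A production version of such a job would typically run on AWS Glue or EMR, with the output registered in the Glue Data Catalog so downstream query engines can find it.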

What You Will Bring:

  • 3+ years of experience in data engineering or backend systems development, with a strong focus on cloud-based architectures.

  • Deep expertise in the AWS data ecosystem, especially Redshift, Glue, S3, Athena, Lambda, Step Functions, and CloudWatch.

  • Strong background in SQL performance tuning, schema design, indexing, and partitioning strategies for large datasets.

  • Experience maintaining data freshness SLAs and end-to-end ownership of production pipelines (one way such an SLA can be monitored is sketched after this list).

  • Hands-on experience with Python (or PySpark), T-SQL, and scripting automation for data ingestion and transformation.

  • Solid understanding of relational and dimensional data modeling, normalization, and schema evolution.

  • Experience with source control systems (e.g., Git, Bitbucket) and CI/CD pipelines for data infrastructure.

  • Track record of transforming complex business requirements into reliable and scalable data solutions.

  • Experience with data governance, security, and compliance frameworks (e.g., HIPAA, GDPR) is a plus.

  • Familiarity with monitoring and observability tools (e.g., CloudWatch, Datadog, or Prometheus).

  • Bonus: Exposure to Snowflake or MSSQL in hybrid cloud environments.
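
As an illustration of the data-freshness SLA monitoring mentioned above, here is a hedged sketch of an AWS Lambda handler that measures how stale a curated table is via Athena and publishes the lag to CloudWatch. The database, table, S3 output location, and metric names are all hypothetical.

    # Hypothetical Lambda handler: runs a freshness query in Athena and emits the
    # resulting lag as a CloudWatch metric that an SLA alarm can watch.
    import time
    import boto3

    athena = boto3.client("athena")
    cloudwatch = boto3.client("cloudwatch")

    def lambda_handler(event, context):
        query = athena.start_query_execution(
            QueryString="SELECT max(order_ts) FROM orders",
            QueryExecutionContext={"Database": "curated"},
            ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
        )
        query_id = query["QueryExecutionId"]

        # Poll until the query finishes (simplified; real code would bound retries).
        while True:
            status = athena.get_query_execution(QueryExecutionId=query_id)
            state = status["QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(2)

        if state != "SUCCEEDED":
            raise RuntimeError(f"Freshness query ended in state {state}")

        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        latest_ts = rows[1]["Data"][0]["VarCharValue"]  # row 0 is the header row
        latest_epoch = time.mktime(time.strptime(latest_ts[:19], "%Y-%m-%d %H:%M:%S"))
        lag_seconds = time.time() - latest_epoch

        # Publish the lag so a CloudWatch alarm can fire when the SLA is breached.
        cloudwatch.put_metric_data(
            Namespace="DataPlatform/Freshness",
            MetricData=[{"MetricName": "orders_lag_seconds", "Value": lag_seconds, "Unit": "Seconds"}],
        )
        return {"lag_seconds": lag_seconds}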

Nice to Have:

  • AWS certifications such as AWS Certified Data Analytics, Solutions Architect, or DevOps Engineer.

  • Experience with Apache Airflow, dbt, or other data orchestration tools (see the short DAG sketch after this list).

  • Familiarity with Kafka, Kinesis, or other streaming technologies.

  • Understanding of data mesh or data lakehouse architectures.
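
For candidates less familiar with orchestration tools, the following is a small, assumed Airflow 2.4+ DAG showing how jobs like the ones sketched above might be scheduled and chained. The task callables are placeholders, not part of this posting.

    # Hypothetical Airflow DAG: runs the daily ETL, then the freshness check.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_orders_etl(**_):
        # Placeholder: submit the PySpark job (e.g. via Glue, EMR, or spark-submit).
        pass

    def check_orders_freshness(**_):
        # Placeholder: invoke the freshness-check Lambda or run the query directly.
        pass

    with DAG(
        dag_id="orders_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        etl = PythonOperator(task_id="run_orders_etl", python_callable=run_orders_etl)
        freshness = PythonOperator(
            task_id="check_orders_freshness", python_callable=check_orders_freshness
        )

        etl >> freshness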

