AWS Data Platform Engineer (6+ years experience)

Experience

6+ years


Posted: 1 week ago | Platform: LinkedIn


Work Mode

Remote

Job Type

Contractual

Job Description

Company Description

Lean IT Inc. is a leader in implementing cutting-edge cloud solutions for diverse clients. As an official Salesforce Ridge Partner, we provide expert consultations on optimal use of the Salesforce platform, robust solution design, and seamless API integrations, and we also contribute custom applications to the AppExchange platform. We specialize in Data Visualization, Big Data Implementation, Data Migration, and Data Modernization, transforming data into actionable insights and modernizing infrastructures. Our highly skilled and certified professionals continuously upskill to stay aligned with best practices in the evolving tech landscape. As a socially responsible company, Lean IT Inc. is a proud member of Pledge 1%, contributing to global philanthropy.


Role Description

This is a contract role for an AWS Data Platform Engineer with 6+ years of experience, based in Chennai on a hybrid schedule with some work from home acceptable. The AWS Data Platform Engineer will be responsible for designing, developing, and maintaining data pipelines and platforms, ensuring smooth data migrations, and providing infrastructure support. Day-to-day tasks include troubleshooting, software development, programming, and database management.


Title: Data Platform Engineer

Experience: 6+ years

Location: Remote

Duration: 6 months


Are you passionate about building and supporting modern data platforms in the cloud? We’re looking for a Sr. Data Platform Engineer who thrives in a hybrid role (60% administration, 40% development/support) to help us scale our data and DataOps infrastructure. You’ll work with cutting-edge technologies like Databricks, Apache Spark, Delta Lake, AWS CloudOps, and Cloud Security while supporting mission-critical data pipelines and integrations. If you’re a hands-on engineer with strong Python skills, deep AWS experience, and a knack for solving complex data challenges, we want to hear from you.


Key Responsibilities

• Design, develop, and maintain scalable ETL pipelines and integration frameworks (see the PySpark sketch after this list).

• Administer and optimize Databricks and Apache Spark environments for data engineering workloads.

• Build and manage data workflows using AWS services such as Lambda, Glue, Redshift, SageMaker, and S3.

• Support and troubleshoot DataOps pipelines, ensuring reliability and performance across environments.

• Automate platform operations using Python, PySpark, and infrastructure-as-code tools.

• Collaborate with cross-functional teams to support data ingestion, transformation, and deployment.

• Provide technical leadership and mentorship to junior developers and third-party teams.

• Create and maintain technical documentation and training materials.

• Troubleshoot recurring issues and implement long-term resolutions.
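To give a concrete flavor of the pipeline work above, here is a minimal PySpark sketch of one ETL step: reading raw JSON from S3, cleaning it, and writing a Delta Lake table. The bucket names, columns, and job name are illustrative assumptions rather than details from this posting, and a Databricks or Delta-enabled Spark runtime is assumed.

```python
# Minimal ETL sketch (illustrative only): raw S3 JSON -> curated Delta table.
# Bucket names, paths, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw JSON landed in S3.
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Transform: deduplicate, fix types, and drop invalid rows.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write a Delta table, partitioned for downstream queries.
(
    clean.write.format("delta")
         .mode("overwrite")
         .partitionBy("order_date")
         .save("s3://example-curated-bucket/orders_delta/")
)
```

On Databricks, writing to a managed table with saveAsTable is a common alternative to a raw S3 path.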


Minimum Qualifications

• Bachelor’s or Master’s degree in Computer Science or a related field.

• 5+ years of experience in data engineering or platform administration.

• 3+ years of experience in integration framework development with a strong emphasis on Databricks, AWS, and ETL.


Required Technical Skills

• Strong programming skills in Python and PySpark.

• Expertise in Databricks, Apache Spark, and Delta Lake.

• Proficiency in AWS CloudOps and Cloud Security, including configuration, deployment, and monitoring (see the boto3 sketch after this list).

• Strong SQL skills and hands-on experience with Amazon Redshift.

• Experience with ETL development, data transformation, and orchestration tools.
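On the CloudOps and monitoring side, much of the day-to-day automation in a role like this is boto3 scripting. The sketch below starts an AWS Glue job run and polls it to a terminal state; the job name and region are hypothetical assumptions for illustration.

```python
# Ops-style sketch (illustrative): trigger a Glue job and poll its status
# with boto3. The job name and region are hypothetical placeholders.
import time

import boto3

glue = boto3.client("glue", region_name="us-east-1")  # region is an assumption

# Kick off an existing Glue ETL job by name.
run = glue.start_job_run(JobName="example-orders-etl")
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state.
while True:
    status = glue.get_job_run(JobName="example-orders-etl", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue run {run_id} finished with state {state}")
```

In production this kind of polling loop would typically sit behind an orchestrator (for example Step Functions or a scheduler) rather than a blocking script.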


Nice to Have / Working Knowledge

• Kafka for real-time data streaming and integration (see the consumer sketch after this list).

• Fivetran and dbt for data ingestion and transformation.

• Familiarity with DataOps practices and open-source data tooling.

• Experience with integration tools such as Apache Camel and MuleSoft.

• Understanding of RESTful APIs, message queuing, and event-driven architectures.
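For the Kafka item above, here is a minimal consumer sketch using the kafka-python client; the topic, broker address, and consumer group id are assumptions for illustration, not details from this posting.

```python
# Streaming sketch (illustrative): consume JSON events from a Kafka topic
# with kafka-python. Topic, brokers, and group id are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders-events",                        # assumed topic name
    bootstrap_servers=["localhost:9092"],   # assumed broker address
    group_id="data-platform-demo",          # assumed consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real pipeline this would feed an ingestion or transformation step.
    print(f"partition={message.partition} offset={message.offset} event={event}")
```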
