Data Engineer (Python, PySpark and AWS)

6 years

18 - 21 Lacs

Posted: 2 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Key Skills Required

  • Minimum 6 years of work experience with Python, PySpark, AWS, and SQL
  • Programming & Frameworks: Python, PySpark
  • Databases & Querying: Strong SQL (query optimization, joins, window functions)
  • Big Data & Processing: PySpark for distributed data processing, ETL pipelines
  • Cloud Platform: AWS (S3, Glue, Lambda, EMR, Redshift, Athena)
  • Data Engineering: Data pipeline design, data ingestion, transformation, and integration
  • Other Nice-to-Have: Airflow (or other orchestration tools), CI/CD, Docker
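
For illustration only (not part of the original posting): a minimal PySpark sketch touching several of the skills listed above (S3 ingestion, a window function, and a partitioned write). All bucket, path, and column names are hypothetical placeholders.

```python
# Minimal sketch, assuming a Spark runtime with S3 access already
# configured (e.g. on EMR or Glue). Names below are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Ingest raw order events from S3 (placeholder path)
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Window function: keep only the latest order per customer
w = Window.partitionBy("customer_id").orderBy(F.col("order_ts").desc())
latest = (
    orders
    .withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Write back partitioned by date so downstream queries can prune partitions
latest.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/latest_orders/"
)
```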

Responsibilities

  • Design and develop scalable ETL pipelines using PySpark and SQL
  • Manage data ingestion and transformation on AWS (S3, Glue, EMR, Redshift)
  • Optimize SQL queries for performance on large-scale datasets
  • Collaborate with data scientists and analysts to ensure data availability and quality
  • Implement data engineering best practices (partitioning, schema design, data governance)
  • Monitor, debug, and optimize pipeline performance
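
For illustration only (not part of the original posting): a sketch of the kind of SQL optimization referenced above, combining partition pruning with a broadcast join hint for a small dimension table. Table, path, and column names are hypothetical placeholders.

```python
# Minimal sketch, assuming curated Parquet tables already exist in S3.
# All names below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-optimization-sketch").getOrCreate()

# Register the tables for Spark SQL (placeholder paths)
spark.read.parquet("s3://example-bucket/curated/fact_sales/") \
    .createOrReplaceTempView("fact_sales")
spark.read.parquet("s3://example-bucket/curated/dim_store/") \
    .createOrReplaceTempView("dim_store")

# Filter on the partition column (sale_date) so Spark prunes partitions,
# and hint a broadcast join because dim_store is small.
daily_revenue = spark.sql("""
    SELECT /*+ BROADCAST(s) */
           s.region,
           f.sale_date,
           SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_store s ON f.store_id = s.store_id
    WHERE f.sale_date = DATE '2024-01-01'
    GROUP BY s.region, f.sale_date
""")

daily_revenue.show()
```
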
Skills: SQL, Python, PySpark, Data Engineering, AWS
