
PySpark Engineer

3 - 4 years

0 Lacs

Posted: 2 weeks ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Requirements

Must Have:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 3 to 4 years of experience with PySpark for data processing and manipulation of large-scale datasets.
• Solid understanding of Spark architecture, including RDDs, DataFrames, and Datasets.
• Strong programming experience in Python, including libraries such as pandas, numpy, and matplotlib.
• Experience with Hadoop, Hive, and NoSQL databases (e.g., Cassandra, MongoDB).
• Working knowledge of cloud computing services (e.g., AWS, Azure, or Google Cloud).
• Familiarity with batch and stream processing (using Kafka, Flink, Spark Streaming).
• Strong problem-solving skills and attention to detail.
• Excellent communication and teamwork skills.

Good to have skills:
• Experience with Apache Airflow or other orchestration tools for managing workflows.
• Familiarity with Docker or Kubernetes for containerized data environments.
• Experience implementing and managing Continuous Integration (CI) and Continuous Delivery (CD) pipelines, with a focus on automating, testing, and deploying code in a fast-paced development environment.
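For illustration, a minimal sketch of the kind of PySpark DataFrame processing the requirements describe; the input path, column names, and aggregation are hypothetical and not part of the posting.

# Illustrative sketch only: a small PySpark DataFrame job of the kind the role
# describes. The input path, column names, and aggregation logic are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read a CSV dataset into a DataFrame (schema inference kept simple for brevity).
orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

# Basic cleaning and aggregation: drop rows with null keys, then compute
# total amount and order count per customer.
revenue_by_customer = (
    orders
    .dropna(subset=["customer_id", "amount"])
    .groupBy("customer_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

revenue_by_customer.show(10)
spark.stop()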

