Spark Engineer

Experience: 1 - 6 years

Salary: 2 - 6 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

We are seeking an entry-level Spark Engineer with at least 1 year of experience working with Apache Spark to join our data engineering team. In this role, you will assist in developing, optimizing, and supporting Spark-based applications and data pipelines. You will work alongside senior engineers and data scientists to ensure the effective use of Spark for large-scale data processing and analytics. This is a great opportunity to enhance your skills while supporting Spark jobs in a dynamic, fast-paced environment.
Please read the job criteria below and drop us an email at joinus@tudip.com, or create an account at our Recruitment Portal to get started.
Roles & Responsibilities
  • Assist in monitoring, troubleshooting, and supporting Spark-based data processing jobs, ensuring optimal performance and reliability.
  • Collaborate with senior engineers to design, build, and optimize Spark jobs and data pipelines for large-scale data processing and analytics tasks.
  • Identify and resolve performance bottlenecks, errors, and failures in Spark applications by analyzing logs and job execution data.
  • Monitor the health and performance of Spark clusters, recommend improvements, and implement fixes as needed.
  • Work with data engineers and data scientists to understand data processing requirements and provide technical support for data-driven applications.
  • Assist in documenting Spark job configurations, processes, and troubleshooting steps for knowledge sharing and team collaboration.
  • Keep up with new developments in Apache Spark and related big data technologies, and apply best practices for performance, scalability, and data integrity.
Job Requirements/Qualifications
  • At least 1 year of experience working with Apache Spark in a production or development environment.
  • Educational Qualification: BE/BTech (CS/IT) or MCA from a reputed institute.
  • Hands-on experience with Spark SQL, DataFrames, and RDDs.
  • Knowledge of big data processing concepts and distributed computing.
  • Familiarity with Spark cluster management tools like YARN, Mesos, or Kubernetes.
  • Experience with Java, Scala, or Python for Spark job development.
  • Understanding of data formats such as Parquet, Avro, ORC, and JSON.
  • Basic knowledge of ETL processes, data ingestion, and data transformation pipelines.
  • Strong problem-solving skills and ability to troubleshoot issues in Spark jobs.
  • Familiarity with Git or other version control tools for managing code.
