
Spark Applications Jobs (1 result)

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

Experience: 2 - 6 years

Salary: 3 - 6 Lacs

Location: Maharashtra

Work mode: Work from Office

Source: Naukri

Responsibilities:
- Design, develop, and optimize scalable, high-performance Spark applications using Scala.
- Work on mission-critical projects, ensuring high availability, reliability, and performance.
- Analyze and optimize Spark jobs for efficient data processing and resource utilization.
- Collaborate with cross-functional teams to deliver robust, production-ready solutions.
- Troubleshoot and resolve complex issues in Spark applications and data pipelines.
- Integrate Spark applications with Kafka for real-time data streaming and MongoDB for data storage and retrieval.
- Follow best practices in coding, testing, and deployment to ensure high-quality deliverables.
- Mentor junior team members and provide technical leadership.

Mandatory Skills and Qualifications:
- 7+ years of hands-on experience in Scala programming and Apache Spark.
- Strong expertise in Spark architecture, including RDDs, DataFrames, and Spark SQL.
- Proven experience in performance tuning and optimization of Spark applications.
- Hands-on experience with Spark Streaming for real-time data processing.
- Solid understanding of distributed computing and big data processing concepts.
- Proficiency in Linux and the ability to work in a Linux environment.
- Strong knowledge of data structures and algorithms, with a focus on space and time complexity analysis.
- Ability to work independently and deliver results in a fast-paced, high-pressure environment.
- Excellent problem-solving, debugging, and analytical skills.

Good-to-Have Skills:
- Experience with Apache Kafka for real-time data streaming.
- Knowledge of MongoDB or other NoSQL databases.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and containerization (e.g., Docker, Kubernetes).
- Understanding of DevOps practices and CI/CD pipelines.

Interview Focus Areas:
- Coding exercise in Scala: a hands-on assessment of problem-solving and coding skills.
- Spark integration with other technologies: practical understanding of how Spark integrates with tools such as Kafka and MongoDB.
- Spark Streaming: demonstrated experience with real-time data processing.
- Best practices and optimization in Spark: in-depth knowledge of Spark job optimization, resource management, and performance tuning.
- Data structures and space/time complexity analysis: strong grasp of data structures and algorithms, with a focus on optimizing space and time complexity.

Shift Requirements:
- Flexible shift hours, with the shift ending by midday US time.
- Willingness to adapt to dynamic project needs and timelines.
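The Kafka and MongoDB integration work described above could be sketched roughly as follows. This is a minimal illustration, not code from the posting: the topic, checkpoint path, database, and collection names are invented for the example, the MongoDB Spark connector (v10+) is assumed to be on the classpath, and its option keys should be checked against the connector version actually in use.

```scala
import org.apache.spark.sql.SparkSession

object KafkaToMongoSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-mongo-sketch")
      .getOrCreate()

    // Read a stream of raw events from Kafka (topic name is a placeholder).
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")

    // Write each micro-batch to MongoDB via the Spark connector
    // (option names follow connector v10; verify for your version).
    val query = events.writeStream
      .format("mongodb")
      .option("checkpointLocation", "/tmp/checkpoints/events")
      .option("spark.mongodb.connection.uri", "mongodb://localhost:27017")
      .option("spark.mongodb.database", "jobsdb")
      .option("spark.mongodb.collection", "events")
      .start()

    query.awaitTermination()
  }
}
```

In a real deployment the casts would typically be replaced by `from_json` with an explicit schema, and the checkpoint location would live on durable storage (e.g., HDFS or S3) so the stream can recover after failures.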
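The interview's emphasis on data structures and space/time complexity is the kind of thing the Scala coding exercise tends to probe. As a self-contained illustration (the classic two-sum problem, chosen here for the example rather than taken from the posting), a one-pass hash map brings the naive O(n²) nested-loop scan down to O(n) time at the cost of O(n) extra space:

```scala
object TwoSum {
  // Return the indices (i, j), i < j, of two elements summing to target,
  // or None if no such pair exists.
  // One pass with a value -> index map: O(n) time, O(n) extra space,
  // versus the naive nested-loop scan at O(n^2) time, O(1) space.
  def twoSum(nums: Array[Int], target: Int): Option[(Int, Int)] = {
    val seen = scala.collection.mutable.Map.empty[Int, Int]
    var i = 0
    var result: Option[(Int, Int)] = None
    while (i < nums.length && result.isEmpty) {
      val complement = target - nums(i)
      seen.get(complement) match {
        case Some(j) => result = Some((j, i)) // partner seen earlier
        case None    => seen(nums(i)) = i     // remember this value's index
      }
      i += 1
    }
    result
  }
}
```

Being able to state the time/space trade-off of each version, as in the comments above, is usually as important in such exercises as the code itself.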

Posted 3 months ago

