Posted: 1 week ago
On-site | Full Time
Job Title: Spark & Delta Lake Developer
Job Type: Full-Time
Location: Pune, Maharashtra, India
Experience: 5–8 years
Industry: Banking & Finance

Job Summary:
We're hiring an experienced Spark & Delta Lake Developer to build high-performance data pipelines and cloud-native solutions for a global banking project. If you have strong hands-on experience with Apache Spark, Delta Lake, and cloud-based lakehouse architecture, this role is for you.

Key Responsibilities:
- Develop and optimize Apache Spark pipelines for batch and streaming data (a minimal sketch of this kind of pipeline appears after the posting details below)
- Work with Delta Lake to enable scalable and reliable data workflows
- Design and maintain cloud-based data lakehouse architectures
- Collaborate with data architects and DevOps to deploy enterprise-grade data solutions
- Implement robust data ingestion, transformation, and governance practices
- Participate in code reviews and CI/CD processes

Required Skills:
- 5–8 years in big data / distributed systems
- Strong knowledge of Apache Spark (RDD, DataFrame, SQL, Streaming)
- Hands-on experience with Delta Lake architecture
- Programming with PySpark or Scala
- Experience with cloud platforms (AWS, Azure, or GCP)
- Familiarity with data security, governance, and performance tuning

Pay: ₹1,800,000.00 - ₹2,200,000.00 per year

Application Question(s):
- How many years of hands-on experience do you have with Apache Spark (RDD, DataFrame, SQL, Streaming)?
- Have you worked on Delta Lake architecture in a production environment?
- Which programming language have you used with Spark?
- Which cloud platform(s) have you used for big data or data lakehouse projects?
- Do you have experience implementing data governance or security practices in large-scale data pipelines?

Work Location: On the road
Application Deadline: 13/06/2025
Expected Start Date: 16/06/2025
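The responsibilities above center on batch/streaming Spark pipelines that land data in Delta Lake. The snippet below is a minimal PySpark sketch of that kind of workload, not part of the posting itself: the bucket paths, column names (transaction_id, amount), and schema are hypothetical, and it assumes a cluster where the delta-spark package is available.

# Minimal sketch: batch-ingest raw CSV transactions and write them to a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Spark session with the Delta Lake extensions enabled (requires delta-spark).
spark = (
    SparkSession.builder.appName("transactions-delta-pipeline")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Hypothetical source path: raw CSV files with a header row.
raw = (
    spark.read.option("header", True)
    .csv("s3://example-bucket/raw/transactions/")
)

# Basic transformation: type casting, de-duplication, and a derived partition column.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["transaction_id"])
    .withColumn("ingest_date", F.current_date())
)

# Append to a Delta table partitioned by ingest date for reliable downstream reads.
(
    cleaned.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save("s3://example-bucket/delta/transactions/")
)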
Talent Crafters
Pune District, Maharashtra