Spark & Delta Lake Professional

Experience: 3 - 5 years

Salary: 5 - 8 Lacs

Posted: 1 week ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

- Understanding of Spark core concepts: RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming.
- Experience with Spark optimization techniques.
- Deep knowledge of Delta Lake features such as time travel, schema evolution, and data partitioning.
- Ability to design and implement data pipelines using Spark with Delta Lake as the data storage layer (see the sketch after this list).
- Proficiency in Python/Scala/Java for Spark development and integration with ETL processes.
- Knowledge of data ingestion techniques from various sources (flat files, CSV, APIs, databases).
- Understanding of data quality best practices and data validation techniques.

Other Skills:

- Understanding of data warehouse concepts and data modelling techniques.
- Expertise in Git for code management.
- Familiarity with CI/CD pipelines and containerization technologies.
- Nice to have: experience with data integration tools such as DataStage, Prophecy, Informatica, or Ab Initio.
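For context on the pipeline requirement above, here is a minimal PySpark sketch of the kind of Spark + Delta Lake flow described: CSV ingestion, a simple validation filter, a partitioned Delta write with schema evolution, and a time-travel read. It assumes the delta-spark package is installed; the table path, source file, and column names are illustrative placeholders, not part of the posting.

```python
# Minimal sketch only; paths and columns are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from delta import configure_spark_with_delta_pip

builder = (
    SparkSession.builder.appName("delta-pipeline-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Ingest a CSV source and apply a basic data-validation filter.
orders = (
    spark.read.option("header", "true").csv("/data/raw/orders.csv")
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("order_id").isNotNull())
)

# Write a partitioned Delta table with schema evolution enabled.
(orders.write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")   # schema evolution
    .partitionBy("order_date")       # data partitioning
    .save("/data/delta/orders"))

# Time travel: read the table as of an earlier version.
previous = (spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("/data/delta/orders"))
```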

