Posted: 6 hours ago
Work from Office
Full Time
Required Skills:
- Understanding of Spark core concepts: RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming.
- Experience with Spark optimization techniques.
- Deep knowledge of Delta Lake features such as time travel, schema evolution, and data partitioning.
- Ability to design and implement data pipelines using Spark, with Delta Lake as the data storage layer.
- Proficiency in Python/Scala/Java for Spark development and integration with ETL processes.
- Knowledge of data ingestion techniques from various sources (flat files, CSV, APIs, databases).
- Understanding of data quality best practices and data validation techniques.

Other Skills:
- Understanding of data warehouse concepts and data modelling techniques.
- Expertise in Git for code management.
- Familiarity with CI/CD pipelines and containerization technologies.
- Nice to have: experience with data integration tools such as DataStage, Prophecy, Informatica, or Ab Initio.
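The data-quality and ingestion requirements above can be illustrated with a minimal, framework-free Python sketch: rows are read from a CSV source and partitioned into valid and rejected sets, a common validation pattern before loading into a Delta Lake table. The column names and rules here are hypothetical examples, not taken from the posting.

```python
import csv
import io

# Hypothetical sample feed; in the role described above this would come
# from a flat file, API, or database source.
RAW_CSV = """id,amount,country
1,19.99,IN
2,,US
x,5.00,IN
"""

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one ingested row."""
    errors = []
    if not row["id"].isdigit():
        errors.append("id must be an integer")
    try:
        float(row["amount"])
    except ValueError:
        errors.append("amount must be numeric")
    if not row["country"]:
        errors.append("country is required")
    return errors

def ingest(text: str):
    """Partition rows into (valid, rejected), keeping errors for auditing."""
    valid, rejected = [], []
    for row in csv.DictReader(io.StringIO(text)):
        errs = validate_row(row)
        (rejected if errs else valid).append((row, errs))
    return valid, rejected

valid, rejected = ingest(RAW_CSV)
print(len(valid), len(rejected))  # 1 valid row, 2 rejected
```

In a Spark pipeline the same split is typically expressed as two filtered DataFrames, with the rejected partition written to a quarantine location for review.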
IDESLABS PRIVATE LIMITED
6.0 - 9.0 Lacs P.A.