Posted: 1 day ago
Work from Office
Full Time
This is what you'll do:
- Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day.
- Brainstorm and create new platforms that make data available to cluster users in all shapes and forms, with low latency and horizontal scalability.
- Diagnose and fix problems across the entire technical stack.
- Design and develop a real-time events pipeline for data ingestion and real-time dashboarding (a minimal sketch follows this list).
- Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake.
- Design and implement new components using various emerging technologies in the Hadoop ecosystem, and drive successful execution of various projects.

Skills that will help you succeed in this role:
- 4+ years of strong hands-on experience with Spark, preferably PySpark.
- Excellent programming and debugging skills in Python.
- Experience with a scripting language such as Python or Bash.
- Good experience with databases such as SQL and MongoDB.
- Good to have: experience with AWS and cloud technologies such as S3.
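For context, here is a minimal PySpark structured-streaming sketch of the kind of real-time ingestion work described above. The Kafka topic, event schema, and S3 paths are illustrative assumptions, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (
    SparkSession.builder
    .appName("events-ingestion-sketch")
    .getOrCreate()
)

# Hypothetical event schema for the incoming JSON payloads.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw event stream (assumes a Kafka source and an "events" topic).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Parse the JSON payload and keep only the typed columns.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Land micro-batches in the data lake as Parquet, partitioned for later queries.
# Bucket and checkpoint paths below are placeholders.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/lake/events/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .partitionBy("event_type")
    .outputMode("append")
    .start()
)

query.awaitTermination()

A batch job feeding the same lake would follow the same pattern with spark.read and DataFrame.write in place of the streaming reader and writer.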
Aviso India