Posted: 2 months ago
Work Mode: Hybrid
Employment Type: Full Time
Experience: 5 to 10 Years
Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift).
Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation.
Write complex SQL queries and optimize them for performance and scalability.
Monitor, troubleshoot, and improve data pipelines for reliability and performance.
Focus on ETL automation with Python and PySpark: design, build, and maintain efficient data pipelines, ensuring data quality and integrity across applications.
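The responsibilities above centre on Python ETL steps that enforce data quality. Purely as an illustrative sketch (not part of the posting; the field names and inline CSV are invented, and a real pipeline would read from S3 via Glue or a Lambda trigger), a minimal extract-transform-validate step might look like:

```python
import csv
import io

def transform(rows):
    """Clean raw records: strip whitespace, cast amount to float,
    and drop rows that fail basic data-quality checks."""
    cleaned = []
    for row in rows:
        name = row.get("name", "").strip()
        try:
            amount = float(row.get("amount", "").strip())
        except ValueError:
            continue  # quality check: skip rows with a non-numeric amount
        if not name:
            continue  # quality check: skip rows missing a name
        cleaned.append({"name": name, "amount": amount})
    return cleaned

# Extract: inline CSV stands in for an S3 object or Glue source.
raw = "name,amount\n Alice ,10.5\nBob,oops\n,3.0\n"
rows = list(csv.DictReader(io.StringIO(raw)))
result = transform(rows)  # only the valid Alice row survives
```

In a PySpark job the same cleaning logic would typically become DataFrame filters and casts so it scales across a cluster, but the validate-then-load shape is the same.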
Globant