On-site
Full Time
The role will require deep knowledge of data engineering techniques to create data
pipelines and build data assets.
• At least 4 years of strong hands-on programming experience with PySpark, Python, and Boto3, including Python frameworks and libraries, following Python best practices.
• Strong experience in code optimisation using Spark SQL and PySpark (see the optimisation sketch after this list).
• Understanding of code versioning, Git repositories, and JFrog Artifactory.
• AWS architecture knowledge, especially S3, EC2, Lambda, Redshift, and CloudFormation, with the ability to explain the benefits of each.
• Refactoring of legacy codebases: clean, modernize, and improve readability and maintainability.
• Unit tests/TDD: write tests before the code, ensure functionality, and catch bugs early (see the test sketch after this list).
• Fixing difficult bugs: debug complex code, isolate issues, and resolve performance, concurrency, or logic flaws.