Posted: 1 day ago
Platform: Remote | Full Time
Responsibilities:
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks.
- Design, implement, and manage data warehouse solutions, schema evolution, and data versioning.
- Collaborate with cross-functional teams to deliver high-quality data solutions.
Required Skills & Experience:
- 4+ years of hands-on experience in Databricks, Python, Spark (PySpark), DBT, and AWS data services.
- Strong experience with SQL and large-scale datasets.
- Hands-on exposure to multi-tenant environments (AWS/Azure/GCP).
- Knowledge of data modeling, data warehouse design, and best practices.
- Good understanding of workflow orchestration tools like Airflow.
Stanra Tech Solutions Pvt Ltd
2.0 - 6.0 Lacs P.A.