Remote | Full Time
Key Responsibilities:
• Design, develop, and maintain ETL pipelines using Azure Databricks notebooks and workflows.
• Perform big data processing and analytics using Spark on the Databricks platform.
• Write optimized and efficient code using PySpark, Spark SQL, and Python.
• Implement and enhance data transformation and integration pipelines.
• Manage secrets and credentials securely using Azure Key Vault and Databricks secret scopes.
• Write and debug complex SQL queries (preferably PL/SQL and Spark SQL) for data retrieval and analysis.
• Troubleshoot and resolve issues related to Python, PySpark, and SQL.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions.
• Use Git for version control and manage CI/CD pipelines for automated deployments.
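The extract-transform-load flow these responsibilities center on can be sketched in a few lines. This is a minimal plain-Python illustration of the pattern (the function and field names are made up for the example; in the role itself these steps would run as PySpark jobs on Databricks):

```python
def extract(rows):
    """Extract: yield raw records (stands in for a spark.read call)."""
    yield from rows

def transform(records):
    """Transform: normalize fields and drop incomplete rows."""
    for rec in records:
        if rec.get("amount") is None:
            continue  # skip records with missing values
        yield {"id": rec["id"], "amount": round(float(rec["amount"]), 2)}

def load(records):
    """Load: materialize into the target store (here, just a list)."""
    return list(records)

raw = [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": None}]
result = load(transform(extract(raw)))
# result == [{"id": 1, "amount": 10.5}]
```

The same three stages map onto a Databricks workflow, with each stage typically a notebook task reading from and writing to managed tables.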
Required Skills:
• Strong experience with Azure cloud services and cloud-native data engineering.
• Proficiency in Databricks, Spark, PySpark, and Spark SQL.
• Solid understanding of SQL variants, especially PL/SQL.
• Experience with Git and CI/CD tools and practices.
• Excellent problem-solving, communication, and collaboration skills.
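As an illustration of the kind of "complex SQL query" the requirements refer to, here is a windowed aggregation run against an in-memory SQLite database (the table and column names are invented for the example; on the job this would be Spark SQL or PL/SQL against real warehouse tables):

```python
import sqlite3

# Build a tiny in-memory table to query against.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales(region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 10), ('north', 30), ('south', 5);
""")

# Aggregate per region, then rank regions by total with a window function.
rows = conn.execute("""
    SELECT region,
           SUM(amount) AS total,
           RANK() OVER (ORDER BY SUM(amount) DESC) AS rnk
    FROM sales
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
# rows == [('north', 40.0, 1), ('south', 5.0, 2)]
```

Window functions like `RANK() OVER (...)` are supported in both Spark SQL and Oracle PL/SQL with essentially the same syntax, which is why they transfer well across the SQL variants listed above.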
PI Square Technologies India