Posted: 3 weeks ago
Work from Office
Full Time
Responsibilities:
- Design, develop, and deploy data pipelines using Databricks, covering data ingestion, transformation, and loading (ETL).
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python (see the sketch after this list).
- Work with Delta Lake and other advanced Databricks features.
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Optimize data pipelines for performance and cost-effectiveness.
- Integrate with various data sources, including but not limited to databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Work with Parquet files for data storage and processing.
- Integrate data from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues effectively.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to best practices within the team.
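For illustration, a minimal sketch of the kind of pipeline the role describes, written as a Databricks Python notebook cell. All storage paths, the storage account, and the table names are hypothetical placeholders, and `spark` is the SparkSession that Databricks notebooks provide automatically.

    # Minimal ETL sketch: raw Parquet in ADLS Gen2 -> cleaned Delta table.
    # Paths, storage account, and table names are hypothetical placeholders.
    from pyspark.sql import functions as F

    # Ingest: read raw Parquet files from Azure Data Lake Storage.
    raw = spark.read.parquet("abfss://raw@examplestorage.dfs.core.windows.net/sales/")

    # Transform: deduplicate and apply a basic data-quality filter.
    cleaned = (
        raw.dropDuplicates(["order_id"])             # drop duplicate records
           .filter(F.col("amount").isNotNull())      # reject rows failing a null check
           .withColumn("ingest_date", F.current_date())
    )

    # Load: write a Delta table registered in Unity Catalog (catalog.schema.table).
    (cleaned.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("main.sales.orders_cleaned"))

Unity Catalog access control on the resulting table is typically a one-line SQL grant, e.g. spark.sql("GRANT SELECT ON TABLE main.sales.orders_cleaned TO `analysts`") for a hypothetical analysts group.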
D-TechWorks Pvt Ltd