Posted: 3 weeks ago
Work from Office
Full Time
Responsibilities:
- Design, develop, and deploy data pipelines using Databricks, including data ingestion, transformation, and loading (ETL) processes.
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python.
- Work with Delta Lake and other advanced Databricks features.
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Optimize data pipelines for performance and cost-effectiveness.
- Integrate with various data sources, including but not limited to databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Work with Parquet files for data storage and processing.
- Integrate data from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues effectively.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to best practices within the team.
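As an illustrative sketch of the "data quality checks and validation" duty above (the field names and rules are hypothetical examples, not taken from this posting), a minimal pre-load check in plain Python might look like:

```python
# Minimal sketch of a data-quality gate of the kind a pipeline might run
# before loading. Field names ("id", "amount") and the rules themselves
# are hypothetical, chosen only for illustration.

def validate_records(records, required_fields=("id", "amount")):
    """Split records into (valid, invalid) using simple rules:
    required fields must be present and non-null, and 'amount'
    must be a non-negative number."""
    valid, invalid = [], []
    for rec in records:
        ok = all(rec.get(f) is not None for f in required_fields)
        if ok and not isinstance(rec["amount"], (int, float)):
            ok = False  # amount must be numeric
        if ok and rec["amount"] < 0:
            ok = False  # amount must be non-negative
        (valid if ok else invalid).append(rec)
    return valid, invalid

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},  # missing amount -> invalid
    {"id": 3, "amount": -4},    # negative amount -> invalid
]
good, bad = validate_records(rows)
```

In a Databricks pipeline the same idea would typically be expressed as DataFrame filters or Delta Lake constraints rather than a Python loop; the split-and-quarantine pattern shown here is the common shape either way.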
D-TechWorks Pvt Ltd