Posted: 3 months ago
Work from Office
Full Time
Responsibilities:
- Design, develop, and deploy data pipelines on Databricks, covering data ingestion, transformation, and loading (ETL); a brief illustrative sketch follows this list.
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks in Python.
- Work with Delta Lake and other advanced Databricks features.
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Optimize data pipelines for performance and cost-effectiveness.
- Integrate with a variety of data sources, including databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Work with Parquet files for data storage and processing.
- Integrate data from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to team best practices.
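To make the stack concrete, here is a minimal, illustrative PySpark sketch of the kind of pipeline this role involves: ingesting Parquet from ADLS, applying simple transformations and a data quality check, loading to a Delta table, and granting access through Unity Catalog. The storage account, paths, column names, catalog/schema/table names, and the analyst group are all assumptions made for illustration, not details taken from this posting.

```python
# A minimal sketch for a Databricks Python notebook; every name below
# (storage account, columns, table, group) is a hypothetical placeholder.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Ingest: read raw Parquet files landed in ADLS Gen2
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"
raw_df = spark.read.parquet(raw_path)

# Transform: basic cleansing and type normalization
clean_df = (
    raw_df
    .dropDuplicates(["order_id"])                      # assumed business key
    .withColumn("order_date", F.to_date("order_ts"))   # assumed timestamp column
    .filter(F.col("amount") > 0)
)

# Data quality check: fail fast if key integrity is violated
null_keys = clean_df.filter(F.col("order_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"Data quality check failed: {null_keys} rows with a null order_id")

# Load: write to a Delta table registered in Unity Catalog (three-level name)
(
    clean_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("main.sales.orders_clean")
)

# Governance: grant read access to an analyst group through Unity Catalog
spark.sql("GRANT SELECT ON TABLE main.sales.orders_clean TO `data-analysts`")
```

In practice a pipeline like this would usually be parameterized and orchestrated (for example, triggered from Azure Data Factory) and would use incremental or merge loads rather than a full overwrite.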
D-TechWorks Pvt Ltd