Posted: 1 day ago
Platform: Remote
Full Time
Required Skills:
- Azure Synapse
- Microsoft Fabric
- Azure Data Factory (ADF)
- Azure Storage
- PySpark
- SQL
- Azure Key Vault
- Excellent communication skills, as this is a client-facing role and the L2 interview round will be with the client.

Responsibilities:
- Design and implement scalable data pipelines using Microsoft Fabric, including Dataflows Gen2, Lakehouse, Notebooks and SQL endpoints.
- Develop ETL/ELT solutions using PySpark, T-SQL and Spark Notebooks within Fabric and Azure Synapse (a brief sketch of this kind of pipeline follows this list).
- Manage and optimize data storage and compute in OneLake, supporting Lakehouse and Warehouse use cases.
- Implement and manage Azure Key Vault for secure handling of secrets, credentials and connection strings.
- Configure and manage CI/CD pipelines for data engineering projects using Azure DevOps, including automated deployment of Fabric assets.
- Integrate data from diverse sources including SQL Server, Azure Blob, REST APIs and on-prem systems.
- Collaborate closely with business teams and Power BI developers to ensure data models support reporting and self-service needs.
- Monitor and troubleshoot data pipeline performance, data quality and failure recovery.
- Contribute to architecture design, governance processes and performance tuning.
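To illustrate the ETL/ELT work described above, here is a minimal PySpark sketch, not a prescribed implementation: it assumes a Synapse or Fabric notebook runtime where mssparkutils is available, and the Key Vault name, secret name, storage account, paths and table name are all hypothetical placeholders.

# Minimal sketch of a notebook-style PySpark ETL step (illustrative only).
# Assumes a Synapse/Fabric Spark runtime; all resource names below are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from notebookutils import mssparkutils  # Synapse/Fabric notebook utilities

spark = SparkSession.builder.getOrCreate()

# Retrieve a storage credential from Azure Key Vault (placeholder names).
storage_key = mssparkutils.credentials.getSecret("my-key-vault", "storage-account-key")

# Authenticate Spark against the ADLS Gen2 / Blob storage account with that key.
spark.conf.set("fs.azure.account.key.mystorageacct.dfs.core.windows.net", storage_key)

# Extract: read raw CSV files landed in the storage account.
raw = (
    spark.read.option("header", "true")
    .csv("abfss://raw@mystorageacct.dfs.core.windows.net/sales/")
)

# Transform: deduplicate, type the columns, and drop invalid rows.
cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)

# Load: write a Delta table into the Lakehouse for downstream Power BI models.
cleaned.write.mode("overwrite").format("delta").saveAsTable("sales_cleaned")

In practice a step like this would be parameterized, orchestrated from Data Factory or a Fabric pipeline, and deployed through the Azure DevOps CI/CD process mentioned in the responsibilities.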
Ethiraj Associates