Work from Office
Full Time
Roles & Responsibilities:
- Design and develop end-to-end data pipelines using PySpark, Python, SQL, and Kafka, leveraging Microsoft Fabric's capabilities.

Requirements:
- Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse.
- Strong expertise in PySpark and Python for large-scale data processing and transformation.
- Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.).
- Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
- Understanding of Azure infrastructure setup (networking, security, and access management) is good to have.
- Healthcare domain knowledge is a plus but not mandatory.
Atos
Pune
12.0 - 16.0 Lacs P.A.