Posted: 12 hours ago | On-site | Contractual
Our client is a global technology company headquartered in Santa Clara, California. It focuses on helping organisations harness the power of data to drive digital transformation, enhance operational efficiency, and achieve sustainability. The company draws on over 100 years of experience in operational technology (OT) and more than 60 years in IT to unlock the power of data from your business, your people and your machines. It helps enterprises store, enrich, activate and monetise their data to improve their customers’ experiences, develop new revenue streams and lower their business costs. Over 80% of the Fortune 100 trust our client for data solutions.
The company’s consolidated revenues for fiscal 2024 (ended March 31, 2024) were approximately USD 57.5 billion, and it has approximately 296,000 employees worldwide. It delivers digital solutions utilising Lumada in five sectors, namely Mobility, Smart Life, Industry, Energy and IT, to increase its customers’ social, environmental and economic value.
Responsibilities:
• Design and implement scalable data pipelines using Azure Databricks (PySpark/Scala).
• Develop and deploy Azure Function Apps for event-driven data processing and automation.
• Integrate Databricks with other Azure services such as Data Lake Storage, Synapse, Event Hubs, and Key Vault.
• Optimize Spark jobs for performance and cost-efficiency.
• Implement CI/CD pipelines using Azure DevOps for Databricks notebooks and Function Apps.
• Monitor and troubleshoot production workloads, ensuring high availability and reliability.
• Collaborate with data scientists, analysts, and business stakeholders to deliver data solutions.
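As a hedged illustration of the event-driven processing mentioned above: the sketch below shows, in plain standard-library Python, the kind of transformation an Azure Function App triggered by Event Hubs might apply to a telemetry event before it lands in the data lake. The event shape, field names, and the `process_event` helper are illustrative assumptions, not details from this posting.

```python
import json
from datetime import datetime, timezone

def process_event(raw: bytes) -> dict:
    """Parse a JSON telemetry event and enrich it with a processing timestamp.

    In a real Azure Function App this logic would sit inside the Event Hubs
    trigger handler; here it is a plain function so the transformation can
    be developed and tested in isolation.
    """
    event = json.loads(raw)
    return {
        "device_id": event["deviceId"],
        "reading": float(event["value"]),
        # Normalise the unit field; default when the producer omits it.
        "unit": event.get("unit", "celsius"),
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    sample = json.dumps({"deviceId": "sensor-42", "value": "21.5"}).encode()
    print(process_event(sample))
```

Keeping the transformation out of the trigger binding like this makes the business logic unit-testable without an Event Hubs connection, which also simplifies the CI/CD pipelines described below.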
Requirements:
• Strong experience with Azure Databricks and Apache Spark (PySpark or Scala).
• Proficiency in developing Azure Function Apps using C#, Python, or JavaScript.
• Solid understanding of Azure Data Lake, Event Hubs, Blob Storage, and Key Vault.
• Experience with CI/CD pipelines, Git, and Infrastructure as Code (ARM/Bicep/Terraform).
• Familiarity with data modeling, ETL/ELT processes, and data governance.
• Strong problem-solving skills and ability to work in an agile environment.
• Microsoft Certified: Azure Data Engineer Associate or similar certifications.
• Experience with Delta Lake, MLflow, or Unity Catalog.
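To illustrate the ETL/ELT familiarity listed above, here is a minimal standard-library sketch of an extract-transform-load step. In the actual role this would be a Databricks/PySpark job writing to Azure Data Lake or Delta Lake; the in-memory sqlite3 target, table name, and column names are placeholder assumptions for the sake of a self-contained example.

```python
import sqlite3

def run_etl(raw_rows, conn):
    """Tiny ETL: extract raw rows, transform (clean and normalise), load into a table."""
    # Transform: drop rows without an id, trim and uppercase the country code.
    cleaned = [
        (r["id"], r["country"].strip().upper(), r["amount"])
        for r in raw_rows
        if r.get("id") is not None
    ]
    # Load: idempotent create plus upsert-style insert.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales "
        "(id INTEGER PRIMARY KEY, country TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    rows = [
        {"id": 1, "country": " in ", "amount": 10.0},
        {"id": None, "country": "us", "amount": 5.0},  # dropped: no id
    ]
    print(run_etl(rows, conn))
```

The same extract / transform / load split carries over directly to PySpark, where the transform would be DataFrame operations and the load a write to a Delta table.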
People Prime Worldwide
Location: Pune, Maharashtra, India
Salary: Not disclosed