Posted: 12 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Contractual

Job Description

Our client is a global technology company headquartered in Santa Clara, California. It focuses on helping organisations harness the power of data to drive digital transformation, enhance operational efficiency, and achieve sustainability, drawing on over 100 years of experience in operational technology (OT) and more than 60 years in IT to unlock the power of data from businesses, people and machines. We help enterprises store, enrich, activate and monetise their data to improve their customers’ experiences, develop new revenue streams and lower their business costs. Over 80% of the Fortune 100 trust our client for data solutions.

The company’s consolidated revenues for fiscal 2024 (ended March 31, 2024) were approximately USD 57.5 billion, and it has approximately 296,000 employees worldwide. It delivers digital solutions utilising Lumada across five sectors (Mobility, Smart Life, Industry, Energy and IT) to increase its customers’ social, environmental and economic value.


Job Title:


Location: Pune


Experience: 4-9 years


Job Type:


Notice Period:


Mandatory Skills: Azure Function App, Azure Databricks, PySpark.


Good experience with Azure Function App and Azure Databricks (PySpark).

SQL, T-SQL, Data Lake Storage, Synapse, Event Hubs, Key Vault

Key Responsibilities:

• Design and implement scalable data pipelines using Azure Databricks (PySpark/Scala).

• Develop and deploy Azure Function Apps for event-driven data processing and automation.

• Integrate Databricks with other Azure services like Data Lake Storage, Synapse, Event Hubs, Key Vault, etc.

• Optimize Spark jobs for performance and cost-efficiency.

• Implement CI/CD pipelines using Azure DevOps for Databricks notebooks and Function Apps.

• Monitor and troubleshoot production workloads, ensuring high availability and reliability.

• Collaborate with data scientists, analysts, and business stakeholders to deliver data solutions.

Required Skills & Qualifications:

• Strong experience with Azure Databricks and Apache Spark (PySpark or Scala).

• Proficiency in developing Azure Function Apps using C#, Python, or JavaScript.

• Solid understanding of Azure Data Lake, Event Hubs, Blob Storage, and Key Vault.

• Experience with CI/CD pipelines, Git, and Infrastructure as Code (ARM/Bicep/Terraform).

• Familiarity with data modeling, ETL/ELT processes, and data governance.

• Strong problem-solving skills and ability to work in an agile environment.
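The CI/CD requirement above could take a shape like the following Azure Pipelines sketch, which imports Databricks notebooks with the legacy Databricks CLI and deploys a Function App. All names, paths and the service connection are placeholders, not details from the posting.

```yaml
# Hypothetical azure-pipelines.yml: on every push to main, deploy
# Databricks notebooks and a Function App. Variable names, paths and
# the service-connection name are placeholders.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "3.10"

  - script: |
      pip install databricks-cli
      databricks workspace import_dir notebooks/ /Shared/pipelines --overwrite
    env:
      DATABRICKS_HOST: $(databricksHost)    # pipeline variable (placeholder)
      DATABRICKS_TOKEN: $(databricksToken)  # secret variable (placeholder)
    displayName: Deploy Databricks notebooks

  - task: AzureFunctionApp@2
    inputs:
      azureSubscription: my-service-connection   # placeholder
      appType: functionAppLinux
      appName: my-function-app                   # placeholder
      package: $(System.DefaultWorkingDirectory)/functionapp
    displayName: Deploy Function App
```

In practice teams increasingly use Databricks Repos or asset bundles instead of raw `import_dir`, but the pipeline structure is similar.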

Preferred Qualifications:

• Microsoft Certified: Azure Data Engineer Associate or similar certifications.

• Experience with Delta Lake, MLflow, or Unity Catalog.

• Knowledge of monitoring tools like Azure Monitor, Log Analytics, and Application Insights.
