Senior Manager

0 years

6 - 10 Lacs

Posted: 11 hours ago | Platform: GlassDoor


Work Mode

On-site

Job Type

Part Time

Job Description

Senior Manager

EXL/SM/1450065

    Healthcare Analytics, Noida
    Posted On
    13 Aug 2025
    End Date
    27 Sep 2025
    Required Experience
    NA

Basic Section

Number Of Positions

2

Band

C2

Band Name

Senior Manager

Cost Code

D011198

Campus/Non Campus

-

Employment Type

Permanent

Requisition Type

New

Max CTC

1,500,000 - 1,800,000

Complexity Level

Not Applicable

Work Type

Hybrid – Working Partly From Home And Partly From Office

Organisational

Group

Analytics

Sub Group

Healthcare

Organization

Healthcare Analytics

LOB

Healthcare D&A

SBU

Healthcare Analytics

Country

India

City

Noida

Center

Noida-SEZ BPO Solutions

Skills

Skill

AZURE DATABRICKS

AZURE CLOUD

SQL

PYTHON

PYSPARK

CI/CD PIPELINE

Minimum Qualification

B.TECH/B.E

Certification

No data available

Job Description

Job Summary:


The Databricks Architect is responsible for designing, implementing, and optimizing scalable data solutions on the Databricks platform. The role involves working closely with cross-functional teams to deliver high-performance, secure, and reliable data architectures that support advanced analytics, machine learning, and business intelligence.


Key Responsibilities:

  • Design Databricks-based data architectures and models to meet business requirements.

  • Develop event-driven and scalable GenAI application architectures.

  • Refactor and rehost legacy applications onto cloud platforms.

  • Design, develop, and optimize data pipelines and ETL processes using Azure Databricks (PySpark/Spark SQL); a minimal sketch follows this list.

  • Implement data security best practices and ensure compliance with industry standards.

  • Collaborate with data architects, analysts, and other developers to deliver data solutions aligned with business requirements.

  • Translate business requirements into technical solutions.

  • Perform data wrangling, cleansing, transformation, and aggregation from multiple sources.

  • Implement and maintain data lake and data warehouse solutions using Azure services (ADLS, Synapse, Delta Lake).

  • Monitor pipeline performance, troubleshoot issues, and ensure data integrity and reliability.

  • Develop and maintain CI/CD pipelines for Databricks workflows and jobs.

  • Participate in code reviews, unit testing, and documentation of data processes.
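
To ground the pipeline-related items above, here is a minimal PySpark sketch of the kind of ETL step the role describes. It is illustrative only, not a prescribed implementation: the ADLS path, column names, and Delta table name are hypothetical placeholders.

    # Minimal PySpark ETL sketch for Azure Databricks (illustrative only).
    # The ADLS path, columns, and table name are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # available as `spark` in a Databricks notebook

    raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/claims/"  # hypothetical ADLS source
    target_table = "healthcare.silver_claims"                             # hypothetical Delta table

    # Ingest raw CSV files, cleanse, and deduplicate.
    claims = (
        spark.read.option("header", "true").csv(raw_path)
        .withColumn("claim_amount", F.col("claim_amount").cast("double"))
        .dropDuplicates(["claim_id"])
        .filter(F.col("claim_amount").isNotNull())
    )

    # Aggregate to daily totals per provider.
    daily_totals = (
        claims.groupBy("provider_id", F.to_date("claim_date").alias("claim_date"))
        .agg(F.sum("claim_amount").alias("total_amount"),
             F.count("claim_id").alias("claim_count"))
    )

    # Persist as a Delta table so downstream analytics and BI can query it.
    daily_totals.write.format("delta").mode("overwrite").saveAsTable(target_table)

In practice a step like this would typically be packaged as a notebook or wheel and scheduled as a Databricks job, which is what the CI/CD pipelines mentioned above would build, test, and deploy.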


Required Skills & Experience:

  • Strong experience with Databricks, Delta Lake, PySpark/Scala Spark, and Unity Catalog.
  • Proficiency in Python, SQL, and familiarity with relational and NoSQL databases.
  • Experience with cloud platforms (Azure, AWS, GCP) and Infrastructure as Code tools.
  • Knowledge of CI/CD, DevSecOps, Kafka, MLflow, and Structured Streaming (see the streaming sketch after this list).
  • Solid understanding of Azure Data Services: Azure Data Lake Storage (ADLS), Azure Data Factory (ADF).
  • Strong problem-solving skills and ability to work independently or as part of a team.
  • Databricks certifications (e.g., Certified Data Engineer, Associate Developer) preferred.
  • Excellent communication skills.
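
As a companion to the Structured Streaming and Delta Lake items above, the following is a short sketch of an incremental load with Spark Structured Streaming on Databricks. The source table, target table, and checkpoint path are hypothetical assumptions, used only to illustrate the pattern.

    # Minimal Structured Streaming sketch (illustrative; table names and paths are hypothetical).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Read new rows incrementally from a bronze Delta table as a stream.
    events = spark.readStream.table("healthcare.bronze_events")

    # Keep valid records and stamp them with a processing time.
    cleaned = (
        events.filter(F.col("member_id").isNotNull())
              .withColumn("processed_at", F.current_timestamp())
    )

    # Append to a silver Delta table; the checkpoint makes the stream restartable.
    query = (
        cleaned.writeStream.format("delta")
               .option("checkpointLocation",
                       "abfss://checkpoints@examplestorage.dfs.core.windows.net/events/")
               .outputMode("append")
               .toTable("healthcare.silver_events")
    )
    # query.awaitTermination()  # block when running as a standalone job
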
Workflow

Workflow Type

L&S-DA-Consulting
