Senior Data Engineer (Databricks | Insurance | Data Migration)

Experience: 12 years

Posted: 4 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Location: Kolkata, West Bengal

Experience: 7–12 years

Role Overview

Level: Senior Manager / Manager


Key Responsibilities

Databricks Data Engineering & Management

  • Design, build, and optimize Silver and Gold layer data pipelines in Databricks using PySpark, SQL, Delta Lake, and workflow orchestration (see the pipeline sketch after this list).
  • Implement data quality, lineage, schema evolution, and governance controls across curated layers.
  • Optimize Databricks jobs for performance, scalability, and cost efficiency.
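
For illustration, a minimal sketch of the kind of Silver-to-Gold pipeline described above, assuming a Databricks runtime with Delta Lake available; every table, column, and job name here (insurance_silver.claims, insurance_gold.claims_monthly_summary, and so on) is invented for the example and is not a project detail.

```python
# Minimal Silver-to-Gold sketch; names are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("silver_to_gold_claims").getOrCreate()

# Read the curated Silver table (assumed name).
silver = spark.read.table("insurance_silver.claims")

# Apply a basic quality filter, then aggregate to a monthly reporting grain.
gold = (
    silver
    .filter(F.col("claim_id").isNotNull())
    .withColumn("report_month", F.date_trunc("month", F.col("loss_date")))
    .groupBy("report_month", "line_of_business")
    .agg(
        F.count("claim_id").alias("claim_count"),
        F.sum("incurred_amount").alias("total_incurred"),
    )
)

# Persist the Gold aggregate as a Delta table, partitioned for query pruning.
(
    gold.write.format("delta")
    .mode("overwrite")
    .partitionBy("report_month")
    .saveAsTable("insurance_gold.claims_monthly_summary")
)
```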

Guidewire → Databricks Migration

  • Lead the end-to-end migration of large-scale insurance data from Guidewire PolicyCenter/ClaimCenter/BillingCenter into Databricks.
  • Map and transform complex Guidewire entity structures into normalized and star-schema models (a mapping sketch follows this list).
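
Guidewire's physical schema varies by version and implementation, so the following mapping sketch treats the source table pc_policyperiod, its columns, and the status code purely as assumptions about a typical PolicyCenter extract.

```python
# Hedged sketch: flattening an assumed Guidewire PolicyCenter extract into a
# star-schema policy dimension. Source names are illustrative, not definitive.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("guidewire_policy_dim").getOrCreate()

raw = spark.read.table("guidewire_bronze.pc_policyperiod")

dim_policy = (
    raw
    .filter(F.col("status") == "bound")  # keep bound periods (assumed code)
    .select(
        F.col("publicid").alias("policy_natural_key"),
        F.col("policynumber").alias("policy_number"),
        F.col("periodstart").alias("effective_date"),
        F.col("periodend").alias("expiration_date"),
    )
    # Deterministic surrogate key for star-schema joins.
    .withColumn("policy_sk", F.xxhash64("policy_natural_key", "effective_date"))
)

dim_policy.write.format("delta").mode("overwrite").saveAsTable(
    "insurance_gold.dim_policy"
)
```

A deterministic hash key, rather than a sequence, keeps dimension loads idempotent across re-runs.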

Data Modelling & Architecture

  • Develop robust logical and physical data models aligned to insurance business processes.
  • Build high-quality curated data marts (Gold) for analytics, reporting, pricing, underwriting, and claims (see the fact-table sketch after this list).
  • Define standards for metadata, naming conventions, partitioning, and model documentation.
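
A hedged sketch of a Gold fact-table build at an assumed one-row-per-claim grain, reusing the invented table names from the earlier examples.

```python
# Fact build joining an assumed Silver claims table to the policy dimension;
# grain, keys, and measures are illustrative assumptions only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fact_claim_build").getOrCreate()

claims = spark.read.table("insurance_silver.claims")
dim_policy = spark.read.table("insurance_gold.dim_policy")

fact_claim = (
    claims.join(dim_policy, "policy_number", "left")
    .select(
        "policy_sk",        # foreign key to dim_policy
        "claim_id",
        "loss_date",
        "incurred_amount",
        "paid_amount",
    )
)

fact_claim.write.format("delta").mode("overwrite").saveAsTable(
    "insurance_gold.fact_claim"
)
```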

Insurance Domain Expertise

  • Understand core insurance data entities such as policy, claims, billing, customer, underwriting, rating, and product hierarchies.
  • Apply domain knowledge to rationalize Guidewire data structures and create business-ready datasets.

Solutioning & Ideation

  • Collaborate with client SMEs, architects, and business analysts to shape data solutions and propose design improvements.
  • Ideate, simplify complex data flows, and contribute to overall solution architecture.

Required Skills & Experience

Technical

  • 7–12 years of experience in data engineering, data modelling, and data management.
  • Strong hands-on experience in Databricks, Delta Lake, PySpark, Spark SQL, and ETL/ELT pipelines.
  • Expertise in logical & physical data modelling (3NF, Star Schema, Data Vault preferred).
  • Practical knowledge of the Guidewire data model and prior migration experience (mandatory).
  • Experience working with large-scale insurance datasets.
  • Strong understanding of data quality frameworks, lineage, cataloging, and governance (a constraints sketch follows this list).
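
As one way such controls can look in practice, a short sketch using Delta Lake CHECK constraints and a schema-evolution setting; the table name is carried over from the illustrative examples above and is not a project detail.

```python
# Declarative quality and schema-evolution controls on a Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dq_controls").getOrCreate()

# A Delta CHECK constraint rejects non-conforming writes at the table level.
spark.sql("""
    ALTER TABLE insurance_silver.claims
    ADD CONSTRAINT non_negative_incurred CHECK (incurred_amount >= 0)
""")

# Allow additive schema evolution on appends/merges rather than failing.
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")
```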

Soft Skills

  • Strong problem-solving and conceptualization/ideation capability.
  • Excellent communication and stakeholder-management skills for a UK client environment.
  • Ability to work in fast-paced delivery tracks with cross-functional global teams.

Preferred Qualifications

  • Certifications in Databricks, Azure/AWS, and Data Migration are added advantages.
  • Experience delivering enterprise-grade data lakes or lakehouse architectures.

Why Join This Role?

  • Work on a flagship insurance data modernisation project for a top UK carrier.
  • Opportunity to shape enterprise-scale data models on the Databricks Lakehouse.
  • High-visibility role with strong career growth in insurance data engineering.

PwC India

Business Consulting and Services

Kolkata, West Bengal
