Technical Lead – Data Platforms

Experience: 8 years

Posted: 6 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

🔹 Position:
📍 Location:
🕛 Shift Timing:
🕒 Experience:
📅 Immediate Joiners Preferred


About the Role

Technical Lead


Key Responsibilities

  • Lead the end-to-end design and development of robust data pipelines using Snowflake and SQL Server.
  • Architect and implement Medallion Architecture (Bronze, Silver, Gold layers) for structured and semi-structured data.
  • Drive the adoption of Data Mesh principles to promote domain-oriented, decentralized data ownership.
  • Collaborate with data analysts, scientists, and business teams to translate requirements into scalable solutions.
  • Ensure data quality, governance, and lineage across all data assets.
  • Optimize data workflows for performance, scalability, and cost efficiency.
  • Mentor and guide data engineers, fostering a culture of technical excellence and innovation.
  • Stay current with emerging data engineering technologies and recommend continuous improvements.
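For context, the Medallion Architecture named above organizes data into a Bronze layer (raw records ingested as-is), a Silver layer (validated and conformed records), and a Gold layer (business-level aggregates). A minimal Python sketch of that flow follows; all table names, fields, and records are hypothetical illustrations, not part of this posting:

```python
# Illustrative Medallion Architecture flow: Bronze -> Silver -> Gold.
# All records and field names below are hypothetical examples.

# Bronze: raw, semi-structured records ingested as-is (may contain bad rows).
bronze = [
    {"policy_id": "P-001", "premium": "1200.50", "region": "west"},
    {"policy_id": "P-002", "premium": "n/a",     "region": "west"},  # malformed
    {"policy_id": "P-003", "premium": "800.00",  "region": "east"},
]

def to_silver(rows):
    """Silver: type-cast, standardize, and drop records that fail validation."""
    silver = []
    for row in rows:
        try:
            silver.append({
                "policy_id": row["policy_id"],
                "premium": float(row["premium"]),
                "region": row["region"].upper(),
            })
        except (ValueError, KeyError):
            continue  # skip (or quarantine) rows that fail validation
    return silver

def to_gold(rows):
    """Gold: business-level aggregate, here total premium per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["premium"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'WEST': 1200.5, 'EAST': 800.0}
```

In practice each layer would be a Snowflake table populated by an orchestrated ELT job rather than in-memory lists, but the layering discipline (raw, cleaned, aggregated) is the same.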


Qualifications & Skills

  • Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • Experience: 8–12 years in data engineering, including 3+ years in a technical leadership role.
  • Core Skills:
  • Strong hands-on experience with Snowflake (data ingestion, transformation, performance tuning).
  • Advanced proficiency in SQL Server and T-SQL for complex data operations.
  • Deep understanding of Medallion Architecture and Data Mesh principles.
  • Experience with ELT/ETL tools, Git, and CI/CD pipelines.
  • Familiarity with data orchestration tools (Airflow, dbt) and cloud platforms (AWS or Azure).
  • Strong analytical, problem-solving, and leadership abilities.
  • Good to Have:
  • Experience with Kafka, Spark Streaming, AWS, dbt, or Prompt Engineering.
  • Proficiency in Python or .NET for data processing.
  • Insurance domain knowledge is highly preferred.


Why Zywave

Join a global leader in cloud-based solutions for the insurance and employee benefits industry. At Zywave, you’ll design data systems that power insights and decisions for thousands of businesses worldwide — all within a collaborative, innovation-driven environment.


📧 To Apply: Send your resume.

📌 Email Subject:

(Only candidates based in Pune or willing to relocate to Pune – Kharadi location – will be considered.)
