Senior Data Platform Engineer

Experience: 8 years
Work Mode: On-site
Job Type: Full Time

Job Description

Brief Description

Job Title: Senior Data Platform Engineer
Location: Pune, India
Work Mode: Work From Office (WFO), 5 Days a Week
Shift Timing: 12:00 PM – 9:00 PM IST

About Zywave

Zywave is a leading provider of InsurTech solutions, empowering insurance brokers and agencies with innovative software tools to grow and manage their business. We are building a modern data platform to deliver scalable, secure, and high-performance solutions that drive actionable insights across the organization.

Job Summary

We are looking for a highly skilled and experienced Senior Data Platform Engineer to lead the design, development, and optimization of our enterprise data platform. The ideal candidate will have deep expertise in Snowflake, ELT pipelines, DBT, and Azure Data Factory, and will play a key role in enabling data-driven decision-making across Zywave.

Key Responsibilities

  • Design and implement scalable ELT pipelines using DBT and Azure Data Factory to ingest, transform, and load data into Snowflake.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver robust data models.
  • Optimize Snowflake performance through clustering, partitioning, and query tuning.
  • Develop and maintain reusable DBT models and documentation for data consistency and transparency.
  • Ensure data quality, governance, and security across the platform.
  • Monitor and troubleshoot pipeline issues; implement proactive and scalable solutions.
  • Lead code reviews, mentor junior engineers, and drive best practices in data engineering.
  • Stay current with emerging technologies and recommend enhancements to the data platform architecture.

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
  • 8+ years of experience in data engineering or data platform development.
  • Strong hands-on expertise in Snowflake, DBT, Azure Data Factory, and ELT pipeline design.
  • Proficiency in SQL and Python for data manipulation and automation.
  • Experience with CI/CD tools and version control systems (e.g., Git).
  • Familiarity with data governance, security, and compliance standards.
  • Strong problem-solving skills and the ability to work both independently and in teams.
  • Solid understanding of data warehousing concepts and dimensional modeling.
  • Exposure to Tableau, Power BI, or similar visualization tools is a plus.
  • Snowflake certification is an advantage.

Skills

Mandatory:

Git, Snowflake, DBT, Python, SQL

Good to Have:

Azure Data Factory, ETL, AWS Glue, Tableau, Power BI, Prompt Engineering

Domain Knowledge:

Insurance domain exposure preferred

