Data Engineer

Experience: 3 - 5 years


Posted: 2 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

About The Company

Independent for over 70 years, Milliman delivers market-leading services and solutions to clients worldwide. Today, we are helping companies take on some of the world's most critical and complex issues, including retirement funding and healthcare financing, risk management and regulatory compliance, and data analytics and business transformation. Through a team of professionals ranging from actuaries to clinicians and technology specialists to plan administrators, we offer unparalleled expertise in employee benefits, investment consulting, healthcare, life insurance and financial services, and property and casualty insurance.

The Department

The Life & Annuity Predictive Analytics (LAPA) business unit is a lean, agile, diverse, and geographically distributed data science startup within Milliman. Our team consists of professionals with varied backgrounds, including data scientists, data engineers, software engineers/developers, and actuarial domain experts. We help insurers and distributors of life and retirement products understand and use their own data, industry data, and customer data to advance their competitive position and improve financial outcomes. Through our powerful combination of subject matter expertise, data management, and advanced analytics, we provide our clients with tools to analyze their business performance, manage risk, and generate new business leads to facilitate more profitable growth.

The Role

As a Data Engineer on the LAPA team, you will be responsible for designing and implementing data pipelines using industry-leading cloud platforms such as Databricks and orchestration tools such as Azure Data Factory. You will use programming languages such as Python, R, or SQL to automate ETL, analytics, and data quality processes from the ground up. You will design and implement complex data models and metadata, build reports and dashboards, and own the data presentation and dashboarding tools for the end users of our data products and systems. You will work with leading-edge technologies such as Databricks, Azure Data Lake, Azure Data Factory, Snowflake, and more, and you will write scalable, highly tuned SQL/PySpark code running over millions of rows of data; a brief sketch of that kind of tuning follows this paragraph. You will work closely with other data scientists, data engineers, software engineers/developers, and domain experts to continuously improve our data collection, data cleaning, data analysis, predictive modeling, data visualization, and application development. You will also investigate, evaluate, and present new technologies and processes for the team to use.
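
To make the tuning point concrete, the following is a minimal PySpark sketch of two common levers for workloads at this scale: a broadcast join and an early partition filter. The table and column names (claims, product_dim, claim_year, claim_amount) are hypothetical placeholders, not an actual Milliman schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

# Hypothetical tables: a large fact table and a small dimension table.
claims = spark.read.table("claims")
products = spark.read.table("product_dim")

# Broadcasting the small side avoids shuffling millions of fact rows,
# a common first lever when tuning Spark joins.
joined = claims.join(F.broadcast(products), on="product_id", how="left")

# Filter on the partition column early so Spark prunes files it never reads.
summary = (
    joined
    .filter(F.col("claim_year") == 2024)
    .groupBy("product_line")
    .agg(F.sum("claim_amount").alias("total_claims"))
)
summary.show()
```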

You Will

  • Design, build, and manage reliable ETL pipelines using PySpark and Databricks for life and annuity data products (a sketch of such a pipeline follows this list)
  • Implement automated data quality checks to ensure accuracy, completeness, and consistency of data
  • Deploy data pipelines to production and monitor them for performance, reliability, and data issues
  • Collaborate with actuaries, analysts, and data scientists to deliver clean, usable, and secure data
  • Support AI and machine learning teams by preparing model-ready datasets and contributing to data-driven use cases
  • Follow engineering best practices such as code reviews, automation, and efforts to reduce technical debt
  • Document data workflows, business logic, and best practices to support internal knowledge sharing
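
As referenced in the first bullet, here is a hedged sketch of a small ETL step with an automated quality gate, in the PySpark style this role describes. All table and column names (policy_raw, policy_clean, premium, and so on) are invented for illustration, and the 95% pass threshold is an arbitrary example value.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-etl-sketch").getOrCreate()

# Extract: read the raw table (names here are invented for illustration).
raw = spark.read.table("policy_raw")

# Transform: standardize types and drop records that fail basic rules.
clean = (
    raw
    .withColumn("premium", F.col("premium").cast("double"))
    .withColumn("issue_date", F.to_date("issue_date", "yyyy-MM-dd"))
    .filter(F.col("policy_id").isNotNull() & (F.col("premium") >= 0))
)

# Data quality gate: fail the run if too many rows were rejected, so bad
# loads never reach downstream consumers silently.
total, kept = raw.count(), clean.count()
if total > 0 and kept / total < 0.95:
    raise ValueError(f"Quality gate failed: only {kept}/{total} rows passed")

# Load: publish the validated result for downstream use.
clean.write.mode("overwrite").saveAsTable("policy_clean")
```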

Job Knowledge, Experience & Skills

Job Knowledge Required
  • Bachelor's degree in Computer Science, Engineering, or any STEM-related field
  • 3-5 years of hands-on experience in data engineering or data science roles
  • Strong programming skills in Python, PySpark, and optionally R
  • Proficient in SQL, including data modeling, performance tuning, and query optimization
  • Experience building ETL/ELT pipelines and implementing data quality checks
  • Hands-on expertise with Apache Spark, Databricks, and cloud data tools (preferably Azure Data Factory, Data Lake, Synapse)
  • Familiarity with cloud data warehouses and large-scale data processing
  • Understanding of DevOps practices and use of version control tools like Git in data engineering workflows
  • Knowledge of data governance, metadata management, and secure handling of PII data
  • Basic understanding of AI/ML concepts and how data engineering supports AI-driven use cases

Experience And Soft Skills Required

  • Passion for technology and growth; self-motivated, energetic, organized, driven, and results-oriented
  • Ability to work in a highly collaborative, Agile environment with a strong desire to learn
  • Ability to take ownership of a technical challenge and see it through to a successful conclusion
  • Commitment to continuous education to be equipped to lead continuous process improvement
  • Excellent written and verbal communication skills
  • Ability to manage competing priorities and deadlines

Additional Knowledge And Skills To Build

  • Sharp critical thinking skills, sound judgment and decision-making ability, and both the ability and willingness to clearly articulate your ideas
  • Experience in ETL optimization, writing custom PySpark functions (UDFs), and tuning PySpark or Spark SQL code (see the first sketch after this list)
  • Experience with DAG orchestration using tools such as Azure Data Factory, Airflow, dbt, or Prefect, and with related technologies such as Delta Lake and Kafka (a minimal orchestration sketch also follows this list)
  • Experience with data management concerns such as data lineage, data governance, data quality, and feature stores
  • Knowledge of data engineering best practices and industry-standard methodologies
  • Experience with CI/CD pipelines, Git, and DevOps practices
  • Interest in building AI-driven solutions
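
The first sketch referenced above contrasts a custom PySpark UDF with equivalent built-in column expressions, which is the usual starting point when tuning: built-ins stay inside the JVM and the Catalyst optimizer, while a Python UDF forces row serialization out to Python workers. The premium bands and column names are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-tuning-sketch").getOrCreate()
df = spark.createDataFrame(
    [(1, 120.0), (2, 950.0), (3, 4200.0)], ["policy_id", "premium"]
)

# Option 1: a Python UDF. Flexible, but each row is serialized out to a
# Python worker, so Catalyst cannot optimize the expression.
@F.udf(returnType=StringType())
def premium_band(premium: float) -> str:
    if premium < 500:
        return "low"
    return "mid" if premium < 2000 else "high"

df.withColumn("band", premium_band("premium")).show()

# Option 2: the same logic as native column expressions. This stays inside
# the JVM and the optimizer, and is usually the faster choice.
df.withColumn(
    "band",
    F.when(F.col("premium") < 500, "low")
     .when(F.col("premium") < 2000, "mid")
     .otherwise("high"),
).show()
```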
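And a minimal orchestration sketch, assuming Airflow 2.x as the DAG tool (Azure Data Factory, the team's primary orchestrator, is configured through its own interface rather than in Python). The DAG id, schedule, and task bodies are all placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw policy files")         # placeholder task body

def transform():
    print("clean, validate, and conform")  # placeholder task body

def load():
    print("publish to the lakehouse")      # placeholder task body

# A three-step linear DAG, run daily: extract -> transform -> load.
with DAG(
    dag_id="policy_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```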

Milliman

Consulting, Actuarial Services

Seattle
