Big Data Engineer

4 - 9 years

20 - 35 Lacs

Posted: 2 days ago | Platform: Naukri


Work Mode

Hybrid

Job Type

Full Time

Job Description

Position Summary

MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future-ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the Global D&A team, with an objective to drive business value through trusted data, scaled capabilities, and actionable insights.

Role Value Proposition

MetLife Global Capability Center (MGCC) is looking for a Senior Cloud Data Engineer responsible for building ETL/ELT pipelines, data warehousing solutions, and reusable components using Azure, Databricks, and Spark. He/she will collaborate with business systems analysts, technical leads, project managers, and business/operations teams to build data enablement solutions across different LOBs and use cases.

Job Responsibilities

  • Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes
  • Develop metadata- and configuration-based reusable frameworks to reduce development effort
  • Develop quality code with performance optimizations built in at the development stage
  • Collaborate with the global team to drive project delivery and recommend development and performance improvements
  • Extensive experience with various database types and the knowledge to leverage the right one for the need
  • Strong understanding of data tools and the ability to leverage them to understand data and generate insights
  • Hands-on experience in building/designing at-scale data lakes, data warehouses, and data stores for analytics consumption, on-premises and in the cloud (real-time as well as batch use cases)
  • Ability to interact with business analysts and functional analysts to gather requirements and implement ETL solutions
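The "metadata- and configuration-based reusable frameworks" responsibility above can be sketched in miniature. This is a hypothetical, minimal pure-Python illustration of a configuration-driven transform step; in this role the same pattern would typically run as a PySpark job on Databricks, and all field and config names here are invented for the example.

```python
# Minimal sketch of a configuration-driven transform step.
# All names (CONFIG, cust_nm, pol_no, ...) are illustrative, not from the posting.

CONFIG = {
    "rename": {"cust_nm": "customer_name", "pol_no": "policy_number"},
    "required": ["customer_name", "policy_number"],
}

def transform(rows, config):
    """Apply rename rules from config, then drop rows missing required fields."""
    out = []
    for row in rows:
        renamed = {config["rename"].get(k, k): v for k, v in row.items()}
        if all(renamed.get(col) is not None for col in config["required"]):
            out.append(renamed)
    return out

raw = [
    {"cust_nm": "A. Rao", "pol_no": "P-100"},
    {"cust_nm": "B. Iyer", "pol_no": None},  # dropped: missing policy_number
]
clean = transform(raw, CONFIG)
```

Because the rename and validation rules live in configuration rather than code, new sources can be onboarded by editing metadata instead of rewriting the pipeline, which is the effort reduction the responsibility describes.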

Education, Technical Skills & Other Critical Requirements

Education

Bachelor's degree in Computer Science, Engineering, or a related discipline

Experience (in years)

4 to 10 years of working experience on Azure Cloud using Databricks or Synapse

Technical Skills

1. Experience in transforming data using Python, Spark, or Scala
2. Azure Data Factory, Databricks Workflows, Azure Synapse, Cosmos DB, Spark (Scala/Python), Databricks
3. Synapse dedicated SQL pools and serverless SQL pools
4. Scripting experience, primarily in shell/bash/PowerShell, is desirable


MetLife

Insurance and Financial Services

New York
