Big Data Developer

Experience: 8 - 10 years


Posted: 3 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Position Summary

MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future-ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the global D&A team, with an objective to drive business value through trusted data, scaled capabilities, and actionable insights.


Role Value Proposition

MetLife Global Capability Center (MGCC) is looking for a Senior Cloud Data Engineer responsible for building ETL/ELT pipelines, data warehouses, and reusable components using Azure, Databricks, and Spark. He/she will collaborate with business systems analysts, technical leads, project managers, and business/operations teams to build data enablement solutions across different LOBs and use cases.

Job Responsibilities

  • Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes (a minimal PySpark sketch follows this list)
  • Develop metadata- and configuration-driven reusable frameworks to reduce development effort
  • Develop quality code with performance optimizations built in at the development stage
  • Collaborate with global teams to drive project delivery and recommend development and performance improvements
  • Apply extensive experience with various database types to choose the right one for each need
  • Use a strong understanding of data tools to explore data and generate insights
  • Build and design at-scale data lakes, data warehouses, and data stores for analytics consumption, on-premises and in the cloud (real-time as well as batch use cases)
  • Interact with business analysts and functional analysts to gather requirements and implement ETL solutions
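For context on the metadata-driven ETL work described above, here is a minimal sketch of a config-driven PySpark job of the kind such a reusable framework might run. The storage path, table names, and config layout are hypothetical assumptions for illustration, not an actual MetLife framework.

```python
# Minimal sketch of a metadata/config-driven ETL job in PySpark.
# All paths, table names, and the config layout are hypothetical.
from pyspark.sql import SparkSession, functions as F

# Hypothetical per-dataset metadata that a reusable framework might read
# from a config store instead of hard-coding in each job.
CONFIG = {
    "source_path": "abfss://raw@account.dfs.core.windows.net/claims/",
    "source_format": "parquet",
    "dedupe_keys": ["claim_id"],
    "partition_col": "ingest_date",
    "target_table": "curated.claims",
}

def run_etl(cfg: dict) -> None:
    spark = SparkSession.builder.appName("config-driven-etl").getOrCreate()

    # Extract: read the raw dataset described by the metadata.
    df = spark.read.format(cfg["source_format"]).load(cfg["source_path"])

    # Transform: trivial example steps -- dedupe on the business keys and
    # stamp an ingestion date for partitioning.
    df = (
        df.dropDuplicates(cfg["dedupe_keys"])
          .withColumn(cfg["partition_col"], F.current_date())
    )

    # Load: write to a curated Delta table, partitioned for consumption.
    (
        df.write.format("delta")
          .mode("append")
          .partitionBy(cfg["partition_col"])
          .saveAsTable(cfg["target_table"])
    )

if __name__ == "__main__":
    run_etl(CONFIG)
```

Keeping the dataset-specific details in metadata like this lets one generic job serve many sources, which is the point of the reusable-framework responsibility above.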


Education, Technical Skills & Other Critical Requirements

Education

Bachelor’s degree in Computer Science, Engineering, or a related discipline

Experience

8 to 10 years of working experience on Azure Cloud using Databricks or Synapse

Technical Skills

  1. Experience in transforming data using Python, Spark, or Scala.
  2. Technical depth in Cloud Architecture Framework, Lakehouse Architecture, and OneLake solutions.
  3. Experience in implementing data ingestion and curation processes on Azure with tools such as Azure Data Factory, Databricks Workflows, Azure Synapse, Cosmos DB, Spark (Scala/Python), and Databricks.
  4. Experience writing cloud-optimized code on Azure using Databricks, Synapse dedicated and serverless SQL pools, and the Cosmos DB SQL API, including loading and consumption optimizations.
  5. Scripting experience, primarily in shell/Bash/PowerShell, is desirable.
  6. Experience in writing SQL and performing data analysis for data anomaly detection and data quality assurance (a minimal sketch follows this list).
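As a rough illustration of item 6, the sketch below runs basic data-quality and anomaly checks with Spark SQL. The table (curated.claims) and columns (claim_id, claim_amount) are hypothetical, reused from the earlier sketch, and the 3-sigma rule is just one simple choice of anomaly test.

```python
# Sketch of simple data-quality / anomaly checks with PySpark SQL.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Null and duplicate checks on the business key.
dq = spark.sql("""
    SELECT
        COUNT(*)                            AS row_count,
        COUNT(*) - COUNT(claim_id)          AS null_keys,
        COUNT(*) - COUNT(DISTINCT claim_id) AS duplicate_keys
    FROM curated.claims
""").first()

# Simple anomaly check: flag amounts more than 3 standard deviations
# from the mean (a crude z-score rule, for illustration only).
outliers = spark.sql("""
    WITH stats AS (
        SELECT AVG(claim_amount) AS mu, STDDEV(claim_amount) AS sigma
        FROM curated.claims
    )
    SELECT c.claim_id, c.claim_amount
    FROM curated.claims c CROSS JOIN stats s
    WHERE ABS(c.claim_amount - s.mu) > 3 * s.sigma
""")

print(f"rows={dq.row_count}, null_keys={dq.null_keys}, dupes={dq.duplicate_keys}")
print(f"amount outliers: {outliers.count()}")
```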
