IN | Azure Data Engineer

3 - 6 years

4 - 8 Lacs

Posted: 2 months ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Summary / Role Value Proposition

The MetLife Data & Analytics organization is a team of expert technologists responsible for building big data platforms and data services with innovative technologies, enabling MetLife businesses to generate insights and value for their customers. The team is MetLife's center of excellence in data engineering and plays a key role in data enablement through multiple data stores supporting different kinds of analytical use cases, deriving predictive, prescriptive, and descriptive insights. The Azure Data Engineer III serves as a big data development expert within the data analytics engineering organization of MetLife Data & Analytics. This position is responsible for building ETL, data warehousing, and reusable components using cutting-edge big data and cloud technologies. The resource will collaborate with business systems analysts, technical leads, project managers, and business/operations teams to build data enablement solutions across different LOBs and use cases.

Key Responsibilities

• Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
• Develop reusable frameworks to reduce development effort and deliver cost savings for projects.
• Develop quality code with performance optimizations thought through at the development stage.
• Show an appetite for learning new technologies and be ready to work on cutting-edge cloud technologies.
• Work with teams spread across the globe to drive project delivery and recommend development and performance improvements.

Essential Business Experience and Technical Skills

• Ingesting huge volumes of data from various platforms for analytics needs and writing high-performance, reliable, and maintainable ETL code.
• Strong analytic skills related to working with unstructured datasets.
• Strong experience in building and designing data warehouses and data stores for analytics consumption, on-premises and in the cloud (real-time as well as batch use cases).
• Ability to interact with business analysts and functional analysts to gather requirements and implement ETL solutions.

Required

• 10+ years of solutions development and delivery experience.
• 5+ years of leadership experience in delivering enterprise-scale technology programs.
• Hands-on expertise in Azure SQL, Synapse, Cosmos DB, and Data Factory; Python, Spark, and Scala experience is a must.
• Building and implementing data ingestion and curation processes using cloud data tools such as Azure SQL, Synapse, Cosmos DB, Data Factory, Spark (Scala/Python), Databricks, Delta Lake, etc. (a minimal illustrative sketch follows this job description).
• Ingesting huge volumes of data from various platforms for reporting, analytics, data supply, and transactional (operational data store and APIs) needs.
• Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.

Additional Details

• Global Grade: C
• Level: To Be Defined
• Named Job Posting? (if Yes - needs to be approved by SCSC): No
• Remote work possibility: No
• Global Role Family: 60242 (P) Data Management
• Local Role Name: 60327 Data Engineer
• Local Skills: 6170 SQL
• Languages Required: English
• Role Rarity: To Be Defined
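For illustration only (not part of the original posting): a minimal sketch of the kind of batch ingestion-and-curation work listed under "Required", written in PySpark against Delta Lake. It assumes a Databricks-style environment where Delta Lake is preconfigured; the storage paths, column names, and filters are hypothetical placeholders, not part of the role description.

# Minimal batch ingestion/curation sketch in PySpark (illustrative only).
# Assumes Delta Lake is available (e.g. on Databricks); paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_curate_example").getOrCreate()

# Ingest raw files from a hypothetical ADLS landing zone.
raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://landing@examplestorage.dfs.core.windows.net/policies/")
)

# Basic curation: trim identifiers, standardize types, drop obvious anomalies.
curated = (
    raw
    .withColumn("policy_id", F.trim(F.col("policy_id")))
    .withColumn("premium", F.col("premium").cast("double"))
    .filter(F.col("policy_id").isNotNull() & (F.col("premium") > 0))
    .withColumn("ingest_ts", F.current_timestamp())
)

# Land the curated data as a Delta table for downstream analytics consumption.
(
    curated.write
    .format("delta")
    .mode("append")
    .save("abfss://curated@examplestorage.dfs.core.windows.net/policies_delta/")
)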

Growel Softech Pvt. Ltd.

Information Technology Services

Thane

Approximately 200 Employees

1989 Jobs

Key People

• Ravi Jha, Founder & CEO
• Sita Sharma, CTO
