Posted: 2 months ago
Work from Office
Full Time
Description

Job Summary / Role Value Proposition

MetLife's Data & Analytics organization is a team of expert technologists responsible for building big data platforms and data services with innovative technologies, enabling MetLife businesses to generate insights and value for their customers. The team is MetLife's center of excellence in data engineering and plays a key role in data enablement through multiple data stores that support different kinds of analytical use cases, deriving predictive, prescriptive, and descriptive insights.

The Azure Data Engineer III serves as a big data development expert within the data analytics engineering organization of MetLife Data & Analytics. The position is responsible for building ETL pipelines, data warehousing, and reusable components using cutting-edge big data and cloud technologies, and collaborates with business systems analysts, technical leads, project managers, and business/operations teams to build data enablement solutions across different LOBs and use cases.

Key Responsibilities
- Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
- Develop reusable frameworks that reduce development effort and deliver cost savings for projects.
- Develop quality code with performance optimizations thought through at the development stage.
- Show an appetite for learning new technologies and be ready to work on cutting-edge cloud technologies.
- Work with a team spread across the globe to drive project delivery, and recommend development and performance improvements.

Essential Business Experience and Technical Skills
- Ingesting huge volumes of data from various platforms for analytics needs, and writing high-performance, reliable, and maintainable ETL code.
- Strong analytic skills for working with unstructured datasets.
- Strong experience building and designing data warehouses and data stores for analytics consumption, on-premises and in the cloud (real-time as well as batch use cases).
- Ability to work with business analysts and functional analysts to gather requirements and implement ETL solutions.

Required
- 10+ years of solutions development and delivery experience.
- 5+ years of leadership experience delivering enterprise-scale technology programs.
- Hands-on expertise in Azure SQL, Synapse, Cosmos DB, and Data Factory; Python, Spark, and Scala experience is a must.
- Building and implementing data ingestion and curation processes with cloud data tools such as Azure SQL, Synapse, Cosmos DB, Data Factory, Spark (Scala/Python), Databricks, Delta Lake, etc. (see the sketch at the end of this posting).
- Ingesting huge volumes of data from various platforms for reporting, analytics, data supply, and transactional (operational data store and API) needs.
- Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.

Additional Details
- Global Grade: C
- Level: To be defined
- Named job posting? (if yes, needs to be approved by SCSC): No
- Remote work possibility: No
- Global Role Family: 60242 (P) Data Management
- Local Role Name: 60327 Data Engineer
- Local Skills: 6170 SQL
- Languages Required: English
- Role Rarity: To be defined
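By way of illustration, below is a minimal PySpark sketch of the ingestion-and-curation pattern the requirements describe: read raw extracts, apply simple data quality rules, and land curated rows in a Delta Lake table. This is a sketch of the pattern only, not MetLife's actual pipeline; the paths, the policy_id column, and the quality rule are hypothetical, and it assumes a Spark session with Delta Lake support (as on Databricks).

# Minimal batch ingestion-and-curation sketch (hypothetical paths and
# columns; assumes a Delta-enabled Spark session, e.g. on Databricks).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-ingestion-sketch").getOrCreate()

# Ingest: read one day's raw extracts from a (hypothetical) landing zone.
raw = spark.read.option("header", "true").csv("/mnt/landing/policies/2024-01-01/")

# Quality assurance: quarantine anomalous rows (here, a missing business key)
# rather than silently dropping them, so anomalies remain auditable.
anomalies = raw.filter(F.col("policy_id").isNull())
anomalies.write.format("delta").mode("append").save("/mnt/quarantine/policies")

# Curate: keep valid rows, deduplicate on the key, stamp the ingestion time.
curated = (
    raw.filter(F.col("policy_id").isNotNull())
       .dropDuplicates(["policy_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Load: append curated rows to the Delta table that analytics consumers read.
curated.write.format("delta").mode("append").save("/mnt/curated/policies")

The same shape extends to the real-time use cases the posting mentions by swapping the batch read for a streaming source (e.g. Spark Structured Streaming) while keeping the quarantine-and-curate steps.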
Growel Softech Pvt. Ltd.
Information Technology Services
Approximately 200 Employees