
PHI - Lead Data Engineer

12 - 15 years

40 - 50 Lacs

Posted: 1 month ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Deep technology role. Experience: 12 - 15 years. Location: Mumbai and/or Bangalore. Work from office only.

Job Profile Summary: PHI intends to build a cloud-native, microservices-oriented, loosely coupled and open technology platform that is tightly aligned to the health-insurance domain and built to be reused while anticipating change. The PHI platform will be made up of multiple well-integrated and well-orchestrated applications supporting different business functions. The applications could be COTS (Commercial Off-The-Shelf) vendor software, Prudential Group software capabilities, or software built by the in-house PHI engineering team. All applications need to adopt common services, platforms, architectural principles, and design patterns. The right candidate will be accountable for delivering technology artefacts to business stakeholders in the fastest time possible, with the fewest gaps and the best quality, and with clarity on how the technology and business requirements can be delivered and tested at pace. Requirement gaps, change requests, non-integrated journeys, bugs in UAT, and NFR failures would all signal poor-quality deliverables by this candidate.

Job Description:
- Deeply understand the long-term architectural direction, emphasizing reusable components and interactions between the various applications, and architect the data platform accordingly.
- Collaborate with TDMs, BAs, data engineers, data owners, business stakeholders, and solution designers to understand business requirements and architect data solutions considering functionality, interoperability, performance, scalability, reliability, availability, and other applicable criteria.
- Conceptualize and engineer reusable data solution and development frameworks.
- Use data-driven insights to guide the development of programs and apps that meet user needs.
- Follow, and contribute to defining, engineering and coding standards, and ensure adherence to them.
- Stay innovative by building world-class propositions using bleeding-edge technologies, frameworks, and best practices to bring differentiation through technology.
- Design, develop, and maintain robust data pipelines for batch and streaming data processing, ensuring scalability and reliability (see the pipeline sketch after this list).
- Implement ETL processes to gather, transform, and load data from various sources into data warehouses and data lakes.
- Utilize data engineering technologies such as the Parquet format for efficient data storage and the Iceberg open table format for managing large-scale datasets.
- Leverage Google Cloud Platform services including Google Cloud Storage, Google Cloud Pub/Sub, Google Cloud Dataflow, Google Cloud Dataproc, Google Cloud Composer, and Google Cloud Bigtable.
- Ensure data quality and governance by monitoring, measuring, and improving data quality throughout its lifecycle.
- Develop and maintain data integration workflows using APIs and standard toolsets.
- Collaborate with cross-functional teams to solve enterprise problems related to data storage, exposure, and sharing in line with security and compliance requirements.
- Build and optimize data infrastructure to support advanced analytics and insights.
- Implement and maintain components of big data technologies for both exploratory and production data science platforms.
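As an illustration of the batch/streaming pipeline responsibilities above, here is a minimal sketch using the Apache Beam Python SDK (the programming model that Google Cloud Dataflow executes). It is not PHI platform code; the project id, bucket paths, and event fields are hypothetical placeholders.

# Minimal, illustrative batch ETL sketch: read raw JSON-lines claim events from
# Cloud Storage, keep the columns the lake expects, and land them as Parquet.
# Project id, buckets, and fields are hypothetical, not PHI's actual assets.
import json

import apache_beam as beam
import pyarrow
from apache_beam.options.pipeline_options import PipelineOptions


def to_row(line: str) -> dict:
    """Parse one JSON event and keep only the columns the lake schema expects."""
    event = json.loads(line)
    return {
        "policy_id": str(event.get("policy_id", "")),
        "event_type": str(event.get("event_type", "")),
        "amount": float(event.get("amount", 0.0)),
    }


def run():
    options = PipelineOptions(
        project="example-phi-project",   # hypothetical project id
        region="asia-south1",
        runner="DirectRunner",           # swap for DataflowRunner to run on GCP
    )
    schema = pyarrow.schema([
        ("policy_id", pyarrow.string()),
        ("event_type", pyarrow.string()),
        ("amount", pyarrow.float64()),
    ])
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRawEvents" >> beam.io.ReadFromText("gs://example-phi-raw/claims/*.jsonl")
            | "Transform" >> beam.Map(to_row)
            | "WriteParquet" >> beam.io.WriteToParquet(
                file_path_prefix="gs://example-phi-lake/claims/part",
                schema=schema,
                file_name_suffix=".parquet",
            )
        )


if __name__ == "__main__":
    run()

The same pipeline shape extends to streaming by reading from a Pub/Sub subscription and windowing the stream before the write; orchestration of such jobs would typically sit in Cloud Composer.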
Who we are looking for:

Technical Skills & work experience:
- MH: Proven hands-on experience of data architecture and engineering for an enterprise-grade data warehouse/lake/lakehouse, with overall experience between 7 and 18 years.
- MH: Demonstrated ability to understand technology and architectural strategy and processes, and to translate them successfully into engineering solutions.
- MH: Deep hands-on experience with BigQuery for data warehouses/lakes/lakehouses and analytics, leveraging Google Cloud's suite of data engineering tools: the Parquet format for efficient data storage, open table formats (Iceberg), robust data pipelines built on cloud-native toolsets, ETL processes, data integration and transformation techniques, Google Cloud Storage, Google Cloud Pub/Sub, Google Cloud Dataflow, Google Cloud Dataproc, Google Cloud Composer, Google Cloud Bigtable, Looker Studio, Power BI, etc.
- MH: Experience with machine learning models and AI solutions on Google Cloud, utilizing tools like Vertex AI for model deployment and management.
- MH: Understanding of data governance, security, and compliance measures to protect sensitive data and ensure adherence to industry standards.
- MH: Proficiency in data quality management, including monitoring, measuring, and improving data quality throughout its lifecycle (a simple quality-gate sketch follows this list). Familiarity with data governance frameworks to establish rules, processes, and roles that ensure data is managed effectively from initial collection to final analysis.
- MH: Proficiency in tools such as GitHub (with Copilot and Actions), build pipelines and the relevant technologies, Java, a diverse set of IDEs, Jira, Confluence, Kubernetes, Docker, Google Cloud Platform, Jenkins (versioning and delivery helper tools), Postman, and WebSocket clients (API development and debugging tools).

Personal Traits:
- First and foremost, be an exceptional engineer.
- The highest standards of collaboration and teamwork are critical to this role.
- Strong communication skills and the ability to engage senior management on strategic plans, lead project steering committees, provide status updates, etc.
- Excellent problem-analysis skills; innovative and creative in developing solutions.
- Ability and willingness to be hands-on; strong attention to detail.
- Ability to work independently and handle multiple concurrent initiatives.
- Excellent organizational, vendor management, negotiation, and prioritization skills.

Education: Bachelor's in Computer Science, Computer Engineering, or equivalent; suitable certifications for key skills.

Language: Fluent written and spoken English.
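As a concrete illustration of the data quality monitoring skill listed above, the following is a minimal sketch of a BigQuery-based quality gate using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders, not PHI's actual assets.

# Illustrative quality gate: check null keys and duplicate events in today's load.
# All names below are hypothetical; in practice a check like this would run as a
# Composer/Airflow task or scheduled job that blocks downstream loads when it fails.
from google.cloud import bigquery

PROJECT = "example-phi-project"                  # hypothetical project id
TABLE = f"{PROJECT}.claims_lake.claim_events"    # hypothetical dataset.table

QUALITY_SQL = f"""
SELECT
  COUNTIF(policy_id IS NULL)           AS null_policy_ids,
  COUNT(*) - COUNT(DISTINCT event_id)  AS duplicate_events,
  COUNT(*)                             AS total_rows
FROM `{TABLE}`
WHERE DATE(ingested_at) = CURRENT_DATE()
"""


def run_quality_check() -> None:
    client = bigquery.Client(project=PROJECT)
    row = next(iter(client.query(QUALITY_SQL).result()))
    if row.null_policy_ids or row.duplicate_events:
        raise ValueError(
            f"Quality gate failed: {row.null_policy_ids} null policy_ids, "
            f"{row.duplicate_events} duplicate events out of {row.total_rows} rows"
        )


if __name__ == "__main__":
    run_quality_check()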

Prudential | Financial Services | Newark
