Principal IT Engineer Applications - Data Engineer

11 - 18 years

25 - 40 Lacs

Posted: 9 hours ago | Platform: Naukri

Work Mode

Hybrid

Job Type

Full Time

Job Description

Role & responsibilities

Job Summary

This individual contributor primarily supports the Government Programs Data Platform (GPDP) by delivering scalable, automated, and high-performance ETL solutions using Informatica Cloud and Databricks. The engineer will serve as a hands-on SME for GPDP initiatives including Medicare and Medicaid and will collaborate across global teams to support data warehousing, regulatory reporting, and ad hoc data extracts. This role requires strong problem-solving skills, technical leadership, and the ability to mentor and guide junior developers across the full system development lifecycle.

Essential Responsibilities

  • Design, develop, and maintain ETL pipelines using Informatica Cloud and Databricks to support Medicare, Medicaid, and regulatory reporting under GPDP.
  • Provide technical leadership in translating complex business and functional requirements into robust, scalable, and automated data solutions.
  • Develop solutions that are automation-driven and resilient to dynamic source changes, using Shell scripting and Python where appropriate.
  • Optimize and troubleshoot Informatica Cloud and Databricks objects, SQL queries, and data pipelines to improve performance and reduce failure points.
  • Support full SDLC activities, including SIT, UAT, and production deployments, ensuring compliance with DevOps standards and enterprise best practices.
  • Guide and mentor junior engineers across global teams, ensuring adherence to ETL coding standards and data engineering best practices.
  • Work closely with DBAs, application teams, and business stakeholders to design, review, and implement enhancements to the GPDP stack.
  • Document data flows, transformation logic, testing scenarios, and deployment steps to support long-term maintainability and audit-readiness.
  • Continuously identify opportunities to improve processes, implement automation, and enhance solution scalability.
  • Collaborate cross-functionally to support secure and compliant data sharing and integration with external vendors or partners as needed.

Job Qualifications

Minimum Qualifications

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • Minimum eight (8) years of experience in software or data engineering roles.
  • Minimum three (3) years of hands-on ETL development using Informatica On-premises and Cloud.
  • Minimum two (2) years in a technical lead or mentorship role supporting complex data solutions.
  • Strong SQL scripting and performance tuning experience in large-scale data environments.

Additional Requirements

  • Must be a "hands-on" problem solver who enjoys developing ETL code and mentoring less experienced developers.
  • Informatica PowerCenter, Informatica Cloud, Databricks, and Oracle Exadata.
  • Experience with Linux, Python, and shell scripting.
  • Expertise in data warehouse solutions.
  • Experience with the Software Development Life Cycle and solution design documents.

Licenses and Certifications

Preferred Qualifications

  • Experience supporting Medicare, Medicaid, or other regulatory healthcare data platforms.
  • Exposure to enterprise data governance, metadata management, and data catalog tools.
  • Knowledge of secure data sharing, de-identification practices, and vendor data exchanges.
  • Experience with Jira and Agile methodologies.
  • Certifications in Informatica Cloud, Oracle, Databricks, or cloud platforms (e.g., Azure, AWS).

Peoplefy Infosolutions

Human Resources Technology