Roles and Responsibilities
The World Bank's Finance IT unit is seeking a dynamic, highly skilled, and motivated hands-on data professional with deep expertise in data engineering to drive the modernization and optimization of our financial data platforms and deliver actionable insights for the Finance business teams.
The successful candidate will play a pivotal role in maintaining and enhancing existing data pipelines and solutions using Informatica PowerCenter, and will contribute to solutions leveraging a modern data stack including Azure Data Lake Storage Gen2 (ADLS Gen2), Dremio, Power BI, Python-based ETL/ELT, and Azure Data Factory (ADF), with a particular focus on Capital Markets and Lending data. This role is primarily technical, requiring strong hands-on experience in data engineering, integration architecture, and data platform operations. The incumbent will be expected to gain and apply practical domain understanding over time to deliver robust, compliant solutions that meet business needs.
Key Responsibilities
Data Engineering and Architecture (Technical Delivery)
Implement and maintain scalable, secure, high-performance data pipelines and architectures leveraging:
Informatica PowerCenter for enterprise ETL/ELT design, development, administration, performance tuning, and migration.
Azure Data Lake Storage Gen2 (ADLS Gen2) for data lake storage and governance.
Dremio for data virtualization and semantic layer enablement.
Develop and support data integrations and processing using SQL; Python for automation and data processing is desirable.
Contribute to implementing Azure-based data solutions:
Orchestrate batch ingestion and processing in Azure Data Factory (ADF).
Store and govern data in ADLS Gen2 following established standards.
Support data virtualization and semantic layer enablement in Dremio.
Assist with Lakehouse and analytics patterns (e.g., Databricks, Power BI) under guidance.
Implement data quality checks, lineage and cataloging using established tools and processes; adhere to observability practices (logging, alerting).
Integrate Azure cloud-native services (Azure Functions, Logic Apps, Event Hubs, Stream Analytics) to enable real-time and event-driven processing patterns.
Adopt best practices for data quality, lineage, observability, and governance aligned to World Bank and industry standards.
Analytics Platform Enablement (Plus/Value-Add)
Enable analytics use cases by preparing curated datasets and robust dimensional/semantic models that support self-service consumption.
Provide technical support for Power BI data models and reports; build simple dashboards as needed.
Conduct data validation and exploratory analysis to confirm accuracy and completeness of datasets.
Technical Mentorship
Review deliverables to ensure alignment with WBG standards and guidelines.
Coordinate day-to-day activities and communication between geographically distributed teams (e.g., Chennai and HQ), including vendor partners, ensuring timely and quality delivery.
Serve as a bridge between data engineering and analytics teams, ensuring the data models meet analytical and operational needs.
Domain and Data Knowledge Development
Proactively deepen practical understanding of capital markets data, processes, and products (trade lifecycle, securities, derivatives, market data, regulatory reporting). Continuously expand domain expertise to improve solution relevance and impact.
Partner with business stakeholders to clarify requirements and translate them into scalable technical designs that support regulatory and management reporting needs; document requirements as technical specifications.
Participate in solution walkthroughs and UAT; triage and resolve system issues within defined processes and SLAs.
Data Governance, Security, and Compliance
Apply established controls for privacy, security, data quality, lineage, and access management.
Collaborate with data stewards and governance bodies to maintain high standards of data integrity and compliance.
Agile and DevOps Practices
Actively participate in agile ceremonies and utilize Azure DevOps (ADO) for project tracking and collaboration.
Selection Criteria
Master's degree in Engineering, Finance, or a related field with 2+ years of relevant experience; or Bachelor's degree with 5+ years of experience.
Minimum 5 years of hands-on data engineering experience, including at least 2 years in a technical leadership role.
Proven, hands-on expertise with Informatica PowerCenter:
End-to-end ETL/ELT development (mappings, workflows, sessions), repository management, versioning, job orchestration.
Performance tuning, pushdown optimization, partitioning, and high-volume data processing.
Administration and operations (upgrades, patches, security, scheduling, monitoring).
Solid SQL development skills; Python for data processing/automation is a plus.