Posted: 6 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

🧭 Job Summary:

We are seeking a results-driven Data Project Manager (PM) to lead data initiatives leveraging Databricks and Confluent Kafka in a regulated banking environment. The ideal candidate will have a strong background in data platforms, project governance, and financial services, and will be responsible for ensuring successful end-to-end delivery of complex data transformation initiatives aligned with business and regulatory requirements.


Key Responsibilities:

🔹 Project Planning & Execution

- Lead planning, execution, and delivery of enterprise data projects using Databricks and Confluent.

- Develop detailed project plans, delivery roadmaps, and work breakdown structures.

- Ensure resource allocation, budgeting, and adherence to timelines and quality standards.

🔹 Stakeholder & Team Management

- Collaborate with data engineers, architects, business analysts, and platform teams to align on project goals.

- Act as the primary liaison between business units, technology teams, and vendors.

- Facilitate regular updates, steering committee meetings, and issue/risk escalations.

🔹 Technical Oversight

- Oversee solution delivery on Databricks (for data processing, ML pipelines, analytics).

- Manage real-time data streaming pipelines via Confluent Kafka (an illustrative sketch follows this list).

- Ensure alignment with data governance, security, and regulatory frameworks (e.g., GDPR, CBUAE, BCBS 239).
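
To ground the Databricks and Confluent Kafka pairing named above, here is a minimal, hedged sketch of the kind of Kafka-to-Delta streaming ingestion job this role would oversee; the broker address, topic name, and storage paths are placeholders, not details from this posting.

```python
# Minimal sketch only: PySpark Structured Streaming on Databricks reading a
# Confluent Kafka topic into a Delta table. The broker, topic, and paths below
# are placeholders, not values taken from this job posting.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")  # placeholder broker
    .option("subscribe", "transactions")                           # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to string before downstream parsing.
events = raw.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("value"),
    col("timestamp"),
)

(
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/transactions")  # placeholder path
    .outputMode("append")
    .start("/mnt/delta/transactions")                               # placeholder path
)
```

The checkpoint location is what gives the stream fault tolerance; in a regulated banking environment both the checkpoint and the Delta output would sit on governed storage subject to the controls described under Risk & Compliance below.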

🔹 Risk & Compliance

- Ensure all regulatory reporting data flows are compliant with local and international financial standards.

- Manage controls and audit requirements in collaboration with Compliance and Risk teams.

💼 Required Skills & Experience:


✅ Must-Have:

- 7+ years of experience in Project Management within the banking or financial services sector.

- Proven experience leading data platform projects (especially Databricks and Confluent Kafka).

- Strong understanding of data architecture, data pipelines, and streaming technologies.

- Experience managing cross-functional teams (onshore/offshore).

- Strong command of Agile/Scrum and Waterfall methodologies.


✅ Technical Exposure:

- Databricks (Delta Lake, MLflow, Spark)

- Confluent Kafka (Kafka Connect, ksqlDB, Schema Registry)

- Azure or AWS Cloud Platforms (preferably Azure)

- Integration tools (Informatica, Data Factory), CI/CD pipelines

- Oracle ERP Implementation experience


✅ Preferred:

- PMP / PRINCE2 / Scrum Master certification

- Familiarity with regulatory frameworks: BCBS 239, GDPR, CBUAE regulations

- Strong understanding of data governance principles (e.g., DAMA-DMBOK)


🎓 Education:

Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.


📈 KPIs:

- On-time, on-budget delivery of data initiatives

- Uptime and SLA adherence for data pipelines

- User satisfaction and stakeholder feedback

- Compliance with regulatory milestones
