Required Qualifications:
- 10-12 years of overall experience in data architecture, design, and development of data warehouse and analytics systems (cloud and on-premises) at enterprise scale.
- Must have implemented end-to-end platform modernization projects.
- Strong experience with Microsoft Fabric, Azure Synapse, Databricks or similar modern data platforms.
- Proven expertise in data modeling (ER/Studio, ERWin, dbt, Power BI Semantic Models), data integration, and data governance frameworks.
- Good working knowledge of Azure DevOps, GitHub Actions, and CI/CD pipelines.
- Strong collaboration, communication, and leadership skills in cross-functional environments.
- Must be hands-on and deeply technical, with exposure to the latest features in data platform development; able both to write code and to guide a group of junior/mid-level database engineers.
- Must have at least 5 years of experience managing a technical team and leading it to deliver large-scale solutions.
- Must have excellent English communication skills to interface directly with US clients, with proficiency across all communication modes: listening, reading, writing, and speaking.
- Must have strong experience in performance tuning and optimization initiatives using database profiler tools.
- Must take ownership of activities beyond project deliverables, such as SoW writing, performance management, internal initiatives, offshore team management, and recruitment.
- Experience across a variety of business verticals, preferably insurance and banking.
- Ability to act independently to resolve issues; excellent English communication and interpersonal skills; strong critical and analytical thinking.
- Ability to manage a team through the entire development life cycle and to manage time effectively.
Responsibilities:
- Define and implement end-to-end data fabric architecture using Microsoft Fabric components (Data Factory, Synapse, Power BI, OneLake, Lakehouse).
- Design data ingestion, transformation, and orchestration frameworks supporting both batch and real-time data integration.
- Develop metadata-driven ingestion, audit, balance, and control frameworks to ensure data consistency and traceability.
- Architect semantic data models and analytical layers for enterprise-wide KPIs, reporting, and AI enablement.
- Create data modeling standards and blueprints for dimensional, logical, and physical models (Star/Snowflake schemas).
- Lead design and implementation of data pipelines and integration patterns using Python, SQL, Spark, and Fabric Data Factory.
- Oversee data ingestion from structured, semi-structured, and unstructured sources into the Fabric Lakehouse or Warehouse.
- Ensure data quality, lineage, and metadata management across all data domains.
- Integrate the Fabric platform with Azure and other cloud ecosystems for unified data access.
- Define and implement data governance policies, metadata management, and cataloging using Microsoft Purview or Fabric-native capabilities.
- Ensure compliance with enterprise and regulatory standards (GDPR, HIPAA, SOC2, etc.).
- Implement row-level and column-level security, encryption, and access control frameworks.
- Establish data lifecycle management and retention strategies.
- Collaborate with business and analytics teams to define KPIs, semantic models, and reporting frameworks.
- Enable self-service analytics through well-structured Power BI datasets and data marts.
- Optimize query performance, data partitioning, and model efficiency for large-scale analytics.
- Lead and mentor technical teams on Fabric best practices, design patterns, and reusable components.