Description
Job Title : Lead Data Consultant / Senior Data Engineer
Experience Level : 10+ Years (with deep Azure experience)
Employment Type : Full-time / Contract
Role Overview
As a Lead Data Consultant / Senior Data Engineer, you will design, develop, and lead the delivery of enterprise-scale, cloud-native data platforms for our clients, with a particular focus on Azure, Databricks, and modern Lakehouse architectures. Working under the direction of the Principal Data Consultant, you will help shape go-to-market technical assets and production-ready solutions across data engineering, system integration, governance, and AI/ML enablement.
This is a hands-on consulting role requiring both strong implementation expertise and the ability to influence design, compliance, and delivery decisions across diverse enterprise environments, particularly in Financial Services and Insurance (FSI).
Key Responsibilities
- Architect and build Lakehouse solutions with bronze/silver/gold layers using Delta Lake and Databricks
- Implement scalable ETL/ELT pipelines using Azure Data Factory, Airflow, Databricks, and PySpark
- Define enterprise-level data architectures, including lineage, governance, and integration patterns
- Enable agent orchestration and validation pipelines to support dynamic, AI-enabled processing flows
- Lead development of data ingestion, transformation, and orchestration frameworks across cloud-native environments
- Guide implementation of data governance and compliance standards specific to FSI, including lineage tracking and access controls
- Collaborate with cross-functional teams (platform, ML, API, compliance) to design integrated data + AI systems
- Develop CI/CD pipelines and IaC (Terraform) modules to automate provisioning and deployment of data infrastructure
- Mentor other engineers and ensure reusable, modular, well-documented assets are delivered
Technical Skills & Experience
Data Platforms & Engineering :
- 10+ years of experience in data engineering, with deep Azure cloud experience
- Proven experience designing Lakehouse architectures and implementing bronze/silver/gold/curated layers
- Hands-on expertise with Databricks, Delta Lake, PySpark, and SQL
- Experience integrating Kafka, RDBMS, and unstructured data into cloud pipelines
- Knowledge of DLT-Meta frameworks, metadata-driven ELT development, and data mesh concepts
- Familiarity with GenAI and its integration with structured and unstructured data
System Architecture
- Ability to design systems spanning data layers, API integrations, and AI orchestration components
- Knowledge of agentic systems, validation workflows, and event-driven orchestration
Governance & Compliance
- Strong understanding of data lineage, data quality, and access control models
- Experience applying FSI regulatory standards, including auditing and privacy best practices
Cloud & DevOps
- Proficiency in Azure services, especially ADF, Synapse, Blob Storage, Key Vault, and Managed Identity
- CI/CD using GitHub, GitLab, or Azure DevOps, with automated deployment patterns
- Infrastructure automation using Terraform with Git-based workflows
- Development environments : VSCode, Python scripting, Git version control
Non-Technical & Consulting Skills
- Agile delivery experience : refinement, estimation, MVP definition, backlog grooming
- Ability to translate technical solutions into high-level architecture artifacts and documentation (Markdown, Confluence, Lucidchart)
- Comfortable reviewing existing codebases and recommending paths for reuse or modernization
- Proven ability to mentor and unblock delivery teams, with experience guiding junior and mid-level engineers
- Strong communication skills for client interaction, executive presentations, and cross-team coordination
(ref:hirist.tech)