About the Role:
We are seeking an experienced Data Engineering Architect to join our team. In this role, you will be responsible for designing comprehensive data architectures and leading the design and development of scalable, modern data platforms that power business-critical data products and insights. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, operational analytics, digital transformation initiatives, and AI/ML enablement. The ideal candidate will bring deep technical expertise across cloud data engineering, data lakehouse architecture, and modular, reusable data components, and will have strong hands-on experience with Azure, Databricks, Delta Lake, and modern data engineering tools and frameworks.
Responsibilities:
- Design and implement robust, scalable, and cost-effective data architectures at the enterprise level using the Azure Databricks Lakehouse platform.
- Architect modern Delta Lakehouse platforms to support structured and semi-structured data ingestion, processing, and analytics.
- Implement robust data integration and orchestration pipelines using platforms such as Kafka, Azure Data Factory (ADF), Airflow, and Event Hubs.
- Design and build Customer 360 platforms (e.g., CDP, CIH).
- Create data architectures that support business-specific use cases, including customer journey analytics, customer lifetime value (CLTV), churn scoring, and market segmentation, and that enable reverse ETL.
- Collaborate with domain owners (CRM, billing, usage) to define data contracts and model domain-specific datasets.
- Lead the definition of data modeling standards (dimensional, normalized, data vault) and best practices.
- Establish and enforce data governance, security, and privacy controls aligned with regulatory compliance requirements such as GDPR and PII protection.
- Collaborate with data engineers, product teams, business stakeholders, and clients to translate business needs into scalable data solutions.
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on business-specific requirements and regulatory compliance needs.
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities.
- Contribute to the development of best practices, reference architectures, and reusable solution components to accelerate proposal development.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related technical field
- 10+ years of experience in data engineering and BI architecture, with at least 5 years in an architect or solution architect role
- Strong experience with:
  • Azure (Data Lake, Data Factory, Synapse, Key Vault, Azure AI services)
  • Databricks, Delta Lake, Delta Live Tables
  • PySpark, Scala, SQL, Python, Kafka
- Experience with data cataloging and governance tools such as Unity Catalog and Alation
- Familiarity with modern data frameworks and tools such as Apache Kafka, Airflow, Flink, NiFi, dbt, and Iceberg
- Deep understanding of data lakehouse concepts, data modeling, pipeline orchestration, and performance optimization.
- Proven ability to design reusable, domain-driven data products in a large-scale enterprise environment
- Experience in data governance, metadata management, data cataloging, data quality, lineage and compliance.
- Exposure to data mesh, domain ownership models, and data product thinking.
- Understanding of DevOps, CI/CD in data environments.
- Knowledge of system monitoring and observability tools such as Prometheus and Grafana
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines for business use cases
- Excellent communication and presentation skills with ability to explain complex technical concepts to business stakeholders
- TM Forum or other telecommunications industry certifications are good to have
- Relevant data platform certifications, such as Databricks or Azure Data Engineer, are a plus
- Willingness to travel as required