Job Description:
The Senior Data Engineer will:
- Design, develop, and own core data engineering services, including data orchestration and CI/CD pipelines.
- Build ETL/ELT processes, data lake architectures, data warehouses, and data marts aligned with the company's reporting strategy and best practices (a minimal PySpark sketch of such a pipeline step follows this list).
- Develop relational and dimensional data models to support complex reporting and analytics requirements.
- Deliver reporting solutions that adhere to architectural principles, standards, and best practices.
- Assess and manage the impact of changes on live systems and processes.
- Apply AI expertise to embed machine learning and automation in workflows, driving innovation and efficiency.
- Work proactively in an agile environment to deliver high-quality solutions on time.
- Mentor and support junior data engineers and developers, fostering technical growth within the team.
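To make the ETL/ELT responsibility concrete, the sketch below shows a minimal PySpark step: landing a raw extract as a typed, timestamped Delta table. The storage paths, column names, and typing rules are illustrative assumptions, not the actual estate.

```python
# Minimal ELT sketch: raw CSV extract -> typed, timestamped Delta table.
# Requires pyspark with the delta-spark package configured; paths and
# column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-elt-sketch").getOrCreate()

# Read a raw extract from the lake (hypothetical container/path).
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/policies/"))

# Light conformance: cast the measure and stamp the load time.
curated = (raw
           .withColumn("gross_premium",
                       F.col("gross_premium").cast("decimal(18,2)"))
           .withColumn("loaded_at", F.current_timestamp()))

# Idempotent write to the curated Delta zone.
(curated.write
        .format("delta")
        .mode("overwrite")
        .save("abfss://curated@examplelake.dfs.core.windows.net/policies/"))
```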
Principal Accountabilities:
- Act as a senior contributor in the design and build of Brit's enterprise strategic reporting solutions, ensuring alignment across all components of the data lake and BI platform architecture, coding practices, and standards.
- Analyse business requirements and deliver robust reporting solutions that meet functional and strategic needs.
- Enforce SDLC best practices, coding standards, and design principles across the data lifecycle and BI/MI solutions (an automated data-quality gate of the kind this implies is sketched after this list).
- Assess business requirements and develop effective data solutions.
- Support the development of Brit-specific data management policies, processes, and standards.
- Stay current with technology trends, platforms, and data engineering techniques, providing input to Brit's strategic roadmap.
- Proactively identify opportunities for automation and continuous improvement, leveraging AI and machine learning to drive innovation and efficiency.
- Develop Proof of Concepts to demonstrate the viability of technical solutions.
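One way automation shows up in practice is a data-quality gate that runs as a CI/CD pipeline step and blocks a release on bad data; the sketch below assumes a hypothetical curated Delta table and invented rules.

```python
# Sketch of an automated data-quality gate for a release pipeline: fail the
# stage if curated policy rows break basic rules. The path and rules are
# illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-gate-sketch").getOrCreate()

curated = spark.read.format("delta").load(
    "abfss://curated@examplelake.dfs.core.windows.net/policies/")

# Rules: every row needs a policy reference and a non-negative premium.
bad_rows = curated.filter(
    F.col("policy_ref").isNull() | (F.col("gross_premium") < 0)
).count()

# A raised AssertionError exits non-zero, which fails the pipeline stage.
assert bad_rows == 0, f"{bad_rows} rows failed data-quality checks"
print("Data-quality gate passed")
```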
Required Skills & Knowledge
- Extensive hands-on experience with Azure Data Lake Storage (including Blob Storage and Delta Lake), Azure Synapse Analytics, and Databricks.
- Proven expertise in SQL/Azure SQL data warehousing and advanced query optimisation.
- Strong background in data engineering practices, including release management, environment controls, and CI/CD pipeline orchestration.
- Proficiency in Python and Spark for data processing and automation.
- Demonstrated ability in Microsoft BI development, covering database design, reporting solutions, and full development lifecycle.
- Advanced skills in data modelling, problem-solving, and information analysis, with exceptional attention to detail and adaptability.
- Ability to translate business requirements into effective data and analytics solutions, working directly with stakeholders.
- Proven capability to manage tight deadlines and deliver high-quality solutions under pressure.
- Strong interest in emerging data engineering technologies, best practices, and industry trends.
- Experience with container apps (e.g., Azure Container Apps, Docker, or Kubernetes) for deploying and managing scalable data and AI solutions.
- Proven expertise in applying AI and machine learning techniques to data engineering challenges, with a track record of delivering automation and innovation through AI-driven solutions (one common pattern is sketched after this list).
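As one example of AI applied to data engineering, anomaly detection on pipeline telemetry, such as daily row counts, can catch a broken feed before it reaches reports. A minimal scikit-learn sketch with made-up numbers:

```python
# Sketch: flag an anomalous daily row count for a feed using scikit-learn's
# IsolationForest. The history and today's value are stand-in numbers.
import numpy as np
from sklearn.ensemble import IsolationForest

# Ten days of row counts for one feed (illustrative data).
history = np.array([[10_120], [10_340], [9_980], [10_210], [10_400],
                    [10_050], [10_300], [10_150], [10_280], [10_190]])

model = IsolationForest(contamination=0.1, random_state=42).fit(history)

todays_count = np.array([[4_700]])        # a suspiciously small load
if model.predict(todays_count)[0] == -1:  # -1 marks an outlier
    print("Row count anomaly: hold the load and alert the team")
```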
Highly Desirable Skills and Knowledge
- Knowledge of London market insurance
- Extensive experience in designing and building data warehouses and data lakes
- Experience working with financial systems and business processes
- Experience with Infrastructure-as-Code (e.g., Terraform; the idea is sketched in Python after this list)
- Agile development techniques (SCRUM, Kanban, etc.)
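The posting names Terraform, but since the examples here stay in Python, the same declarative idea can be sketched with Pulumi's Python SDK: a resource group and an ADLS Gen2-capable storage account defined as code. Resource names are placeholders, and the hierarchical-namespace flag is an assumption to verify against the provider docs.

```python
# IaC sketch using Pulumi's Python SDK (Terraform HCL would be equivalent).
# Resource names are placeholders.
import pulumi
from pulumi_azure_native import resources, storage

group = resources.ResourceGroup("data-platform-rg")

lake = storage.StorageAccount(
    "examplelake",
    resource_group_name=group.name,
    sku=storage.SkuArgs(name=storage.SkuName.STANDARD_LRS),
    kind=storage.Kind.STORAGE_V2,
    is_hns_enabled=True,  # hierarchical namespace -> ADLS Gen2 (assumed flag)
)

pulumi.export("lake_name", lake.name)
```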
Required Toolsets
- Azure Data Lake Storage, Azure Synapse Analytics, Azure Databricks, PySpark, Azure Container Apps, Docker, Python
- Kimball data modelling (a star-schema sketch follows this list)
- Experience with AI and machine learning toolsets and frameworks relevant to data engineering, such as Azure Machine Learning, Databricks ML, or similar platforms
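To anchor the Kimball reference, the sketch below builds a tiny star schema in PySpark: a policy dimension with a surrogate key, and a premium fact that carries only the key and the measure. All names and rows are invented.

```python
# Kimball-style star-schema sketch: a policy dimension with a surrogate key,
# and a fact table keyed on it. All names and rows are illustrative.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("kimball-sketch").getOrCreate()

source = spark.createDataFrame(
    [("POL-1", "Marine", 1200.00), ("POL-2", "Property", 980.00)],
    ["policy_ref", "line_of_business", "gross_premium"],
)

# Dimension: distinct descriptive attributes plus a surrogate key.
dim_policy = (source.select("policy_ref", "line_of_business").distinct()
              .withColumn("policy_sk",
                          F.row_number().over(Window.orderBy("policy_ref"))))

# Fact: the measure joined back to the dimension's surrogate key.
fact_premium = (source
                .join(dim_policy, ["policy_ref", "line_of_business"])
                .select("policy_sk", "gross_premium"))

fact_premium.show()
```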