.NET Developer

We are looking for a passionate and results-driven .NET Developer to join our development team. The ideal candidate will have experience building high-quality, scalable applications using Microsoft technologies and will be comfortable working across both frontend and backend components. You will be involved in the full software development lifecycle, from understanding business requirements through solution design, development, and deployment.

Responsibilities
- Design, develop, and maintain secure, scalable, and high-performance applications using .NET Core / .NET 6+, C#, and related technologies.
- Develop RESTful APIs and backend services following clean architecture and design principles.
- Work with cloud platforms, preferably Microsoft Azure, to build and integrate services (e.g., Azure App Services, Blob Storage, Azure Functions, Key Vault).
- Write clean, reusable, and testable code, with a strong focus on quality and performance.
- Collaborate with frontend developers, QA engineers, and DevOps teams to deliver end-to-end solutions.
- Participate in code reviews, architecture discussions, and sprint ceremonies.
- Maintain technical documentation and contribute to knowledge-sharing within the team.
- Investigate and resolve issues in production and test environments.

Requirements
- 4-7 years of experience in .NET development.
- Strong problem-solving and communication skills.
- Exposure to multi-app environments and integration-heavy systems.
- Comfortable working in a fast-paced, agile environment.
- Bonus: experience working on B2B customer portals or Sitecore migration.
Data Analyst

We are seeking a Data Analyst with strong Power BI expertise and experience working across both modern and legacy analytics platforms. You will create analytical solutions, enable business self-service capabilities, and support the transition from legacy systems (Excel, SSRS, OLAP Cubes) to the Microsoft Fabric ecosystem. This role combines reporting, ad-hoc analysis, data investigation, and user enablement, while collaborating closely with Data Engineers and Platform Engineers to deliver reliable, high-quality insights.

Key Responsibilities

1. Reporting & Analysis
- Develop and enhance Power BI dashboards, including new filters, performance optimisations, and UX improvements.
- Perform ad-hoc analysis to answer business questions across multiple data sources.
- Maintain and optimise existing reports in Excel, SSRS, and Power BI.
- Conduct exploratory data analysis to identify trends, patterns, and opportunities.
- Support end-of-month finance processes and EDW/IDW operations.

2. Data Investigation & Quality
- Investigate and resolve data quality issues using systematic “is/is not” root cause analysis.
- Collaborate with engineering teams to identify, document, and resolve discrepancies.
- Validate data accuracy across both Fabric and legacy systems.
- Provide business users with clarity on data definitions, limitations, and correct usage.

3. Self-Service Enablement & Training
- Empower business teams to access data independently via self-service tools.
- Provide training and documentation for Power BI and related Fabric capabilities.
- Guide tool selection (Excel, Power BI, or other Fabric tools) based on requirements.
- Assist in gathering business requirements for vendor-delivered solutions and validate outputs.

4. Business Insights & Strategic Support
- Analyse performance metrics to identify improvement opportunities.
- Support strategic initiatives with predictive modelling and forecasting (where applicable).
- Present actionable insights and recommendations to stakeholders.
- Provide data-driven input for business cases and investment decisions.

Essential Skills & Experience
- 2+ years in business intelligence or data analysis roles.
- Advanced Power BI skills (dashboard creation, DAX, data modelling, troubleshooting).
- Strong SQL skills for querying and analysis (Fabric & SQL Server).
- Experience with legacy systems: Excel/VBA, SSRS, OLAP Cubes.
- Systematic problem-solving approach (“is/is not” methodology).
- Strong communication skills to explain technical issues and guide tool adoption.
- Proven experience supporting migrations from legacy to modern analytics platforms.

Preferred Skills
- Microsoft certifications (PL-300, DP-600).
- Experience with Python, R, or similar analytical languages.
- Background in finance, operations, or commercial analysis.
- Familiarity with Microsoft Fabric notebooks or Jupyter notebooks.
- Statistical analysis or data science knowledge.
- Experience training business users on analytics tools.

Ways of Working
- Agile/Scrum environment with flexibility for BAU priorities.
- Close collaboration within a three-role squad:
  - Data Platform Engineer – platform governance & vendor oversight
  - Data Engineer – data pipelines & gold tables
  - Data Analyst – business insights & self-service enablement
- Hybrid technology environment (modern cloud + on-premises systems).
- Strong vendor coordination responsibilities for data product delivery.

Success Metrics
- Improved platform performance and cost optimisation.
- High data quality and reliability across reports.
- Increased adoption of self-service analytics.
- On-time, standards-compliant vendor project delivery.
- Measurable progress in migrating users from legacy to modern tools.
Data Engineer

We are seeking a skilled Data Engineer to design, build, and maintain high-performance data pipelines within the Microsoft Fabric ecosystem. The role involves transforming raw data into analytics-ready assets, optimising data performance across both modern and legacy platforms, and collaborating closely with Data Analysts to deliver reliable, business-ready gold tables. You will also coordinate with external vendors during build projects to ensure adherence to standards.

Key Responsibilities

Pipeline Development & Integration
- Design and develop end-to-end data pipelines using Microsoft Fabric (Data Factory, Synapse, Notebooks).
- Build robust ETL/ELT processes to ingest data from both modern and legacy sources.
- Create and optimise gold tables and semantic models in collaboration with Data Analysts.
- Implement real-time and batch processing with performance optimisation.
- Build automated data validation and quality checks across Fabric and legacy environments.
- Manage integrations with SQL Server (SSIS packages, cube processing).

Data Transformation & Performance Optimisation
- Transform raw datasets into analytics-ready gold tables following dimensional modelling principles.
- Implement complex business logic and calculations within Fabric pipelines.
- Create reusable data assets and standardised metrics with Data Analysts.
- Optimise query performance across Fabric compute engines and SQL Server.
- Implement incremental loading strategies for large datasets.
- Maintain and improve performance across both Fabric and legacy environments.

Business Collaboration & Vendor Support
- Partner with Data Analysts and stakeholders to understand requirements and deliver gold tables.
- Provide technical guidance to vendors during data product development.
- Ensure vendor-built pipelines meet performance and integration standards.
- Collaborate on data model design for both ongoing reporting and new analytics use cases.
- Support legacy reporting systems including Excel, SSRS, and Power BI.
- Resolve data quality issues across internal and vendor-built solutions.

Quality Assurance & Monitoring
- Write unit and integration tests for data pipelines.
- Implement monitoring and alerting for data quality.
- Troubleshoot pipeline failures and data inconsistencies.
- Maintain documentation and operational runbooks.
- Support deployment and change management processes.

Required Skills & Experience

Essential
- 2+ years of data engineering experience with Microsoft Fabric and SQL Server environments.
- Strong SQL expertise for complex transformations in Fabric and SQL Server.
- Proficiency in Python or PySpark for data processing.
- Integration experience with SSIS, SSRS, and cube processing.
- Proven performance optimisation skills across Fabric and SQL Server.
- Experience coordinating with vendors on technical build projects.
- Strong collaboration skills with Data Analysts for gold table creation.

Preferred
- Microsoft Fabric or Azure certifications (DP-600, DP-203).
- Experience with Git and CI/CD for data pipelines.
- Familiarity with streaming technologies and real-time processing.
- Background in BI or analytics engineering.
- Experience with data quality tools and monitoring frameworks.
Data Platform Engineer

The Data Platform Engineer will be responsible for designing, implementing, and optimising the technical foundation of our Microsoft Fabric environment. This role combines deep platform architecture expertise with governance, security, cost management, and vendor oversight to ensure high performance, scalability, and compliance across enterprise data systems. You will provide architectural leadership for both internal teams and external vendors, driving best practices for building robust, future-ready data products on the Microsoft Fabric platform.

Key Responsibilities

Platform Architecture & Infrastructure
- Design and implement enterprise-grade Microsoft Fabric architecture with a focus on performance, scalability, and cost efficiency.
- Build and maintain core data infrastructure, including data lakes, semantic layers, and integration frameworks.
- Establish and manage security models, access controls, and compliance frameworks across Fabric and legacy environments.
- Oversee performance monitoring, capacity planning, and resource optimisation.
- Implement cost governance practices, including usage monitoring, charge-back models, and optimisation strategies.
- Design and maintain integrations with legacy systems (SQL Server, SSIS, SSRS, Excel/VBA, OLAP cubes).

Governance & Standards
- Define and enforce enterprise data governance policies, quality standards, and architectural principles.
- Establish development standards, deployment pipelines, and Infrastructure-as-Code (IaC) practices.
- Maintain comprehensive platform documentation, architectural guidelines, and best practice standards.
- Implement data lineage, cataloguing, and metadata management solutions.
- Design and execute disaster recovery and business continuity strategies.

Technical Leadership & Vendor Management
- Mentor and guide Data Engineers and Data Analysts on platform usage and best practices.
- Lead architectural governance for external vendors developing data products on the Fabric platform.
- Review and approve vendor technical designs and implementation strategies.
- Ensure adherence to architectural standards and integration patterns.
- Troubleshoot complex technical issues in both internal and vendor-delivered solutions.
- Coordinate rollout of new platform capabilities and features.

Strategic Platform Development
- Evaluate and integrate new Microsoft Fabric features to enhance platform capabilities.
- Develop integration patterns for connecting external systems and data sources.
- Plan and execute platform upgrades, migrations, and scalability initiatives.
- Collaborate with enterprise architecture teams to align with broader IT strategy.

Required Skills & Experience

Essential
- 4+ years in data platform engineering with modern cloud platforms.
- 2+ years of hands-on experience in Microsoft Fabric administration and architecture.
- Proven expertise in cost management and performance optimisation for large-scale data platforms.
- Strong security background, including Azure AD, RBAC, and compliance frameworks (GDPR, HIPAA, etc.).
- Integration experience with legacy systems (SQL Server, SSIS, SSRS, Excel/VBA).
- Advanced performance tuning skills for Fabric compute engines and SQL Server.
- Demonstrated scalability planning for expanding data volumes and user bases.

Highly Valued
- Microsoft Azure certifications (AZ-305, DP-203, DP-700).
- Experience with enterprise data governance tools and frameworks.
- Background in platform engineering or DevOps practices.
- Familiarity with data mesh or federated analytics architectures.
- Experience with monitoring and observability tools.