Reporting to the Head of IT, you will be responsible for designing and developing new in-house applications that support and optimise a wide range of business processes. This role combines strong software development skills with practical experience in AI/ML integration. You will build solutions that include AI-powered features based on large language models (LLMs) and retrieval-augmented generation (RAG), while also working on core application development, data pipelines, API integrations, and system architecture. Your work will help deliver secure, reliable, and scalable software that drives efficiency and innovation across the organisation.
Key deliverables (Essential duties and responsibilities)
IT Projects
- Collaborate with the Head of IT and System Analysts to translate business requirements into functional technical specifications.
- Contribute to planning, estimating, and prioritising IT projects to ensure timely, high-quality delivery.
- Where relevant, integrate AI/ML capabilities (e.g., LLMs, RAG) into applications to meet business needs.
- Ensure all deliverables comply with security, data protection, and organisational IT policies.
Software and System Development
- Design, develop, and maintain in-house applications that support diverse business processes.
- Build robust APIs and data pipelines for secure, efficient data exchange between systems.
- Integrate with approved AI service providers (e.g., Azure OpenAI, AWS Bedrock, local model services) as part of application development.
- Implement database structures and integrations for both AI-enabled and conventional applications.
- Develop and maintain automation workflows using tools such as n8n to streamline processes and improve efficiency.
- Conduct code reviews, optimise performance, and ensure the maintainability of solutions.
- Containerise applications using Docker and implement CI/CD pipelines (e.g., Azure Pipelines) to automate build, test, and deployment processes.
- Test, debug, and deploy applications into production environments, including identifying and resolving issues raised during User Acceptance Testing (UAT).
System Integration & Support
- Collaborate with System Analysts to ensure smooth integration of new applications with existing systems.
- Provide support for troubleshooting, bug fixes, and performance tuning.
- Prepare technical documentation, deployment guides, and operational handover materials.
Required Skills
- Strong proficiency in full-stack development using languages such as JavaScript/TypeScript (React), Python, Java, or C#.
- Experience with database design and management (e.g., SQL Server, MySQL, PostgreSQL, Supabase).
- Hands-on experience building and consuming RESTful or GraphQL APIs.
- Experience with Docker and CI/CD pipelines, and familiarity with deploying to cloud (Azure/AWS) or on-premises environments.
- Experience self-hosting scalable, custom applications on Microsoft Azure (VMs, Load Balancers, automated Disaster Recovery workflows, Web Application Firewalls).
- Exposure to integrating basic AI/ML features into applications (e.g., calling LLM APIs, implementing simple RAG workflows, deploying pre-trained models).
- Knowledge of data pipelines.
- Familiarity with version control systems (e.g., Git).
- Understanding of software development best practices, including secure coding, scalability, and performance optimisation.
- Ability to test, debug, and resolve UAT issues independently.
- Strong problem-solving skills and the ability to translate business requirements into technical solutions.
Desirable Skills
- Experience with vector databases (e.g., pgvector, Qdrant/Supabase Vectorstore, Azure AI Search) and embeddings.
- Familiarity with AI orchestration/tooling (e.g., LangChain, LlamaIndex) and workflow automation tools (e.g., n8n).
- Experience with cloud AI services (e.g., Azure AI, Azure OpenAI, AWS Bedrock) or local LLMs (e.g., Ollama).
- Experience implementing role-based access control (RBAC), certificate management, and MFA/2FA for systems handling sensitive data and AI models.
- Experience developing secure, scalable, and reusable modules/workflows for integrating LLMs organisation-wide, including prompt management and context handling.
- Understanding of performance and cost optimisation for AI features (e.g., token usage, caching, batching, streaming, model selection).
- Experience with fine-tuning AI models or integrating more advanced AI capabilities.
- Familiarity with DevOps practices and CI/CD tooling.
- Knowledge of enterprise system integration (ERP, CRM, etc.).
- Experience in performance profiling and application optimisation for scalability.