Designing AI regulation that helps the NHS adopt safely, confidently and at pace

Dr Hatim Abdulhussein, chief executive of Health Innovation Kent Surrey Sussex, explores why, when it comes to AI regulation, the question is not whether to enable it, but how to do so in a way that gives the NHS faster access to safe, effective technologies while maintaining public trust.


Artificial intelligence is already reshaping how care is delivered across the NHS, from clinical decision support to workflow automation and ambient voice tools that release time back to clinicians.

The Medicines and Healthcare products Regulatory Agency's (MHRA) recent call for evidence on the regulation of AI in healthcare will help to shape the commission's recommendations and address the most pressing challenges in AI regulation. Our submission to the call for evidence set out some priority areas for consideration.

Real-world evaluation and post-market surveillance (PMS) 

Real-world deployments allow technologies to prove their efficacy and safety in everyday settings, helping clinicians and patients build trust in their value, while post-market surveillance monitors how performance changes over time so that trust and value are maintained. Building evidence across numerous real-world settings supports clinicians, patients and regulators in understanding how to make the most of new technologies safely, while also supporting adoption at scale.

For example, AI-guided clinical coaching, implemented widely in north west London (UCLPartners/Health Navigator), demonstrated substantial productivity gains, including 34% fewer emergency attendances and 25% fewer bed days in prior trial evidence. This is supporting its expansion from an initial rollout in Waltham Forest and Havering in 2023/24 to the whole of north east London over a three-year period, reaching thousands of patients and increasing system productivity at pace. Similarly, a real-world evaluation of ambient voice technology (Tortus/Health Innovation Kent Surrey Sussex) demonstrated impact across multiple settings and provided rapid learning on implementation and safety, building confidence and supporting safer scaling.

Once adopted, building AI-specific monitoring into existing surveillance requirements can help maintain trust and support ongoing learning from real-world use.  

Considerations such as what constitutes a minor versus a material software change, and how system-level AI might affect patient flow, staff resilience and clinical decisions (as evidenced in Imperial College Health Partners' evaluation of OPTICA), together with clear change management rules for updating AI, will allow safety to remain paramount even as technologies, data or pathways change.

If learnings then flow back to manufacturers, regulators and providers through practical feedback loops, rather than compliance reporting alone, safer long-term use beyond initial pilots can also be achieved.

Shared responsibility, clear guidance and personal pathway support 

Safe AI adoption depends on shared responsibility. Manufacturers, providers, clinicians, system partners and professional bodies each have auditable duties. When these are supported by standardised implementation checklists, model policies and shared learning platforms, the NHS can adopt innovation with greater consistency and less burden. Commissioners should support the research and development of AI, and professional bodies and training colleges should support training in AI implementation and the sharing of learning.

We also need to join up the process from regulation to adoption. Not every AI tool in healthcare is a medical device, and innovators and NHS teams currently often find themselves navigating uncertainty about where tools, such as ambient voice tools, sit in terms of categorisation. Clear, practical guidance with worked examples would reduce duplication and hesitation for both adopters and developers.

While low-risk devices can be self-certified and higher-risk tools assessed by approved bodies and registered with the MHRA, NHS teams often experience this as fragmented. A modernised AI and digital health regulations service should provide more bespoke support. There are many digital and AI communication tools that could be utilised to significantly improve the experience for innovators and the NHS alike. 

Regulation as an enabler 

In our work at the intersection of health, innovation and place, bringing together NHS organisations, industry, academia and Government to accelerate the adoption of technologies that improve patient care, one lesson has been consistent: innovation is a team sport. It requires frameworks and systems that give innovators confidence, clinicians assurance and patients trust. Regulation is central to that.

AI presents new challenges for regulators. Unlike medicines and conventional medical devices, AI systems can be affected by a wider range of factors over time and across different patient groups. The MHRA commission brings together expertise from global AI leaders, clinicians and regulators. Richard Stubbs, chief executive of Health Innovation Yorkshire and Humber, is part of the commission.

If we get this right, regulation will not be seen as a barrier, but as the foundation that allows the NHS to adopt AI safely, responsibly and at the pace patients deserve.  
