As a paediatrician, I rely a lot on developmental milestones. We know when a toddler should be able to run and hop on one leg, and if they can't do it as expected, we can intervene early and help. When I saw that ChatGPT – a large language model (LLM) – had only just turned two years old, it got me thinking about what the equivalent milestones might be for nurturing AI in healthcare. As chief clinical information officer at Microsoft, I see the promise of these technologies every day – but I also see how vital it is that we guide their growth, not just marvel at it.
The Government's new Fit for the Future 10-Year Health Plan lays out three radical shifts: hospital to community; analogue to digital; and sickness to prevention. Significantly, the plan declares the NHS must ‘make the move from bricks to clicks' – a powerful image that captures the ambition of its digital transformation agenda. This isn't about swapping paper for tablets: it's about fundamentally reframing patient access, clinician workflows and system infrastructure to support a healthcare system that's smart, searchable and supported by AI. With momentum behind this shift, the challenge now is turning aspiration into operable, trustworthy tools – especially when it comes to developing, deploying and safely integrating AI into the NHS.
Engaging patients with personalised AI
Too often, we design healthcare communication with professionals in mind, not patients. But generative AI could change that. Imagine seeing a wheezy child in clinic and, instead of handing them a photocopied, generic patient leaflet to take home, generating an engaging story in which their favourite character shows them how to use an inhaler. Why stop there? If the person caring for that child, or for their grandmother, speaks little English, with today's technology I can generate that story in Bengali, featuring a cartoon character famous in Bangladesh. We talk about patient-centred care, but rarely do our digital tools reflect the lived experience of those we're trying to help. AI gives us a shot at radically changing that – at scale and with cultural sensitivity.
Freeing staff to focus on care
Hospitals run on more than clinical care. Safety depends on good audit, regular teaching and clear communication – newsletters, guidelines, handovers, etc. These are the unglamorous but essential tasks that keep services safe. Generative AI could transform them. Think of auto-summarised audit reports, automated meeting notes, or instant slide decks for teaching. And while junior clinicians spend most of their time with patients, senior staff are often pulled into paperwork, governance and planning. If AI can shoulder more of that burden, it could free up senior time for what matters most – clearing backlogs and mentoring the next generation.
Enabling community-based, distributed care
The shift from hospital to home is essential, but it's not just about tech – it's about people. Virtual wards won't work unless we can support patients in their homes. Wouldn't it be game-changing if family members, carers and neighbours could take part safely? What if AI tools could act as intelligent guides for these citizen care forces – generating tailored advice, flagging safety issues early, or even acting as a co-pilot for medication administration and wound care? These tools won't replace healthcare professionals, but they might help us extend their reach – creating neighbourhood-based teams that blend clinical oversight with local support. Done right, AI could help build a stronger, more connected system of care.
Critical questions and cautious optimism
But with great potential comes real responsibility. Can we do this safely? Regulation is still catching up with the pace of AI innovation, and in healthcare the stakes are life and death. We'll need frameworks that evolve as fast as the tools they govern – without slowing deployment of safe, effective solutions. We also need patients in the room from the start. How do we ensure the tools we're building are grounded in real experiences, not abstract data? And how do we track their development – making sure the AI is ‘growing up' into a responsible, equitable partner in care? Still, I'm cautiously optimistic. If any health system has the scale, diversity and public ethos to become the Hogwarts of AI wizardry in healthcare, it's the NHS. We can be the place where these tools are safely trained, tested and matured – not just to support our own patients, but to set a standard for health systems around the world.
Let's lead, not follow
Let's lean in and use our collective scale to lead the world in responsible, effective use of AI in healthcare. If we can't do it, I don't know who can. As we see in the headlines daily, we can't afford to wait. My hope is that we all get involved. Let's not hold out for the perfect use case. Like all good learning, this will take trial, error and iteration in a safe environment. But the sooner we start, the better we'll get.