In my last two articles, I wrote about what it means to raise AI responsibly — and the rocky roads we might have to navigate along the way. That got me thinking about the excitement I felt when learning to drive and the anxiety I had when I got my licence.
When Henry Ford launched the Model T in 1908, the world didn't yet have driving tests, standardised speed limits or a shared understanding of what ‘safe driving’ meant. But within a few years, roads, rules and driving schools began to appear: the infrastructure of a new era taking shape around a revolutionary machine.
It's easy to forget that even the car — now so regulated and routine — began as an untested experiment that outpaced the rules. Every new technology seems to start this way: full of promise, slightly chaotic and miles ahead of regulation.
Artificial intelligence could be viewed similarly. Large language models act like engines that can learn, predict and generate, but we haven't had much time to agree on how best to drive them, or what skills we should be teaching in a structured way.
If we want AI to transform healthcare safely, we'll need more than just better machines. We'll need a new generation of drivers — people who respect the rules, understand the risks and know how to steer responsibly.
Licensing the unknown
It wasn't until 1903 that the UK introduced driving licences. Even then, they were more like declarations than assessments: you simply signed a form stating you could drive. There was no test, no examiner, no agreed standard of competence. It took more than 30 years before the first formal driving tests arrived in 1935.
With AI, we're at that early stage again. The technology is already out on some of healthcare's roads, from radiology to ambient scribing, but in many ways we're still agreeing the equivalent of speed limits and right of way.
Current regulation focuses on approving individual tools and data practices. That approval is a necessary foundation, but the harder task comes next: testing, at scale, how these tools behave in real-world conditions.
And just as early driving schools helped society learn through practice, the NHS is now beginning its own version of driving lessons — testing AI tools in real workplaces, with real clinicians and real results.
For example, the recent Microsoft 365 Copilot trial, the largest of its kind globally, involved more than 30,000 NHS staff across 90 organisations. It tested how generative AI could support everyday clinical and administrative work — summarising meetings, drafting communications and streamlining documentation. The results were meaningful: on average, staff saved 43 minutes a day, the equivalent of five working weeks per person each year, freeing up valuable time to focus on patient care.
The first driving schools
When the first professional driving schools opened in the early 1900s, they weren't just teaching people how to steer and brake — they were teaching a new way of thinking. Drivers had to learn awareness, patience and respect for the shared space they were entering. The same principle could apply to how we introduce AI into healthcare today.
The UK is now starting to design the equivalent of national driving schools — places where innovation and regulation learn together. The Government's proposed AI Growth Lab would create a cross-economy sandbox to test AI-enabled products and services in carefully supervised, real-world conditions. Like a learner car with dual controls, it would allow regulators to adjust the rules temporarily while keeping safety and oversight firmly in place.
This matters because, just as early driving schools shaped not only drivers but also the cars themselves, encouraging better mirrors, brakes and dashboards, initiatives like these will shape both the people who use AI and the systems they use. Whether it's testing ambient voice tools in clinics through the MHRA's AI Airlock or exploring how radiology regulation might evolve for autonomous models, these pilots show what safe experimentation looks like in practice.
Sharing the road
When Henry Ford began mass-producing the Model T, he couldn't have imagined the world that would follow — theory tests, seatbelts, traffic lights, motorways. Each innovation brought new freedoms, but also new responsibilities. We don't yet know the full rulebook for how AI and automation will reshape healthcare, but history offers a lesson: progress depends on learning how to share the road. AI will need the same balance of education, regulation and culture that driving once did — a shared understanding of what safe and responsible use looks like.
The NHS, with its diversity, scale and public trust, is uniquely placed to lead the way. It has already shown that AI can be tested safely, evaluated rigorously and deployed responsibly. Next, we must make that learning collective, helping clinicians, managers and patients feel confident using these tools rather than merely being subjected to them.
If the 20th century taught us how to drive cars, the 21st could teach us how to drive intelligence. And with the NHS as the world's largest ‘AI driving school’, we won't be mere passengers; we'll be in the driver's seat.
