Wed, 16 July 2025

Listen carefully: How AI is changing Britain's GP surgeries

Illustrations by Tracy Worrall


Does AI hold the key to fixing our health service? Noah Vickers reports from the ‘wild west’ of GP surgeries getting to grips with the technology

The nature of GP consultations is changing. Where once the doctor might be bent over a pad taking notes, they now increasingly rely on technology, listening in on – and then analysing – these most private and confidential of discussions.

These ‘scribes’ powered by artificial intelligence (AI), otherwise known as ambient voice technology (AVT), produce a transcript of the consultation, often accompanied by a set of summarised notes, which can then be entered into electronic health records. 

Supporters say this technology frees GPs to spend more face-to-face time with patients, delivering exactly the sort of personalised service the public says it wants from family doctors. But critics say AVT – and other uses of AI in primary care – is inadequately regulated, risks increasing workload and could provoke a backlash among patients desperate for human, not machine, intelligence in their healthcare. 

While research suggests it may only save doctors about one minute of documentation time per contact, a recent article in the British Medical Journal notes that there is still a “reduction in perceived workload”. This means doctors can appear less distracted to their patients and can make sustained eye contact with them, allowing for a more engaged consultation.

Yet the sheer range of AI software being used in surgeries has set alarm bells ringing at NHS England. The body last month warned GPs that using “non-compliant” AI scribes “risks clinical safety, data protection breaches, financial exposure, and fragmentation of broader NHS digital strategy”. 

For doctors concerned about the speed with which the technology is being deployed, the warning helped demonstrate the need for more oversight of these tools.

“The fact that this is happening tells you very much about the wild west that we’re in,” says Dr James Murphy, a GP in Buckinghamshire. “It does feel a little bit like a free-for-all, perhaps inevitably, with the way that this is being rolled out.” 

Is the regulation fit for purpose?

The scrutiny each AI scribe is subject to depends in large part on whether it is classed as a ‘medical device’ or not, which in turn depends on the sophistication of the software. 

Provided that a tool does more than simple transcription, and for example has the ability to generate a summary, NHS England has said it should at least be registered as a ‘class one’ medical device with the Medicines and Healthcare products Regulatory Agency (MHRA). But registering any of these tools relies on the company doing so voluntarily. 

“What will normally happen is these companies – where they’re not playing by the rules – they’ll say that this system they’re building isn’t a medical device, and then that’s how they get round the medical device regulations,” says Dr Benjamin Brown, a GP in Manchester, who has co-founded a company building AI software for clinical use. 

Ambient voice technology being used to record a GP consultation (Illustration by Tracy Worrall)

It was only in April this year that NHS England confirmed that all scribes with generative AI capabilities should be considered medical devices. 

“What that means is that you [the company selling the AI product] need to have done proper scientific studies on the accuracy and safety of the system that you’re using,” says Brown. Any of that evidence can then be audited by the MHRA. 

“[The regulation] is all there, it’s just that the companies need to follow it,” Brown adds. “A lot of these companies are start-ups. They’re scrapping for customers, it is a highly competitive area. A lot of the time they’re just trying to get the product out there.”

But the use of these tools is already widespread, prompting calls from GPs for a firmer hand from NHS England. 

“AI is already being used in general practice on a local level by individual practices and it is clearly important that GPs carefully consider the data protection and patient safety implications of this,” says Professor Kamila Hawthorne, chair of the Royal College of GPs (RCGP).

“The RCGP would welcome further guidance from NHS England on the use of AI in general practice and the minimum standards that such software must meet.”

Meanwhile, the government’s 10-Year Health Plan for England sets out its ambition to “embrace” the possibilities of AI scribes, and to push ahead with AI triage products. 

Murphy says that where AI is already being used to triage patients, there are “a lot of question marks over the quality of the decision-making”.

In his own surgery, patients report low levels of satisfaction with the digital triaging tool on offer to them. The tool was originally developed by one company that was later acquired by a larger firm.

“Suddenly, the back-office support for that product was downgraded massively overnight, but we’re stuck with the contract,” says Murphy. Meanwhile, patients complain they are unable to access appointments, have trouble using it, or find it repeatedly tells them to call 999.

Dr Jethro Hubbard, a GP in Gloucestershire, tells The House he is concerned that as the government searches for efficiencies in the health service, it could eventually result in “an NHS which is mainly AI and impersonal, and completely led by systems which are faceless and patients don’t necessarily trust”.

He argues that in this “two-tier NHS”, where only “the lucky” get to see “the few clinicians who are left”, more people will end up paying to see someone privately.

“I have colleagues who work privately and most of the time what they hear is, ‘I just want to speak to a GP – I get a faceless system or a triage nurse, I never get to speak to an actual doctor’,” says Hubbard.

“If technology could easily replace a GP, Google would have done it 20 years ago. The number of people who come to see us as GPs – who already know the answer because they’ve Googled it, but want to hear it from someone who’s got letters after their name and has the ability to reassure them – is extremely high. So the demand to see a person will not go away, and people will feel more fobbed off.”

Murphy agrees that for policymakers, reducing the role of fully trained human clinicians will be “just too much of a temptation”, particularly given the levels of debt in parts of the NHS.

Yet Hubbard points out that greater use of AI – particularly for diagnosis – could in fact be less efficient and lead to increased costs. While GPs are experts trained in taking a degree of “calculated risk” in the treatment they recommend to patients, he argues that AI tends to be risk-averse, hampering its ability to make swift and effective treatment decisions.

“We’ve already seen this, where if you go to a GP practice which is heavily reliant on allied health professionals, you will tend to come back a couple of times more than if you went to see a GP, because responsibility is generally less likely to be taken at the initial consultation,” he says.


“I fully accept that with general practice, there’s a lot of what we do that can be done by other people. The thing is, it’s about recognising which bits should be done by the GP.”

Hubbard acknowledges that AI brings with it “a real opportunity” for the UK to enter a “golden age” in medicine. “But it’s a balance,” he says, “so if we do this badly then the impact will be extremely negative – and will be felt within this election cycle.”

Brown argues that the effectiveness of any given AI model will be dependent on the ability of clinicians to adjust it.

“When you’re building your own models, you can tweak the parameters, and say whether you want to make it more or less sensitive,” he says. If a model is over-sensitive when triaging patients, for example, it may refer everyone to Accident and Emergency.

As Brown puts it: “If everything’s urgent, nothing’s urgent. It’s about striking that balance, and if you’re not building your own model, it’s very difficult to do that.”
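Brown’s point about sensitivity can be illustrated with a toy sketch (hypothetical code, not any vendor’s product): a triage model typically produces a risk score for each patient, and the cut-off a clinician chooses determines how many people are flagged as urgent.

```python
# Toy illustration only - a hypothetical triage model's risk scores,
# not any real product. The threshold is the "sensitivity" parameter
# Brown describes: lower it and nearly everyone is flagged urgent.

def triage(risk_scores, threshold):
    """Flag patients whose risk score meets or exceeds the threshold."""
    return ["urgent" if score >= threshold else "routine"
            for score in risk_scores]

scores = [0.10, 0.35, 0.55, 0.80, 0.95]

# An over-sensitive cut-off flags almost everyone as urgent...
print(triage(scores, threshold=0.2))   # 4 of 5 flagged urgent

# ...while a stricter cut-off reserves "urgent" for the highest-risk cases.
print(triage(scores, threshold=0.75))  # 2 of 5 flagged urgent
```

The trade-off is exactly the one Brown describes: only teams able to adjust that threshold in their own model can strike the balance between missing genuine emergencies and sending everyone to A&E.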

For now, AI – and its potential in healthcare – enjoys a positive reception in Parliament. Peter Prinsley, Labour MP for Bury St Edmunds and Stowmarket, is a former surgeon, who jokes that among other benefits, AI-authored consultation transcripts will rescue patients from “the state of doctors’ handwriting”.

He adds that while the systems are not perfect, patients generally welcome AI triage if it means being able to skip the 8am rush for an appointment.

At the same time, Prinsley acknowledges that enabling AI’s full potential will require better equipment across the NHS. “If we’re going to do this at scale, we’re going to need the right sort of hardware,” he says. “We’re going to need better hardware. A lot of the computers I was using were pretty antique.”

Conservative MP James Cleverly meanwhile argues that despite the government being “very good at talking about AI unlocking productivity”, it is failing to grasp enough of the benefits in reality.

The former home secretary points out, for example, that the technology could be used to spot patterns across NHS databases, where information can be stored in inconsistent or hard-to-read formats.

The government recently confirmed that it indeed intends for the NHS to become the world’s first health system to use AI to analyse hospital databases, and catch potential safety scandals early.

But while Cleverly is keen for the government to “turn words into action”, he also acknowledges the need to tread carefully on AI.

“It’s like so many things. If we are sloppy, and we lose the confidence of our constituents, our voters, our citizens, then that will set us back. Better we do it thoughtfully and carefully and get it right the first time.” 
