"Therapeutic Tinder": The rise of AI therapy and the risks to patients

Chatbots can be programmed to use CBT to treat common mental health problems (Alamy)

AI therapy is on the rise, meeting surging demand in a marketplace struggling to offer supply. But is it a double-edged sword? Zoe Crowther reports

A promotional banner promises to create a “virtual relationship” between you and two “experts”, beneath a photo of a man and a woman with beaming white smiles against a crisp white background.

Thousands of online therapy services such as this one have popped up, replacing the traditional in-person therapist with chatbots that simulate conversation. The chatbots employ techniques, such as cognitive-behavioural therapy (CBT), that many human therapists use to treat common mental health problems like depression and anxiety.
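None of these platforms publishes its underlying code, but a rough sense of how a chatbot can serve canned CBT-style prompts is given by the minimal, purely hypothetical Python sketch below. Every name, keyword and prompt in it is invented for illustration; a real product would rely on trained language models and clinical safeguards rather than simple keyword matching.

# Illustrative sketch only: a toy, rule-based "CBT-style" exchange.
# All names and prompts are hypothetical, not drawn from any real platform.

CBT_PROMPTS = {
    "anxious": "What thought went through your mind just before you felt anxious?",
    "low": "Can you recall a recent moment, however small, that felt slightly better?",
}
DEFAULT_PROMPT = "Could you tell me a bit more about how that felt?"

def reply(user_message: str) -> str:
    """Return a canned CBT-style prompt chosen by simple keyword matching."""
    text = user_message.lower()
    for keyword, prompt in CBT_PROMPTS.items():
        if keyword in text:
            return prompt
    return DEFAULT_PROMPT

if __name__ == "__main__":
    print(reply("I've been feeling anxious about work"))
    # -> "What thought went through your mind just before you felt anxious?"

Even this toy example shows why critics worry: the "conversation" is standardised, and nothing in it can assess risk or refer a patient to a human clinician.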

Mental health services in the United Kingdom face a myriad of problems: the NHS has record waiting lists for mental health patients, and The House revealed a postcode lottery in child and adolescent mental health care where some children wait years for treatment.

There is therefore a huge appetite in the sector for innovative ways to speed up assessment and treatment, and NHS trusts often award contracts to private firms to deliver services. But, according to psychotherapist Dr Elizabeth Cotton, online therapy platforms pose a risk to both patients and therapists.

Cotton, a senior researcher at Cardiff Metropolitan University, refers to this as the “Uberisation” of mental health, where services are marketed through online platforms – with effects, she says, similar to those of taxi apps like Uber.

“I’d say the biggest problem is the advent of these online therapy platforms,” Cotton tells The House. “It creates a precarity between therapist and patient and it fundamentally erodes any chance of having an actual relationship with an actual therapist. I call it therapeutic Tinder – it sort of just sets the parameters of a relationship.”

“I call it therapeutic Tinder – it sort of just sets the parameters of a relationship”

In 2021, the government announced a £36m boost for AI technologies to revolutionise NHS care. Many trusts have since used AI chatbots for patient self-referral and interactive therapy, and AI tools to analyse patient language to train therapists.

While NHS mental health services struggle with increasing demand for diagnosis and treatment, and a shortage of CBT-trained therapists, the demand for online therapies is likely to grow, and AI will play an ever more prominent role. “You can get great therapy in this country if you can afford to pay for it privately,” Cotton says. “And if you can’t, the pressure is to use online therapy platforms which are cheaper and can be provided through your employer.”

Although this may help to ease pressure on the NHS, Cotton says she is sceptical of the “diagnostic validity”, longevity and confidentiality of online therapy services. With these services often delivered by private firms, she suggests there can be a practice of “creaming and parking”, whereby digital providers treat those with only relatively moderate mental health problems and cite high recovery rates, but “push away” those with complex mental health needs.

“So, you simply start to move from a model of patients to clients,” she says. “Anybody with a serious mental health problem that needs help will probably get lost. At no point do you meet somebody who can make a proper assessment and signpost you to an appropriate service, because it’s all standardised.”

There are other questions relating to how these platforms can use patients’ data to create new products and inform mental health diagnoses. Patient confidentiality could be compromised by AI technology that, by its very nature, depends on information sharing.

AI is not a technology of the future; it already pervades platforms used by thousands of people with mental health problems in the UK, Cotton insists. “We are already using it, it’s very fast and it tries to sell a product which it’s actually not delivering: therapy. It’s not therapy.”
