
The rise of deepfake porn is devastating for women – we must go further to crack down on this technology


The potential implications and dangers of artificial intelligence have burst onto our front pages and dominated the news cycle over the past month, as experts warn of the possible far-reaching impact of the technology on our lives, jobs and wellbeing.

But another technological advancement, one that is already having a devastating impact on thousands of women and is suspected to have affected hundreds of thousands more without their knowledge or consent, has received relatively little attention by comparison.

Alongside the rise of headline-grabbing tools such as ChatGPT, there is a plethora of other easily accessible apps and websites that allow users without technological expertise to generate fake pornographic images that transpose real women’s faces onto explicit pictures. Simple, fast and cheap, this quickly emerging technology provides a terrifying new tool for abusers to target women and girls: an estimated 96 per cent of deepfake images on the web are pornography, and the vast majority feature female subjects.

The potential harms are wide-ranging, from the use of such images to coerce and blackmail women, to the long-lasting psychological, emotional and reputational damage to victims. Already there have been reports of women losing their jobs as a result of deepfake pornographic images made of them without their participation or consent. Like other forms of online and image-based abuse, there is a misguided tendency, often among those with no direct personal experience of the issue, to dismiss it as trivial and to suggest that victims simply ignore it. This entirely fails to acknowledge the enormous impact on many aspects of victims’ lives, as well as the significant damage to their mental health.

Demand for deepfake pornography apps and sites is growing exponentially. While such technology originally focused on exploiting and misusing images of celebrities (itself a violation of women on a mass scale), adverts for the tech now popping up across social media platforms emphasise how users can create content featuring real-life women with near-total impunity.

In a world where legislation is scrambling to keep up (non-consensual deepfake images are illegal in just a handful of US states), it is positive that the Online Safety Bill will criminalise the sharing of such images. But it is breathtaking that the bill, which also aims to address issues such as online harassment, does not mention women once in its 260 pages.

While attempting to tackle image-based abuse after the fact is a step in the right direction, we know from other misogynistic and sexual offences that recourse to justice is often limited and frustratingly slow for survivors. It is therefore vital that we focus on preventing the abuse, and the creation of such images, in the first place.

Better regulation of, and accountability for, tech platforms and social media sites should be introduced to force them to take proactive steps to prevent their platforms from being used to abuse and harass women on an industrial scale. At present, the aim seems to be to give women tools to find and painstakingly report abusive images, rather than to prevent abusive men from making or sharing them in the first place.

As somebody who has personally experienced image-based abuse, I know first-hand what it feels like to be suddenly and unexpectedly confronted with pictures of your own face digitally altered into graphic and abusive messages. The shock, disgust, fear and shame I experienced when a man created images of himself ejaculating on my face, among others, are difficult to describe. The impact is severe and long-lasting.

We have a small window of opportunity now to take real, systemic action before the practice of making and sharing these images becomes normalised to the point of ubiquity. For the sake of women and girls everywhere, and generations to come, we must grab that opportunity with both hands. 

 

Laura Bates, founder of the Everyday Sexism Project and author of Men Who Hate Women
