The Online Safety Bill needs more robust protections for children – we cannot afford further delay

Like many Members of the House of Lords, I am a grandparent. As with most grandfathers, when I see stories on the news of children harmed by social media, I am troubled.

When you hear about the tragic case of 14-year-old Molly Russell, who took her own life after being inundated with self-harm content on social media, it’s hard not to worry that the youngsters near and dear to us are being served up the same horrific content when they go online.

So it is dismaying that the Online Safety Bill, designed to address these threats to youngsters, has faced continual delay. A lot of this is because so much time has been spent working out how to police online debates between adults.

It’s worth remembering that Parliament voted to greatly strengthen online protection for children back in 2017. Part 3 of the Digital Economy Act 2017 required age verification for pornography sites, to a more robust standard than the Online Safety Bill now proposes. Shockingly, those protections were never brought into force.

One of the excuses offered for this baffling decision was that the issue would be subsumed within the new Online Safety Bill. The result is that children have been denied a vital protection from the horrors of online pornography for five years.

Instead, civil service and parliamentary time has been absorbed trying to develop a regime which would encourage online platforms to stop adults viewing certain legal content that is deemed “harmful” to them. It’s pretty clear what content is harmful for children. But adults are not children. So the question of what is harmful for adults is much, much harder to answer. And any legislation in this area is much more open to misuse.

I welcome the government’s decision to drop this regime that risked treating adults like children. But what a terrible irony it is that this controversy has delayed the introduction of protections for children themselves.

And how sad it is that the new Bill weakens some of those protections. On the key issue of protecting children from online pornography it does not even meet the level of protection Parliament agreed in 2017.

The Children’s Commissioner has found that Twitter – not sites like Pornhub – is the online platform where most young people view pornography. So, it is clearly right that the Online Safety Bill brings social media companies within the scope of age verification requirements. But, in almost all other respects, the standards to protect children from pornography are lower than corresponding duties in Part 3 of the Digital Economy Act (which the new Bill will repeal).

Firstly, the Online Safety Bill does away with the thorough definition of pornography in the Digital Economy Act, which was based on British Board of Film Classification (BBFC) age classification guidelines. It replaces this with a vague, one-sentence definition that risks opening up loopholes.

Secondly, the Bill drops the enforcement mechanism requiring the regulator to notify payment-service providers when a platform is failing to prevent children from accessing pornography. This is astonishing, since going after their income has been shown to be the most effective way of forcing porn companies to remove illegal and dangerous content. In 2020, after years of refusing to remove horrific child pornography and rape videos from its site, Pornhub deleted more than ten million of its vilest videos after Visa and Mastercard announced that they were withdrawing their payment services. Porn companies don’t care about people, but they do care about money.

Furthermore, whereas the Digital Economy Act provided for legally enforced guidance to govern which age verification technologies meet the appropriate standard, the Online Safety Bill leaves the design of age verification systems in the hands of porn giants themselves. We know what their approach will be. Pornhub’s parent company MindGeek have just created their own free VPN service which children can use to get around age restrictions. Allowing them to create their own age verification systems is akin to letting a burglar install the locks on your home.

Beefing up and properly implementing age verification is a key way to protect children from genuinely harmful content. But reintroducing the controversial requirements for tech firms to police legal content for adult users, as some are advocating, is not going to help. It is a dangerous distraction from the really serious issue of child protection. Some say legislating against content viewed by adults will have a knock-on effect on children, but robust age verification would ensure children do not see such content anyway.

As well as being a distraction, there is a real risk that by re-introducing a vaguely defined category of “legal but harmful” to adults, we will fuel our already hyper-censorious cancel culture. The concept of harm is too often weaponised by activists to try to silence other people’s opinions. In 2017, the Christian Union at Balliol College, Oxford, were banned from the college’s freshers’ fair for fear the presence of Christians could “harm” some attendees. It has become a toxic term that is used to divide, rather than unite.

But this subjective concept of harm is what the previous version of the Bill asked Big Tech companies to apply when deciding what you and I are entitled to post online. The relevant categories of content would have been created by the Secretary of State via Statutory Instrument, meaning minimal parliamentary scrutiny. The Bill even used the term “psychological harm”, which government ministers confirmed had no objective clinical basis. Instead, under the supervision of Ofcom, Big Tech would have been expected to decide what content risked “significant harm to an appreciable number of adults”. Significant harm is a concept drawn from child protection, but what does it mean in reference to adults? Given the ambiguity – and the risk of very heavy fines for getting it wrong – tech companies would almost certainly have over-censored.

Peers must not re-open the “legal but harmful” debate. Instead, let’s focus on our grandchildren and get more robust protections for children on the statute book and into effect as soon as possible.

Lord Curry of Kirkharle is a crossbench peer.
