
The Age Assurance Bill will enforce much needed protections for children online


Age assurance is one of the most disputed but least understood areas of digital policy. It is often characterized as a fight between privacy and safety – a framing that fails to account for the fact that the tech sector is predicated on surveillance (data collection) and on knowing its customers (ad targeting).

As one young boy asked me: why do they know I like Nike trainers but not that I am 12? The answer: it is more convenient – and more profitable – not to know.

Ofcom reports that 40 per cent of 10-year-olds are on social media, which has a minimum age of 13. That in itself is a sign that something is wrong. But the tech sector’s failure to tackle its relationship with very young children obscures another wrongdoing. In no other area of life do we treat children of 13 as if they were adults, opening them up to be targeted with pornography, self-harm content, extreme diets and pro-suicide material, or to be manipulated into making in-game purchases and targeted with advertisements and misinformation.

By contrast, a pub can’t serve whiskey to a 10-year-old, a shopkeeper can’t sell them cigarettes and they aren’t allowed in a strip club – even if that has a business cost. Meanwhile, nothing suggests that the status quo online offers users privacy: our predilections, desires, location and lifestyle are completely transparent – as is how many steps a day we walk. It’s a lose-lose situation. The system is broken.

Industry knows what they are doing to children online – but nothing suggests that they will take the action necessary to protect them

In 2017, the government promised age assurance – a system that reliably and effectively establishes a user’s age – for commercial porn websites. The scope and approach of that promise were riven with controversy, but rather than finding a way through, here we are in 2021 still waiting for the law to come into effect. Meanwhile, every second, digital services designed for adults are actively engaging with children, with real-world outcomes, including catastrophic rates of sexual harassment in schools and 71 per cent of girls saying they are being driven out of the digital world by misogynistic violence.

This year children got new protections from the Age Appropriate Design Code and the Audiovisual Media Services regulations – but without effective age assurance these new protections simply open an ever-greater gap between what is promised and what is delivered. Into this picture comes the Online Safety Bill, with further promises to protect children online but still no rules of the road for age assurance – and even if it had them, a regulatory code brought forward by the Online Safety Bill would only come into effect in 2024 at the earliest.

A child who was 11 when the government first promised to introduce age assurance will be an adult by the time anything comes into effect. The 400,000 10-year-olds with social media accounts, and the rest of the UK’s nearly 10 million under-18s, can’t wait; they need protection now.

The technology exists to make age assurance an everyday possibility. Examples are plentiful, ranging from biometric voice or face recognition, to third-party services that provide verified age tokens through which a user can seamlessly access multiple platforms, to carefully curated questionnaires and quizzes that flush out underage users.

What is stopping us from acting now is not technology but the persistent and repeatedly debunked notion that industry knows best. Industry knows what they are doing to children online – but nothing suggests that they will take the action necessary to protect them. Just ask the Facebook whistle-blower, the Children’s Commissioner, those who work at NSPCC’s Childline, the Center for Countering Digital Hate, or the 5Rights Foundation, which I chair. Each of them has devastating revelations from the last year about how children are being exploited online.

If doing the same thing repeatedly and expecting a different result is a sign of madness, then waiting for digital services to address this problem voluntarily is exactly that. When the companies see the opportunity to turn a profit, they turn a blind eye.

This is why today I am introducing a Private Member’s Bill in the House of Lords. The Age Assurance (Minimum Standards) Bill would, if passed, require Ofcom to set out a mandatory code of conduct that ensures age assurance does not infringe on privacy; is safe, reliable, proportionate and effective; and, above all, does not lock children out of services and products that they have a right to access.

Its 11 principles, starting with privacy, would ensure that a child getting their first smartphone today receives the protection they deserve – in months, not years. It would upend the status quo, in which age assurance is at the mercy of entrenched, data-hungry products and services that want to grow at any cost.

Few people are excited by regulation; others positively hate it. But I routinely see arms covered in scars, teenage girls with body dysmorphia and suicidal thoughts, anxious teens who have been scammed or have simply spent money that their families cannot afford, children traumatized by violent sexual content or contact – even years later – and the tragedy of bereaved parents who have lost children to suicide. We do not accept this anywhere else; we must not accept it online.

 

Baroness Kidron is a crossbench peer.
