If anything, the Online Safety Act doesn't go far enough
We regularly hear from children who have suffered sexual and emotional abuse online, or who have been exposed to harmful and dangerous content.
These experiences can have devastating impacts both immediately and long into the future. While the Online Safety Act can’t erase this pain and anger, it can be a vehicle for significant and lasting change.
Thanks to this piece of ground-breaking regulation, algorithms are now being redesigned. Age checks are now in place. Harmful material that promotes eating disorders and suicide should no longer proliferate on social media platforms.
This will – without a doubt – create safer, more age-appropriate online experiences for young users across the UK.
In recent days, concerns have been raised about how children will be able to find ways around these measures – including through the use of VPNs. We must be vigilant about such loopholes, and tech platforms and Ofcom should take action to close them.
But we must not lose sight of what’s most important here. Dangerous content spreading online will no longer be the norm. Abusers will no longer be able to target children freely and easily.
Children tell us they don’t want to see violence, abuse or exploitation on the internet. This legislation takes crucial steps towards making this happen.
It’s deeply concerning to see the rhetoric around the Online Safety Act shift toward loss of free expression and bureaucratic burden. We need to be clear about what this legislation does – it protects children and young people from the most dangerous and damaging content and is our best shot at keeping them out of harm's way.
It takes adults seconds to prove their age through robustly regulated technology. But without this protection, children are totally exposed to online risks.
The Online Safety Act is not an infringement on adults' rights; instead, it echoes measures that are already in place in the real world to prevent children from accessing alcohol and adult material. If anything, we need to go further. There are substantial gaps in the legislation that must be addressed.
Minimum age limits aren’t enforced across all services, meaning companies aren’t required to check whether their users are old enough to be there. Government must close this loophole by amending the act. Until then, Ofcom should use its powers to ensure platforms offer age-appropriate experiences.
Private messaging is another blind spot. End-to-end encryption lets harmful content circulate unchecked. Platforms must not be allowed to design services that fail to protect children.
Then there’s generative AI. This technology is evolving fast, and without new regulation we risk repeating the mistakes of social media. We must act early. AI firms need a statutory duty of care to prevent harm to children. Progress must not come at the expense of safety: children’s safety online cannot become the casualty of technological innovation.
The Online Safety Act gives us the potential for huge change. Let’s get behind it and ensure it is enforced robustly, while pushing for supporting laws that respond quickly to the challenges posed by the rapid development of AI.
We must not accept anything less than transformational change for children online.
Chris Sherwood is chief executive of the National Society for the Prevention of Cruelty to Children (NSPCC)