A public regulator and tough new standards are crucial to ensuring the internet is safe for everyone
Freedom from violence and discrimination, a principle we recognise in the real world, must now be enforced much more strongly in the digital one, writes Stephen Doughty MP.
Social media and internet companies have repeatedly failed to recognise, let alone get to grips with extremism on their platforms. Future regulation must ensure they are not allowed to use ‘grey zone’ excuses to justify inaction.
Last year the government unveiled the Online Harms White Paper intended to deal with "online content or activity that harms individual users, particularly children, or threatens our way of life in the UK". The paper proposed a suite of internet regulations that would establish a code of practice for internet companies and a statutory duty of care.
But the scheduled debate on the proposed legislation has been kicked into next year, with legislation apparently delayed.
Lord Puttnam, the Chair of the Lords Democracy and Digital Committee, rightly said a potential 2024 date for it to come into effect would be "seven years from conception - in the technology world that's two lifetimes".
Yet the need has never been greater - not least in light of my own experiences in my Cardiff South and Penarth constituency over the last eight years, and in my time on the Home Affairs Committee, where I saw repeated failures on the part of social media and internet companies to even recognise, let alone get to grips with, the cesspit of extremism on their platforms - even that linked to groups already proscribed by the Government.
There should be a duty to investigate serious complaints that fall short of the legal threshold required for police involvement
As I said in the Commons last month - I have seen online videos glamorising drug gangs and violence, featuring convicted criminals, remain available. They showed young people dripping in blood, disposing of evidence after stabbing somebody - simulated but sinister. I have seen jihadi organisations, including proscribed ones, recruiting and spreading their messages of terror, and others engaged in radicalising young people with white supremacist and antisemitic conspiracies of the left and right.
I have witnessed online attacks on black, Asian and minority ethnic communities, rampant antisemitism, Islamophobia, and attacks on the LGBT+ community. I have had my own experience of online attacks, including threats of real-world violence, and have had to deal with those through the police.
My colleague Chris Elmore, who chairs the APPG on Online Harms, has spoken powerfully on this - tackling issues such as fake news, whether that is anti-vax, anti-5G or other disinformation, and of course the crucial risks young people face online from sexual predators and paedophiles. The Shadow DCMS, Home Affairs and Health teams, including my colleagues Conor McGinn MP and Chi Onwurah MP, have also repeatedly raised these issues and pushed for more urgent action.
We need to make sure that the development of the rules regulating and governing technology platforms isn’t just a conversation between the tech giants and the government - it must involve the public and the full spectrum of civil society.
There is an array of issues that any forthcoming legislation must tackle, but I have been working with HOPE not hate on one of the issues that concerns me most - the ubiquity of hate and extremism.
Harassment, bullying and abuse online are now sadly constants in many people’s lives, and extreme ideologies have been given far too much latitude. Freedom of speech, one of our most cherished democratic principles, should never equate to the freedom to harm or to actively advocate murder. Freedom from violence and discrimination, a principle we recognise in the real world, must now be enforced much more strongly in the digital one.
When people cannot feel safe online or enjoy the benefits of the digital world, or when they suffer real-world violence or discrimination due to incitement, grooming or radicalisation online, that can in fact mean a net loss of freedom across our society.
Any future regulation must ensure internet companies are made responsible for proactively keeping hate off their platforms. There should be a duty to investigate serious complaints that fall short of the legal threshold required for police involvement.
Online companies must monitor the ever-changing world of online hate. They must not be allowed to use ‘grey zone’ excuses to justify inaction, not least when it comes to designated extremist organisations, violent criminals and so on. Part of this is looking at the actions of people beyond their platform, and removing those involved in hateful extremism elsewhere.
Tech companies have carefully designed their interfaces to make their products both compelling and compulsive. It’s only right that they also design to disincentivise - and where necessary actively sanction - hatred.
We need an Online Harms public education campaign to start a national conversation about digital conduct and decency online.
The idealistic assumptions of the Californian tech companies have been challenged by both the coarse realities of online behaviour and the willingness of both good and evil to exploit the modern day equivalent of the social and political information revolutions facilitated by the invention of the printing press.
But the digital realm fundamentally belongs to all of us. With the boundary between digital and real life more porous than ever, a much-needed public regulator and tough new standards can play a crucial role by bringing together the public, civil society, the private sector and the state to take action to protect us all online.
Stephen Doughty is the Labour MP for Cardiff South and Penarth.