John Mann MP: Fines are a necessary incentive for social media giants to take down extremist content


Labour MP John Mann says a voluntary approach has failed and internet firms need to be "hit where it hurts" to convince them to take action against extremist material online.

Max Hill, the Government’s independent reviewer of anti-terror legislation, has reportedly said he does not see how “criminalising” online giants for failing to “do enough” would work, questioning how Ministers would define adequate and timely action.

If true, it appears Max Hill has sipped the internet industry Kool-Aid, and the Government might want to review this reviewer. The new wave of legislation sweeping across Europe, led by Heiko Maas and the German Government, should be welcomed as a necessary and important incentive for action to tackle the upload of illegal content online.

As Chair of the All-Party Parliamentary Group Against Antisemitism, I have been speaking out about, and acting to address, cyber hate for more than ten years. From major policy reports to meetings with social media HQs in Dublin, from international efforts in Palo Alto to Cross-Departmental meetings of the UK Government and voluntary agreements with the internet giants: we have tried it all. The truth is that legal and economic incentives are the best drivers of change in the industry.

The Bill approved by Germany’s cabinet proposes that social media networks must designate an individual responsible for taking down or blocking evidently criminal material within 24 hours of receiving an initial report. That period extends to seven days where content is not immediately recognisable as illegal. Failure to act could leave companies facing fines of up to €50 million. Moreover, networks will be required to follow up with complainants, deliver regular reports on complaints and provide details of their decision-making processes.

The European Commission, meanwhile, has been working on a voluntary agreement along broadly the same lines. Improvements have been made, but there is still a huge gap to be filled. In 59% of cases, IT companies responded to notifications of illegal hate speech by removing the content, up from 28% six months earlier. Over the same six months, the proportion of notifications reviewed within 24 hours rose from 40% to 51% (with the exception of Facebook, which excels). This upward trend is good; however, 59% is just not good enough.

Bringing the tech giants “firmly onside” has been tried, but it has been far less successful than hitting them where it hurts: in their wallets. The voluntary agreements have been numerous. They were signed by Google/YouTube, but despite my challenging them for years about policies for the video service, there was no change. “You do it on copyright, why not on illegal hate material?” I would ask, only to be told that YouTube cannot be the arbiter of what is or is not illegal. It was only when the Home Affairs Select Committee took an executive to task, and subsequent press coverage had advertisers nervously reviewing their ad placements to see whether they had inadvertently assisted terrorist fundraising efforts, that a system-wide policy change was adopted. All of a sudden, YouTube found it could act.

The internet companies have, rightly, enjoyed a certain degree of immunity. In the US, as part of the 1996 Telecommunications Act, providers can remove or supervise content according to their terms of service without being considered a publisher. In the UK, these same corporations have protection from liability through a European Directive transposed into UK law: there is publisher immunity, but they must take down illegal content expeditiously once notified. However, in both the UK and the US, laws exist to protect us, and if these companies cannot abide by the law they must be made liable. That is how we measure ‘enough’: compliance with the law, backed by commensurate sanctions.

If they are fearful, let these companies fund – and properly resource – their own regulation.

To bring Britain into step with the rest of the world, we must do better. An online safety commissioner should be created. Let us instigate a more effective complaints scheme for the most egregious failures by social media companies to remove illegal content, together with an economic incentive for such companies to abide by their own terms of service and the law.

The safety and security of our country is paramount. A healthy democracy requires free speech, yes, but equally the proper and consistent application of the rule of law. No more coaching, coddling or collaboration. The internet is wonderful, but platforms cannot be immune from the law, ‘like’ it or not.


John Mann is the Labour MP for Bassetlaw and chair of the All-Party Parliamentary Group Against Antisemitism.
