I welcome the Government's intention to achieve cross-party consensus on the issue of regulating online harms


The Liberal Democrat Lords spokesperson for Digital, Lord Clement-Jones, writes following the conclusion of the Government's Online Harms White Paper consultation: "It would be a major mistake, however, to legislate too hastily before the definitions of harm and the scope in particular have been thoroughly debated."


The Government's consultation period for its Online Harms White Paper concluded last week. It must now decide the way forward in a number of areas that remain unsettled, particularly the definitions of harm, the principles to be adopted by the regulator, and the scope of online platforms covered by the broad duty of care it proposes to establish.

John Stuart Mill, one of the great founding fathers of modern Liberalism, articulated what is known as the harm principle in On Liberty, published in 1859, where he argued that "the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others".

I can’t think of a clearer example of where the harm principle should apply than with regulating our social media.

To date, the internet, and social media in particular, have opened up huge freedoms for individuals. But this complete licence has come with too high a societal price tag, particularly for children and the vulnerable.

There is too much illegal content and activity on social media (including abuse, hate crimes and fraud) that goes undealt with by platforms and creates social harm. The self-harm material on Instagram viewed by Molly Russell before her suicide and the footage of the Christchurch killings are the most recent examples.

We can't leave it to big private tech firms like Facebook and Twitter to decide the acceptable bounds of conduct and free speech on a purely voluntary basis, as they have done to date.

So I and my party, the Liberal Democrats, welcome the fact that the Government's proposals in the Online Harms White Paper, published in early April, have now emerged, and that it has adopted the suggestion of the Carnegie UK Trust and others that a statutory "duty of care" should be placed on social media companies, with independent regulation to enforce its delivery.

The evidence was clear from the DCMS Select Committee, the Lords Select Committee on Communications, Doteveryone, 5Rights and the Carnegie UK Trust, all of which identified the regulatory gap that currently exists.

A statutory duty of care properly framed would protect the safety of the user and, at the same time, respect the right to free speech, allowing for a flexible but secure environment for users.

There is clearly continuing debate over the ambit of the duty of care. In the first instance the Government has proposed that the new arrangements will apply to any site that allows users to "share or discover user-generated content or interact with each other online". This sweeps up too much incidental online activity, in particular below-the-line comments on mainstream media sites. The imposition of the duty and regulation must, however, be related to the purpose and function of platforms or sites and to the nature of the user-generated content involved; where satisfactory alternative regulation exists, there should be no need for duplication. So, for instance, where there is already an independent regulator in place, such as IMPRESS, we would propose an exemption.

Through codes giving guidance, Parliament, Government and the regulator would each have an important role to play in clearly defining the duty of care. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly. We welcome the Government's stated commitment to both these aspects.

We also need, as the Lords Communications Committee emphasised, to understand the distinction between criminal content, harmful content and antisocial behaviour when exercising powers of oversight.

By the same token, upholding the right to freedom of expression doesn't mean a laissez-faire approach. Bullying and abuse prevent people from expressing themselves freely and must be stamped out.

Users must have the ability to report illegal or harmful content (e.g. abuse) to platforms and have their reports dealt with appropriately, including being kept informed of the progress and outcome of any complaint.

Similarly, there must be transparency about the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. Will transparency extend to that?

In both cases there should be a designated Ombudsman to whom complainants can turn if dissatisfied with the remedy given by the platform.

There is then the question of the appropriate independent regulator to enforce the code and the duty of care. We believe this should fall to Ofcom, with its clout, its experience of drawing up codes in sensitive areas affecting freedom of expression, its understanding of how technology and content converge, and its experience of co-operating with other regulators. Ofcom has, with others, just published a report on online harms, and we believe it would be the appropriate regulator. If chosen, Ofcom will need to work alongside other regulators in their specialist fields.

We also need to consider whether to bring into scope economic harm caused, for instance, by algorithmic ranking, or the impact on our democracy of behavioural tracking and microtargeted messaging. This of course heavily impinges on today's discussion of the ethics of AI and how far regulation should go.

But regulation alone cannot address these issues. As 5Rights say, children have the right to childhood. One of the greatest gifts we can give a new generation of children is the ability to question the content that is coming to them. Schools need to educate children on how to use social media responsibly and be safe online, and parents must be empowered to protect their children, including through digital literacy and through advice and support on best practice.

As a Liberal Democrat I welcome the clear intention of the Government to achieve cross-party consensus on the crucial issue of regulating online harms. It would be a major mistake, however, to legislate too hastily before the definitions of harm and the scope in particular have been thoroughly debated. Given the complexity of the issues, we would wish to see pre-legislative scrutiny of a draft bill setting out the new regulatory provisions. We would, however, wish to see the designation of the regulator and the establishment of the Centre for Data Ethics and Innovation on a statutory basis at an early date.

So too, to ensure clarity of purpose, the guiding principles informing the duty of care and its aims should be stated on the face of the bill.

At the end of the day, however, we must recognise that this kind of regulation can only do so much. What we need is a change of culture among the social media companies: they should be proactively seeking to prevent harm. The Government refers to a culture of continuous improvement on their part as the desired goal. We very much agree, and that is a lesson which I very much hope has been taken to heart in the wider tech community.

Lord Clement-Jones is the Liberal Democrat Lords spokesperson for Digital 
