Social media giants should face prosecution over online abuse of MP hopefuls, says watchdog
Ministers should back new laws to make social media firms responsible for “vile and threatening behaviour” aimed at election candidates, the official ethics watchdog has said.
The Committee on Standards in Public Life said a specific new offence should be introduced in electoral law within a year to curb widespread abuse during election campaigns.
It also recommends further legislation to shift the liability for illegal online content onto firms such as Facebook and Twitter – a move it says is currently blocked by an EU directive.
The report says firms are not “going far enough or fast enough” to tackle the “intensely hostile online environment” on their platforms.
Committee chairman Lord Bew said the "increasing scale and intensity of this issue demands a serious response".
"We are not alone in believing that more must be done to combat online behaviour in particular, and we have been persuaded that the time has come for the Government to legislate to shift the liability for illegal content online towards social media companies," he said.
He added: "This level of vile and threatening behaviour, albeit by a minority of people, against those standing for public office is unacceptable in a healthy democracy."
The report said urging companies to make quick decisions to remove content was a matter of protecting a “vigorous democracy”.
"We cannot get to a point where people are put off standing, retreat from debate, and even fear for their lives as a result of their engagement in politics,” it added.
"This is not about protecting elites or stifling debate, it is about ensuring we have a vigorous democracy in which participants engage in a responsible way which recognises others' rights to participate and to hold different points of view."
The influential committee said that the “companies are not lacking in resources” and that it was also "deeply concerned" about the failure of Facebook, Twitter and Google to collect data on reported content and to take down illegal material.
"Their lack of transparency is part of the problem," the report said.
"None of these companies would tell us whether they collect this data, and they do not set targets for the time taken for reported content to be taken off the platform.
“This seems extraordinary when their business is data-driven in all other aspects."
The report also called on political parties to draw up a joint code of conduct on intimidating behaviour during election campaigns and for the National Police Chiefs’ Council to ensure officers are trained to identify online abuse.