
Infodemic: tackling conspiracy theories on social media

Policy@Manchester


In February 2020 the Director-General of the World Health Organisation warned that “we’re not just fighting an epidemic; we’re fighting an infodemic,” because “fake news spreads faster and more easily than this virus, and is just as dangerous.”

There has been considerable debate about how to control the spread of misinformation and disinformation, especially on social media platforms, which have been pivotal in the dissemination of conspiracy theories. In this blog, Professor Peter Knight, from the School of Arts, Languages and Cultures at The University of Manchester, examines ways in which conspiracy theories on social media platforms can be combatted.

  • The infrastructure and business model of social media platforms have fuelled the spread of problematic information, including conspiracy theories, among the public.
  • Conspiracy theories cannot be dismissed as merely fictitious; instead we must try to understand why certain members of the public so deeply mistrust large institutions, the government and the media.
  • The increasing self-regulation undertaken by social media giants is welcome; however, it does not go far enough in solving the infodemic. Regulatory measures must be put in place to ensure that both the social media platforms and the government are transparent with their data and intentions.

There has undoubtedly been a significant increase in the visibility and reach of conspiracy theories and other kinds of “problematic information” in the online environment. In terms of volume, conspiracy theories are comparatively minor: according to one study, they make up only 29% of the vaccine hesitancy discourse in English-language online spaces. Their visibility and influence, however, are far higher.

During the pandemic, people who had previously shown little interest in conspiracy theories have encountered them at a scale rarely seen before.

Making sense of uncertainty

It is hard to change people’s minds about conspiracy theories because they are not usually the result of a lack of information or of faulty information. Instead, conspiracy theories are appealing because they provide a narrative that claims to make sense of everything in uncertain times. Part of their appeal also comes from belonging to a community of like-minded people, with the infrastructure of each social media platform shaping its own distinctive communities.

Increasingly, the function of conspiracy theories is to delegitimise and disorient, not by persuading you to believe something that is not true, but by persuading you not to believe something that is true. This is achieved by undermining our faith in scientific expertise, an impartial media and democratic governance. Fact checking, though still important in the overall struggle against the infodemic, will therefore not succeed on its own: debunking often involves challenging someone’s identity, rather than simply correcting a false piece of information.

Polluting the information ecosystem

The real danger of conspiracy theories on social media during the pandemic is not a particular piece of misinformation here or there, but the pollution of the information ecosystem more generally. For example, our research has shown how a simple query on Amazon for books on coronavirus returns mainly works by conspiracy theorists.

According to a survey conducted by Ipsos/King’s College London, 15% of people in the UK think the purpose of the vaccine is to track and control the population, with another 15% undecided. Among those who get their news mainly from social media, the figure rises to between 30% and 40% (depending on the favoured platform), with the highest rates among young people. In the US, 30% believe that the virus was deliberately created and spread, while in the UK that figure is about 20%.

Belief that medical authorities are deliberately hiding information about the harms caused by vaccines runs to approximately 20% in the UK, but conspiracy-minded anti-vaxx sentiment is higher in other countries (for example, in France it is closer to 40%). However, since the rollout of the vaccine in the UK, vaccine hesitancy (whether conspiracist or not) has reduced to 10-15%, although there are significant variations between different age groups and ethnic communities.

Instead of dismissing conspiracy believers as simply paranoid and delusional, we need to recognise that there can be legitimate concerns about, say, lockdown policy or vaccine safety. We also need to understand why particular conspiracy stories resonate—why, for example, some ethnic minorities might have good reason to be suspicious of government or medical authorities. More generally, conspiracy theories are often the result of a sense of resentment against the elites and grievance about a perceived loss of status, sentiments which we would be foolish to ignore. We cannot hope to effectively combat the growth of conspiracy theories unless we understand the underlying reasons for the erosion of trust in science, politics and the mainstream media.

Rethinking regulation

The pandemic has produced a potential tipping point in the regulation of online misinformation. Social media platforms have gone considerably further in taking action over the last two years, first because of reputational damage in the wake of high-profile mass shootings and disinformation in election campaigns, and now because of the urgency of the pandemic. Some of the platforms have made progress in removing content that could lead to immediate harm, both medical and political. They have promoted authoritative health information, added content warnings, and deplatformed individuals and groups in some extreme cases.

Yet their measures are not as effective as they claim to be. Research has shown, for example, that much deplatformed content is still easily available, with posts on the mainstream platforms linking to purged content and groups that reappear elsewhere. While deplatforming can be effective in the most extreme cases, it does little to persuade conspiracy theorists to change their worldview. Instead, it often confirms their sense of persecution. For this reason, demoting harmful content and demonetising repeat spreaders of problematic content are better strategies than deplatforming.

Focusing on content moderation ignores the fact that social media platforms have fuelled the problem because they have a financial incentive to stoke controversy. They remain ad-delivery engines that rely on recommendation algorithms to maximise engagement, which in turn can push people towards more extreme content and groups. Rather than trying to automate the removal of individual posts, we need instead to focus on the infrastructure design of social media platforms.

The pandemic has shown that self-regulation by platforms cannot solve the problem. Instead, governments need to introduce regulatory measures to force platforms to uphold the kinds of standards we expect from traditional media and other public fora. In that regard, the UK’s Online Harms White Paper and the EU’s proposed Digital Services Act are promising. They will, for example, require the platforms to be more transparent by allowing researchers and regulators access to data to independently verify the platforms’ claims about the effects of their interventions.

At the same time, we need to ensure greater transparency and honesty on the part of scientists, the media and the government. This involves admitting the limits of our knowledge and acknowledging when we get things wrong, not least because many conspiracy theories today are about scientific and political elites operating in cahoots with one another. It also involves actively building trust rather than merely asserting authority, especially within communities who may be wary of institutional power. More than anything, we need to understand why some people feel so disenfranchised and disillusioned that they turn to conspiracy theories.


Policy@Manchester aims to impact lives globally, nationally and locally through influencing and challenging policymakers with robust research-informed evidence and ideas. Visit our website to find out more, and sign up to our newsletter to keep up to date with our latest news.

