The contagion of disinformation


By Ezinwa Awogu – BA Philosophy graduate, GDL law student at UWE Bristol, and aspiring solicitor

We are more connected than ever: information spreads instantaneously, and among that information, none seems to spread quite as viciously as disinformation. To be distinguished from misinformation, disinformation, as defined in 1952 by the Great Soviet Encyclopaedia, is information deliberately designed to spread falsehoods and deceive the public, usually with an underlying agenda for political, social, or economic gain. Disinformation is often more entertaining and attention-grabbing than reality, and there it finds its strength over real news. Between the COVID-19 health crisis and the highly influential US presidential election, we have seen myths, conspiracy theories, and disinformation erupt like wildfire. As the global pandemic has forced increased digitization, greater reliance on IT, and an increased online presence, people are liking, sharing, re-tweeting, and subscribing more and more. The conditions are prime for the contagion of disinformation to spread through the algorithmic networks of our social media platforms and news outlets.

Battling disinformation in democratic countries is a delicate task, often fraught with debate and controversy. The right to freedom of expression under Article 10 of the European Convention on Human Rights was incorporated into domestic law by the Human Rights Act 1998; it is a qualified right, subject to such formalities, conditions, restrictions, and penalties as are prescribed by law. Many of these restrictions, however, are intentionally broad and carry a high degree of subjectivity, making them difficult to apply strictly. This broadness can make media content hard to police, which on one hand rightly protects freedom of expression but on the other makes disinformation more difficult to identify and combat. Section 127 of the Communications Act 2003 criminalizes the use of an electronic communications network to send messages that are ‘grossly offensive or of an indecent, obscene or menacing character’. In practice, however, enforcement is largely absent: offensive and obscene content has flooded electronic communication networks for a long time, with few criminal actions brought forward.

COVID-19 conspiracy theories, such as the idea that the virus is part of an elaborate government plan to increase surveillance and curtail rights, began circulating around January 2020 and have culminated in mass anti-mask protests, with many swearing that the pandemic is fake. While it is true that the response to the virus has been confusing and unclear on many accounts, the deliberate efforts of some to persist in spreading conspiracy disinformation distract from the reality of the inequalities the virus has illuminated. Realities such as the disproportionate effect on BAME communities and the devastating worldwide disparities in social welfare and healthcare that the virus has exacerbated are therefore pushed to the wayside, with attention-grabbing disinformation headlines taking the spotlight.

The efforts in the summer months by the outgoing Trump administration, among other world leaders, to spread disinformation hailing hydroxychloroquine as a ‘miracle cure’ on the basis of insufficient evidence and inadequate testing served a political ulterior motive: to use hope and optimism as a distraction from criticism of its poor handling of the pandemic. We can see similar attempts to capitalize on the pandemic in the Russian disinformation campaign labelling the Oxford vaccine the ‘monkey vaccine’ in favour of the Russian vaccine, conspicuously named Sputnik. Most recently, the ongoing saga of electoral fraud claims surrounding the US election attempts to delegitimize the incoming Biden presidency and stoke the fire of social and political upheaval.

In England and Wales, law aiding the battle against disinformation is scarce. Ofcom, established under the Communications Act 2003, is a regulatory body that enforces content standards across TV and radio broadcasting, ensuring accuracy and impartiality, but there is currently no equivalent regulator for social media and online content, which has become a major channel of information. There have been proposals to change this and introduce more regulation and accountability for online platforms, notably in the 2019 Cairncross Review, but nothing concrete has yet come of them. In response to public pressure, social media outlets have recently been taking it upon themselves to regulate the content published on their sites. During the ongoing election disinformation campaign, Twitter has been flagging tweets from outgoing president Donald Trump as misleading. Other popular social media sites such as Facebook and Instagram have shown some resistance to disinformation, but this has been limited and certainly not widespread enough to battle the contagion of disinformation effectively.

A strong argument can be made in favour of social media giants exercising more of their social responsibility and offering more content regulation. However, constitutional protection of freedom of expression limits the scope for online content restriction, and admittedly, the more content policing happens, the less freedom remains. Finding the delicate line between personal liberty and public interest is an age-old dilemma that remains unsolved, so for the moment the responsibility lies largely with us, the audience. In an age where information is so easily weaponized, it is important to be conscientious consumers of the plethora of information flooding our screens. More than ever, active engagement, independent research, and a degree of critical analysis must be essential when choosing which information to accept and which sources to trust. We can no longer afford to be passive recipients of information that may harbour active ulterior agendas.

Useful reference links

  1. https://www.nytimes.com/2020/11/05/technology/donald-trump-twitter.html
  2. https://www.thetimes.co.uk/article/russians-spread-fake-news-over-oxford-coronavirus-vaccine-2nzpk8vrq
  3. https://www.loc.gov/law/help/social-media-disinformation/uk.php
  4. https://www.bbc.co.uk/bitesize/guides/zyt282p/revision/2
  5. https://www.statnews.com/2020/06/15/fda-revokes-hydroxychloroquine/
  6. https://www.kcl.ac.uk/investigating-the-most-convincing-covid-19-conspiracy-theories
  7. https://www.legislation.gov.uk/ukpga/2003/21/section/127
