Sharing coronavirus-related fake news can cost lives, a senior Conservative MP has warned.
Speaking exclusively to The London Economic, Damian Collins MP said that people sharing disinformation were putting their friends or family at risk – and called on tech giants to do more to stem the spread of online conspiracy theories.
The coronavirus crisis has proven fertile ground for conspiracy theories, with baseless claims – such as one linking the roll-out of 5G technology to the transmission of COVID-19 – seeping into the mainstream.
Seeking to tackle the issue, Collins – the former chairman of the Digital, Culture, Media and Sport (DCMS) Select Committee – has partnered with Infotagion, a free-to-access fact-checking website that debunks fake news about the pandemic.
“Disinformation can kill you,” Collins – whose committee led an investigation into misinformation during the EU referendum – said.
“There has been a heightened awareness of disinformation on social media since the Cambridge Analytica scandal and the investigations into Russian interference in the American election.
“While those issues were serious, it was less personal than disinformation during a public health emergency. If you take the wrong advice because you have seen disinformation which has misled you, that could impact on your health and that of your family.
“Some people have died as a consequence of taking chemicals that they believe will protect them from the virus. We’ve seen the conspiracy theory around 5G causing arson attacks on phone masts and assaults on telecoms engineers.
“There are real world consequences of this. The coronavirus is the biggest issue the world faces at the moment, and a lot of what’s being shared on social media about it isn’t true.”
Infotagion flags up false information – often submitted by members of the public – checks its veracity, and directs readers to trusted sources, such as the World Health Organisation's website.
Collins explained: “Infotagion will not only be responding to disinformation and fact-checking it, but creating an open online library of what people have actually seen during the course of coronavirus.
“There’s a mindset change we need to get people to shift to. When they see something that they don’t think looks right, rather than the default setting being ‘well, I’ll share it just in case’, they should actually say ‘unless I’m absolutely sure, I won’t share it’. That’s the best way to stop disinformation spreading.”
WhatsApp has imposed a new limit on the mass-forwarding of messages to slow the dissemination of fake news, and Facebook has deleted the accounts of conspiracy theorists like David Icke, but Collins believes that tech companies must still do more.
Mark Zuckerberg, Facebook’s chief executive, snubbed three invitations from Collins to appear before a parliamentary inquiry into the influence of fake news on British democracy – and the Tory MP urged the social media giant to “step in” to address the current crisis.
“We need the help of the big social media platforms”
Collins said: “We need the help of the big social media platforms, because they can see if disinformation is being spread in a systematic and malicious way by certain groups.
“What we’ve seen in the various investigations that have been done into disinformation campaigns, when it’s done on a large scale it’s often done by quite sophisticated networks.
“That’s where we need Facebook and YouTube to step in and disrupt networks that are doing it maliciously and spreading stuff that isn’t just opinion, but which is clearly false and potentially dangerous to people.
“There should be a responsibility on big tech companies to do that. Where I disagree with Mark Zuckerberg is that I don’t think that deliberately spreading disinformation about a public health emergency is part of someone’s free speech rights.”
In an interview with the BBC last week, Zuckerberg said that Facebook would remove any content likely to result in “immediate and imminent harm”, but added that the company would allow the “widest possible aperture” for freedom of expression.
Freedom of speech vs freedom of reach
However, Collins suggested that such an approach does not go nearly far enough. “There is always that balance between someone’s right to free speech and the harm that speech can cause to others,” he said.
“People have got freedom of speech – but should they have freedom of reach? If someone uses social media to broadcast their opinions and deliberately target millions of people with malicious and potentially harmful information, I don’t think that is the same thing as freedom of speech.
“That is someone seeking to hijack social media to behave like a broadcaster, and I think people with that power should have to exercise responsibility or find that power taken away.”
Parliament needs to “legislate a framework of responsibility” to hold Big Tech to account, he added, accusing Facebook of “marking its own homework” in its response to the Cambridge Analytica scandal.
Collins said: “We need to give a regulator like Ofcom some sort of power of oversight to set responsibilities and obligations on the companies, and give it the power to audit and inspect them.
“Ultimately a platform’s policies really shouldn’t be down to behind-closed-doors meetings between government ministers and Mark Zuckerberg or another company’s executives, trying to make them do a little bit more.
“There should be standards that are set and a regulator should have the power to check whether a company is doing enough to meet those standards.”