As 2020 draws to a close, I’m going to take some time out from this weekly missive. Reflecting on the year in misinformation, the clear global frontrunner is of course COVID - everything from disputes about the origins of the virus to supposed cures and even denial of its very existence. All of it has been part of what researchers call our information disorder.
In the last couple of months the US election dominated the attention of many journalists and fact checkers. But in the last two weeks, with the positive news about vaccines, it’s clear that COVID misinformation is back with a vengeance (not that it ever really went away). More on that below.
But in thinking back on this weekly write-up, I wanted to consider solutions. This is a quick review and I’m surely missing some things (please write back if so), but let’s start here: misinformation is a “wicked problem”. There’s no one simple solution. It’s going to take many solutions implemented across various disciplines for us to make progress in moving from information disorder to something resembling information order.
I think we as researchers still have a job to do to paint a picture of what a digital, social media-fuelled version of such a world looks like, but there are some solutions being mooted.
In a recent article for Foreign Affairs, Nina Jankowicz offered the Biden administration some policy tips. She called for "new governmental structures and legislation." Considering the threat seriously would mean creating "a counter-disinformation czar within the National Security Council and setting up a corresponding directorate" to respond to disinformation campaigns. Furthermore, there would need to be "a federal commission for online oversight and transparency," as well as investment in digital literacy and public media. All of which makes sense.
The Forum on Information & Democracy offered 250 policy recommendations, many of which focus on transparency and the creation of a set of Human Rights Principles for Content Moderation based on international human rights law. Again, there’s a lot of sense in all this.
But I wonder if such suggestions are 1) functionally implementable (transparency may conflict with aspects of data privacy, for instance) or 2) likely at all (will platforms - and their shareholders - have an incentive to take on board these recommendations?). Regulation can take us far, but the law is likely to always lag behind innovation at the cutting edge of the tech industry.
Facebook is trying out a “Supreme Court” to help solve these issues. There’s lots of good thinking in here but again, this process will be long and arduous.
There may be other ways of looking to solve these problems: I’m looking at the likes of Ethan Zuckerman and Eli Pariser, who are focused on new types of digital public spaces. I’m inspired by Joan Donovan’s call for a development of librarianship for the internet.
Within the platforms themselves, there is a clear desire for AI solutions, with the idea that a fully automated approach to content moderation might be possible. But if anything, 2020 has shown that platforms will always need human moderators.
That’s a key takeaway from these excellent slides by Amy X Zhang: "Get rid of the notion that you will be able to remove the need for humans to do this work". This is a great presentation on the challenges platforms face and practical attempts that have been made so far.
Even as we build AI solutions, however, the role of a journalist is still paramount. Consider this great report from First Draft about how QAnon content endures on social media through code words. “The next frontier in QAnon moderation may require further investment in detecting QAnon-related images, videos and keywords.”
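To see why coded language is so hard to moderate automatically, here is a minimal sketch of keyword-based detection with simple character-substitution normalisation. The watchlist terms and substitution table are hypothetical placeholders of my own, not anything First Draft or any platform actually uses:

```python
import re

# Hypothetical watchlist of coded terms (placeholders for illustration only)
WATCHLIST = {"stormcoming", "greatawakening"}

# Undo common character swaps used to evade exact-match filters (e.g. 0 -> o)
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                               "5": "s", "7": "t", "@": "a", "$": "s"})

def normalize(text: str) -> str:
    """Lowercase, reverse leetspeak-style substitutions, strip non-letters."""
    text = text.lower().translate(SUBSTITUTIONS)
    return re.sub(r"[^a-z]", "", text)

def flag_coded_terms(post: str) -> set:
    """Return watchlist terms whose normalised form appears in the post."""
    flat = normalize(post)
    return {term for term in WATCHLIST if term in flat}
```

The obvious weakness is the point of the First Draft report: as soon as a term lands on a static list, communities coin new words and visual formats, so lists like this need constant human curation.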
I’m betting, just as techies are often ahead of their legal colleagues, that journalists will often be ahead of their platform friends in understanding and researching this evolution of language and visuals.
In reporting on the latest misinformation problems, journalists may be a thorn in the side of many a platform executive; another way of looking at it is that they are performing a vital public service.
(Another incentive issue worries me here: we see Facebook get the most coverage for its problems, and that’s largely because journalists use CrowdTangle. Similarly for Twitter and TweetDeck. There’s no similar tool for YouTube, for example, and so it gets less coverage, even though we all know it has a big problem here. But why would YouTube create such a tool when it would only bring more scrutiny and the kind of stories we regularly see about Facebook? It’s no coincidence that YouTube CEO Susan Wojcicki has not been called before Congress like Jack Dorsey or Mark Zuckerberg in recent weeks. Kinzen has a beta of such a tool; read about it here.)
Kinzen is working on aspects of these solutions, and we’ve recently raised $2.2m, so watch this space for more in 2021.
Until then, I’ll finish with a brief round-up of vaccine misinformation this week. Have a great end to 2020; let’s hope 2021 is better for all of us.
COVID Vaccine Propels More Misinformation
First Draft has published research on recent trends within COVID vaccine misinformation. To summarise, core narratives include:
A Covid-19 vaccine is unnecessary; the immune system is superior
“Big Pharma,” politicians and other key figures behind vaccines are driven by profit rather than public health concerns
Corrupt media outlets serve as “mouthpieces for Big Pharma”
Mandatory Covid-19 vaccines are tools to control populations
Covid-19 vaccines are immoral: they are made from aborted fetuses
One of the COVID vaccine narratives to watch is a comparison with thalidomide. As AP finds, there are significant differences between how the COVID vaccine was developed and how the drug thalidomide was, rendering any such comparison misleading.
A common COVID vaccine misinformation narrative is that it will somehow change your DNA. Because of the recurring nature of these claims, Full Fact has just provided another update showing that RNA vaccines do not alter DNA.
NBC News reports on how the anti-vaccine movement has been growing on Facebook.
With the recent vaccine news, there’s been a big increase in other COVID misinformation. For example, Politifact debunks claims on Facebook that wearing masks will kill people.
The conspiracy theory that COVID is entirely a hoax is so prevalent right now that multiple fact checkers are producing debunks of it at the same time. Here, the AP explores how an innocent selfie from a doctor led to such claims; here, Reuters reports on a Facebook video featuring claims that COVID is a scam and the vaccine will implant microchips; Politifact debunks a former Pfizer employee who said that the pandemic was effectively over in the UK; and Full Fact debunked another Facebook video featuring claims that COVID is not contagious and is caused by radiation.