Facebook, Twitter and Google are working with a coalition of governments including the UK and Canada to fight misinformation and conspiracy theories around Covid vaccinations.
Convened by the British fact-checking charity Full Fact, the new working group will aim to set cross-platform standards for tackling misinformation – as well as for holding organisations accountable when they fail to do so.
“Bad information ruins lives, and we all have a responsibility to fight it where we see it,” said Full Fact’s chief executive, Will Moy. “The coronavirus pandemic and the wave of false claims that followed demonstrated the need for a collective approach to this problem.
“A coronavirus vaccine is now potentially just months away. But bad information could undermine trust in medicine when it matters most, and ultimately prolong this pandemic.”
As well as the three technology companies, the partnership includes the UK’s Department for Digital, Culture, Media and Sport and Canada’s Privy Council Office, fact-checkers from South Africa, India, Argentina and Spain, the Reuters Institute for the Study of Journalism, and the journalism non-profit First Draft.
Initial funding comes from Facebook, which will help Full Fact draft the framework, due in January 2021. The two organisations have a history together: Full Fact was the first UK fact-checking partner in Facebook’s anti-misinformation programme.
“Working together to tackle misinformation is really important, especially bad content around the Covid-19 pandemic right now,” said Keren Goldshlager, the head of integrity partnerships at Facebook. “We’ve seen huge value in partnering with over 80 independent fact-checkers globally to combat misinformation in 60 languages.
“We welcome this effort to convene more tech companies, fact-checkers, researchers and governments to discuss and develop new strategies, so that we can work together even more effectively in the future.”
Vaccine misinformation was a challenge for social networks long before the imminent arrival of a Covid vaccine made the issue more urgent. For years, Facebook freely allowed anti-vaccination content, even as its founder, Mark Zuckerberg, spearheaded a $3bn charitable effort to “cure all diseases”. In March 2019 it relented slightly, banning anti-vax ads that contained misinformation about vaccines; in October this year it went further, banning all anti-vax advertising except that with a political message.
But “organic content” – posts and groups advocating against vaccines – is still allowed. Misinformation in that category is not explicitly banned, although it is eligible to be flagged for review by third-party fact-checkers.
YouTube, too, has only recently begun to take serious action against vaccine misinformation. In October, a week after Facebook’s policy change, Google’s video-sharing site announced a ban specifically on misinformation about Covid vaccinations. Under the policy, videos may not contain false claims that a Covid vaccine will kill people, cause infertility, or implant microchips in recipients.
The latest actions come after the UK Labour party called for sanctions against social media companies that fail to “stamp out dangerous anti-vaccine content”.
In a letter to the culture secretary, Oliver Dowden, Labour’s Jo Stevens and Jonathan Ashworth warned that the spread of such mis- and disinformation presented a “real and present danger” to public health.
“It has been clear for years that this is a widespread and growing problem and the government knows, because Labour has been warning them for some time, that it poses a real threat to the take-up of the vaccine,” Stevens said.
“This is literally a matter of life and death and anyone who is dissuaded from being vaccinated because of this is one person too many.”