The EU needs a course correction in its approach to disinformation and fake news in the wake of the coronavirus pandemic, writes Juliane von Reppert-Bismarck.
Juliane von Reppert-Bismarck is the founder and CEO of Lie Detectors, a journalist-led news-literacy initiative in Europe.
It didn’t work after foreign trolls interfered with the 2016 US elections, and it won’t work with the coronavirus. The reflex reaction to the infodemic of COVID-19-related fakes – from bogus garlic-based cures to toxic 5G conspiracies – is to deploy vast armies of enforcers online and requisition infinite data sets from platforms, and to hope it works better this time.
The instinct of regulators and policymakers worldwide to hold large platforms like Facebook and Google to account for their role in the proliferation of online disinformation is right. But the temptation to focus on quick fixes poses stark risks. With the necessary political will, better solutions exist on national, EU and international levels that will help rein in the epidemic of disinformation that is sweeping away trust in established facts, in scientific method and in democratic institutions designed to protect us.
The European Commission’s recent Joint Communication on how to tackle COVID-19 disinformation is a perfect example of the problem. It bears strong resemblances to Brussels’ largely unsuccessful efforts two years ago to tackle disinformation. The EU, emerging from lockdown, is starting the long, hard race towards keystone regulatory proposals due at the end of the year. But it has set off in the wrong direction.
The Commission’s strategy, created in 2018, introduced a voluntary code of conduct for the platforms, relying on them to counter the surge in disinformation and fake news. It focused on content moderation and platform transparency via the handover of data, and allocated resources to pro-EU messaging. But disinformation has continued to adapt, proliferate and confound enforcers. Even if the big platforms had complied fully with the spirit of that initiative – and the Commission itself rebuked them for failing to do so – it still wouldn’t be enough.
To demand that Facebook increase its army of fact-checkers and YouTube stop recommending conspiracy theories, and to promise to pursue Chinese and Russian trolls, makes good headlines. But in the multi-faceted world of disinformation, a narrow focus on foreign actors leaves us exposed, while a pro-EU PR campaign risks simply adding white noise. And Facebook and Google cannot delete their way out of this problem. Moderation and fact-checking can only ever achieve so much, given the speed and sophistication with which disinformation travels and the ever-changing platforms used to circulate it. News consumption is increasingly visual and increasingly private. Young people in particular shun regularly moderated spaces such as Facebook and Twitter in favour of the likes of WhatsApp, Instagram, YouTube, TikTok, Twitch and even gaming platforms. So whatever information the big platforms hand over, academic study of large but shallow data dumps will always risk lagging behind what is happening online.
Instead, policymakers should take on the attention economy on two fronts: the supply side and the demand side. Tackling the supply side – addressing content liability and particularly the “monetising of lies”, as the European Commission has called it – would aim to disrupt the model of dominant market players stoking outrage for revenue. The Commission’s COVID strategy asks platforms to come up with their own plan on this, but that risks yet more years passing with no action. Following the money and applying antitrust principles as they develop is a more certain avenue to ensuring this happens. The media industry too, as it has done in Australia, needs to be prepared to challenge the platforms over the dissemination of news.
An obvious vehicle will be this year’s upcoming EU digital services legislation. Let us be clear: this is fruit at the very top of the tree. It will hurt platforms’ bottom line and be hard to get past their legal and lobbying muscle. But a fully comprehensive anti-disinformation strategy must at least attempt to reform the outrage industry at source.
In any case, whatever regulators do on the supply side must be met by at least as much attention on the demand side – the active consumption and sharing of harmful and fake content. Citizens must be equipped with critical media literacy skills to enable them to recognise and resist manipulative content.
Our work aims to build such resilience in Europe by sending journalists into schools – using videoconferencing during the pandemic – to educate schoolchildren and their teachers, and we are planning a project with senior citizens. Many students we meet have a natural grasp of the dynamics of online information. In these days of pandemic, interest in learning critical skills to tell apart facts, opinions and lies is particularly intense – among both the students and their teachers. Did Putin really release lions in Moscow to enforce lockdown, as shown in a Facebook photo now circulating on TikTok? Can I cure myself with bleach? These are questions that students line up to ask journalists via videoconference. There is also a keen interest in the motives behind disinformation. “Why would this unnamed nurse send me a WhatsApp chain letter if it’s not true?” or “How do I know whether a fact-checker is truthful?” What interests children eventually interests teachers and time must be spent capturing this momentum. Equally, older people in care homes are particularly susceptible to disinformation about COVID – fake news that could cost lives.
This is long-term work to address a long-term problem. Given current momentum, policymakers will be thanked for taking sustainable action. And this needs to be expanded into nationwide – in fact Europe-wide – changes to the education system. Critical media literacy needs to be integrated into school curricula, OECD school-ranking measures and teacher training, and it must remain strictly independent of political and commercial interests. Platforms may have funds for corporate social responsibility, but the education that safeguards our democracy is too important a responsibility to be outsourced to profit-driven companies.
The EU needs a course correction in its approach to disinformation and fake news. The repetition of mistakes it made in the past risks both squandering political energy and capital, and driving democratic institutions further into a supplicant relationship to Big Tech. Deleting and data dumps will only take us so far. Combating fake news means trying to disrupt the outrage industry and educating its consumers. There is still time for the Commission to increase its ambition. But it needs to start now.