The German NetzDG law to counter illegal online speech has become a prototype for internet censorship in authoritarian states. The Commission’s proposal for the new Digital Services Act must avoid this template, write Jacob Mchangama and Natalie Alkiviadou.
Jacob Mchangama is Director at the think-tank Justitia. Natalie Alkiviadou is Senior Research Fellow at The Future of Free Speech.
Since the twin shocks of Brexit and the election of Donald Trump in 2016, European democracies have intensified efforts to counter the forces of populism that challenge the established order. This includes fighting back against the dark side of social media and what many politicians – including Chancellor Merkel and President Macron – perceive as a digital tsunami of disinformation and hate speech.
How to counter illegal and harmful online content is also a key issue in the Commission's soon-to-be-published proposal for a Digital Services Act. But in a digitally connected world, what happens in Brussels or Berlin has global repercussions.
In 2017, Germany adopted the 'Network Enforcement Act', or NetzDG. It obliges social media platforms with at least 2 million users to remove illegal content – including hate speech and religious offense – within 24 hours, or risk steep fines of up to €50 million.
In May 2019, Justitia issued a report which documented that at least 13 countries (plus the EU) had adopted or proposed models similar to the NetzDG matrix. According to Freedom House’s reports on Press Freedom/Freedom on the Net, five of those countries were ranked “not free” (Honduras, Venezuela, Vietnam, Russia and Belarus), five were ranked “partly free” (Kenya, India, Singapore, Malaysia and Philippines), and only three ranked “free” (France, UK and Australia). Most of these countries explicitly referred to the NetzDG.
Almost a year later, Justitia's Future of Free Speech Project has updated its report, which paints a bleak picture of the continued erosion of free speech norms. A total of 11 new countries have followed the German template, whether by conscious emulation or not. Four of them make specific reference to the NetzDG (Kyrgyzstan, Brazil, Austria and Turkey).
Only one of the new countries is ranked "free" (Austria), seven are "partly free" (Mali, Morocco, Nigeria, Cambodia, Indonesia, Kyrgyzstan and Brazil) and three are "not free" (Ethiopia, Pakistan and Turkey).
All of these countries require online platforms to remove vague categories of content, including "false information" (Kyrgyzstan, Nigeria and Morocco), "blasphemy" or "religious insult" (Indonesia, Austria and Turkey), "hate speech" (Austria and Cambodia), "incitement to generate anarchy" (Cambodia) and violations of "personal and privacy rights" (Turkey). Most of these categories are difficult to reconcile with international human rights standards.
Worryingly, few of these countries have in place the basic rule-of-law and free speech protections built into the German precedent, as witnessed by Turkey's rapidly shrinking space for both online and offline dissent.
In fact, the Turkish law has been described as “the Worst Version of Germany’s NetzDG” since it essentially forces online platforms to act as the privatized censorship arm of President Erdogan’s notoriously thin-skinned and paranoid government.
The collateral damage of the NetzDG is not limited to the countries mentioned above, since the vast majority of content deleted by social media platforms is removed pursuant to the relevant platform’s community standards, rather than national laws.
In the first quarter of 2018, when the NetzDG had just entered into force, Facebook removed 2.5 million pieces of content for violating its global Community Standards on hate speech. That number increased to 4.1 million and 9.6 million in the first quarters of 2019 and 2020 respectively, before surging to more than 20 million pieces of content deleted in the second quarter of this year.
Moreover, between the first quarter of 2018 and the second quarter of 2020, the share of deleted content that Facebook's automated systems flagged before any complaint from human users rose from 38% to 94.5%.
This suggests that the pressure on social media companies to remove more illegal speech has impacted the enforcement of community standards more significantly than the direct enforcement of the NetzDG and other such laws. After all, if a platform like Facebook defines “hate speech” more broadly than legally required and is able to remove almost 95% of all such speech before any users have a chance to view it, relatively few complaints should be expected.
This "better safe than sorry" platform policy may result in "over-implementation": defining "hate speech" so broadly that it covers content that would be protected under international human rights law, such as Article 19 of the International Covenant on Civil and Political Rights, which provides significantly stronger free speech protections than Facebook's Community Standards.
Through this process, laws such as the NetzDG result in "moderation without representation" for social media users in countries whose governments are less inclined to clamp down on controversial speech.
The German government's adoption of the NetzDG was a good-faith initiative to curb online hate, and the collateral damage it has wrought was never intended. But the law has provided a blueprint for internet censorship that is being used to target dissent and pluralism in countries where social media is the only way of speaking truth to power.
This development creates a regulatory race to the bottom that undermines freedom of expression as guaranteed by international human rights standards.
It would be misleading to blame Germany for the draconian NetzDG clones adopted in authoritarian states and flawed democracies. But the fact that the spread of illiberal norms based on the NetzDG precedent has continued unabated should give European democracies and the Commission food for thought when it comes to countering illegal online content while preserving free speech.
[Edited by Samuel Stolton]