After Trump: Rule of law and Big Tech regulation

DISCLAIMER: All opinions in this column reflect the views of the author(s), not of EURACTIV Media network.


The stunning events in Washington on 6 January not only showed the fragility of modern democracies but also demonstrated the untamed power of Big Tech, write Marc Sundermann, Paolo Cesarini and Christophe Leclercq.

Marc Sundermann is a law expert and Director of Europe’s MediaLab. Christophe Leclercq founded the EURACTIV Media Network and chairs the MediaLab. Both were on the EU’s High-Level Expert Group on disinformation. Paolo Cesarini is a former European Commission official.

EU Commissioner Thierry Breton and Vice-President Věra Jourová were clear in their reactions to the insurrection. And they are absolutely right: it is scary to see how, in the blink of an eye, even the most powerful persons on earth can be silenced by the arbitrary decisions of social media platforms’ CEOs, without due process.

But what are the conclusions?

From symbols of free speech and democracy in the early 2000s, these platforms have turned into oligarchies controlling the public space.

Their extraordinary success is largely based on business models built on avoiding any accountability. Their terms of service are their Bible. The liability exemptions of US legislation and Europe’s e-Commerce Directive are their Holy Grail.

As private companies, their legitimate goal is to maximise profits. Thus everything is fair game for their business: compromising with authoritarian regimes, or blackmailing democratically elected governments when they try to regulate and rebalance economic inequalities.

Self-regulatory efforts have proven ineffective in recent years, and social media platforms are quite flexible with their own rules. For years, Donald Trump breached their terms without consequences.

Only when it was clear that he would be out of power did the platforms dare to act, unilaterally and without any checks and balances.

Sold as a defence of democracy, this is appeasement of the next administration, certainly not the advent of virtuous behaviour. Social media is becoming a systemic threat to democracy.

Start from the basics: illegal is illegal?

What is illegal offline should also be illegal online. Guaranteeing the rule of law is key for every democratic sovereign state, and this includes enforcing the rules. Platforms should not be allowed to put their own terms of service before the rule of law.

The rule book to tackle illegal content or activities already exists. The problem, however, is speed. Dissemination online is much faster than offline.

And to avoid accountability and continue “business as usual”, platforms may be slowing down court action (remember how difficult it was to establish where Facebook could be sued in Europe, which led to lengthy court disputes).

Another problem is the vast grey zone in which information manipulation, carried out through covert means by various domestic or foreign actors hostile to democracy, continues largely unchecked and undisturbed by platforms’ content moderation practices.

Neither CEOs nor politicians to censor: authorities to help steer

In these matters, European public opinion is moving from ‘laissez-faire’ to asserting sovereignty. This sounds good in principle. But will it lead from one form of censorship to another? From Charybdis to Scylla?

The solution is neither. Legal processes are necessary to weed out what is clearly illegal, for example calls for violence. But they are slow, and this urgently needs to be addressed.

When it comes to countering information which may be harmful but not per se illegal, one should promote good information while not censoring the ‘bad but legal’.

Censorship on steroids is not the solution. There should remain a modicum of free expression for outlying views or doubtful statements, while the virality of fake news is reduced.

Is this doable? Yes!

Take, for example, online ads that are outdated: algorithms know how to serve them to fewer viewers. Or take an analogy from financial markets: junk bonds with poor ratings are unattractive, but not outlawed. So, how to tackle junk information?

What we really need is moderation on steroids: external standards, based on assessing the trustworthiness of sources and not subject to CEOs’ whims. For normal journalism, apart from public broadcasting, standards are set by the profession itself, and the quality is mostly acceptable.

For social media content, what we need is independent agencies setting indicators which are, in turn, used by algorithms. There could be an open market for such indicators, and public authorities could set ‘must carry’ obligations for platforms.
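To make the idea concrete, here is a minimal, purely illustrative sketch (in Python, with hypothetical names and scores, not any platform’s real system) of how an externally supplied trustworthiness indicator could scale a platform’s ranking signal, demoting low-trust sources without removing them:

```python
# Hypothetical sketch: an external trustworthiness indicator, set by an
# independent agency rather than by the platform, dampens the algorithmic
# reach of low-rated sources without censoring them.
from dataclasses import dataclass

@dataclass
class Post:
    source: str        # publishing outlet or account (illustrative)
    engagement: float  # the platform's internal engagement prediction

# Indicator scores in [0, 1] as an independent agency might publish them.
TRUST_INDICATORS = {
    "established-newsroom.example": 0.9,
    "anonymous-junk-site.example": 0.2,
}

def amplification_score(post: Post, default_trust: float = 0.5) -> float:
    """Scale the engagement signal by the external trust indicator.

    Low-trust content is still carried, but its virality is reduced,
    much like a junk bond stays tradable yet unattractive.
    """
    trust = TRUST_INDICATORS.get(post.source, default_trust)
    return post.engagement * trust

posts = [
    Post("established-newsroom.example", engagement=0.8),
    Post("anonymous-junk-site.example", engagement=0.8),
]

# Same predicted engagement, but the low-trust source is demoted in the feed.
for post in sorted(posts, key=amplification_score, reverse=True):
    print(post.source, round(amplification_score(post), 2))
```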

Can the Digital Services Act and the Digital Markets Act be the solution? They can only be welcomed, but they cannot address all the challenges.

They are legislative monsters that risk distracting from the urgent tasks of rebalancing the online ecosystem and supporting quality information to fight misinformation.

Appeasing politicians with humble statements is an art that Big Tech knows all too well, but it is rarely followed by action: let’s not be naïve again.

Existing laws are a basis for acting now. At the end of the day, it is clear: we need accountability in the system, and a faster one. Otherwise, it doesn’t work.
