EU’s ‘Right to Be Forgotten’ policy sets bad precedent for free expression

DISCLAIMER: All opinions in this column reflect the views of the author(s), not of EURACTIV Media network.

Last week’s announcement that Google will begin suppressing links to URLs not only for searches on EU country-level domains, but also for searches conducted from within EU countries, is bad news, write Jens-Henrik Jeppesen and Emma Llansó.

Jens-Henrik Jeppesen is director for European affairs and Emma Llansó is director for the Free Expression Project at the Center for Democracy and Technology.

The move is the latest development in the debate over the “right to be forgotten”. In 2014, the Court of Justice of the European Union found that under the data protection directive, people in the EU have a right to demand that search engines de-list URLs linking to information that is “inadequate, irrelevant or no longer relevant, or excessive.”

We are sympathetic to people distressed by information about them in the public domain, we understand the desire to suppress such information in certain contexts, and we support targeted and proportionate policies to protect individuals’ right to privacy.

But our overriding concern with the Google Spain v AEPD and Mario Costeja González ruling that triggered the right to be forgotten is that it enables broad restriction of access to lawful, public information, inevitably curbing free expression.

The court left much room for interpretation about which types of removal requests should be granted and which should not. Companies are now responsible for performing a careful and difficult balancing act between one person’s privacy rights and the rights of others to receive and impart information.

Companies face pressure to minimise costs, creating a powerful incentive to accommodate too many requests and remove too much content rather than take on costly and risky lawsuits and legal challenges. There is every indication that these problems will persist under the new data protection regulation.

The “right to be forgotten” concept, and the de-listing right found in the Costeja case, are problematic because they restrict access to lawful and accurate reporting of information that may be of relevance to the public.

The broader the geographical scope of implementation, the greater this negative impact on access to information and free expression.

The implementation model that Google initially adopted, de-listing results on EU member states’ country-code top-level domains (e.g., google.fr or google.de), was in that sense the least bad option.

Google has faced increasing pressure from the French data protection authority (CNIL) and other DPAs in Europe, who have insisted on global implementation of the Costeja case and asked that results be de-listed on the .com domain, in addition to the domain of the country in which the request was granted and other EU member state domains.

Google’s recent concession fortunately does not go that far, but it is significant in that it extends de-listing to also include search queries made on the .com domain from a location within the country where the de-listing request was granted. So, someone in France searching on google.com for information about a French citizen who has requested that links be suppressed will see modified search results, though someone searching from Italy (or the US) for information on that French citizen will not.

While Google and European data protection authorities have stressed that the aim of the de-listing process is not to suppress political speech or important information about individuals’ political, commercial, or criminal activities, we have already seen that the de-listing policy can be used to restrict access not only to information about an individual but also to news articles about implementation of the ruling.

We fear that in countries that engage in more severe online censorship and routinely restrict access to information, governments will demand that their censorship laws be applied to global domains so that search results can be filtered out if they’re accessed from their countries.

This would be damaging to people living in countries where the Internet still offers ways to get information that is otherwise censored. That would be a serious step back for dissidents and others who seek to promote human rights and democracy in their countries.

If this approach becomes standard practice for Internet companies, it will also become a barrier for new entrants to the online search and content-hosting business. Individual speakers and small businesses would struggle to implement geo-targeted restrictions to content based on some countries’ laws.

Unfortunately, authoritarian regimes can now point to the arguments put forward by CNIL and other data protection authorities and use them to legitimise their own demands. This is surely not what the DPAs had intended, but it may well be a consequence of the line they have taken.

A version of this op-ed was published on the Center for Democracy & Technology’s blog.
