Whistleblowers are showing the world why Facebook is toxic. The European Union has a chance to fix it.

DISCLAIMER: All opinions in this column reflect the views of the author(s), not of EURACTIV Media network.

[Photo: Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee at the Russell Senate Office Building in Washington, DC, US, 5 October 2021. EPA-EFE/Matt McClain / POOL]

Policymakers across the world are alarmed by the algorithmic harms revealed by Facebook whistleblower Frances Haugen, but only the EU has an immediate opportunity to enact legislation that would meaningfully address them, write Katarzyna Szymielewicz, Emma Ruby-Sachs, Tanya O’Carroll and Nienke Palstra.

Katarzyna Szymielewicz is the executive director of Panoptykon Foundation.

Emma Ruby-Sachs is the executive director of SumOfUs.

Tanya O’Carroll is an independent expert on technology and human rights.

Nienke Palstra is a senior campaigner at Global Witness.

In her testimony before the US Congress, Haugen was clear that Facebook cannot be trusted to oversee itself: “I believe there needs to be a dedicated oversight body because right now the only people in the world who are trained…to understand what’s happening inside of Facebook are people who grew up inside of Facebook,” she told Senators. When asked whether the solution would be a regulatory agency inside the US federal government, her answer was short and precise: “Yes.”

But in practice, the best chance of achieving oversight of Facebook anytime soon is here in Europe, not the US. The Digital Services Act (DSA), a draft law promising to overhaul the rules for the largest online platforms, is poised for debate in the European Parliament on 8 November. The timing could hardly be more opportune, which explains why Haugen is travelling to Brussels to address the European Parliament’s IMCO committee in person on the day of the originally scheduled vote.

Years in the making, the DSA could fix Big Tech’s broken business model by introducing a range of powerful new tools that would force the world’s largest platforms to design their core mechanics (such as advertising and recommendation algorithms) with a view to their impact on society, not just the company’s bottom line. Three tools in the DSA’s toolbox are particularly important for mitigating the algorithmically charged harms that are currently undermining human rights, destabilising democracies and devastating people’s lives.

First are the new ‘risk assessment’ obligations. Since hate speech, violence, and disinformation are facilitated by the design features of platforms like Facebook, there needs to be a clear obligation imposed on the companies to identify, prevent, and mitigate the risk of such content being distributed and amplified by their products. Through Articles 26 and 27 of the DSA, platforms would be forced to take into account the ways in which their design choices and operational approaches can influence and increase these risks, and then adopt measures to mitigate them.

Systematically reviewing and repairing the design choices that amplify and spread harmful and illegal content is better than playing an endless game of ‘whack-a-mole’ with individual pieces of harmful content, a game that inevitably leads to complex debates over free speech and censorship.

Second, the law proposes to regulate the ‘black box’ of Facebook’s recommender systems for the first time. These systems dictate what each user sees, based on their past behaviour and a profile of predicted interactions. Article 29 of the DSA would require platforms to provide users with clear information about the main parameters used in such recommender systems, giving individuals much greater control over the way these currently invisible systems shape not only the content they see but, at times, their worldview.

This is an essential step — but it is not enough. The proposed DSA clause should be improved by mandating that recommender systems no longer be based on data mining and profiling, unless the user has expressly given specific, informed, and unambiguous consent (i.e., turning data profiling off by default). Doing so would be a critical step towards neutralising a key mechanism in the way that harmful content currently spreads on Facebook. Even more significantly, the clause should allow third parties to offer alternative recommender systems on very large online platforms, which would increase the diversity and choice available to users over time. 

Third, any regulation is only as good as its enforcement. The DSA must therefore create stronger powers for regulators to meaningfully supervise Facebook and to take action when the company fails to address identified risks. One effective way to do this would be to establish independent, technically capable oversight at the EU level, as proposed by MEP Alexandra Geese of the Greens. A fallback option would be to give the European Commission greater powers to step in when a platform fails to act on identified risks. Either way, the supervising authority needs both independence and muscle to hold to account some of the world’s most powerful companies.

MEPs must also be extremely cautious about the eleventh-hour proposal from the JURI Committee to include a media exemption clause in the DSA. Such a clause would not only shield thousands of orchestrated disinformation and hate networks from the DSA’s measures; it would also prevent platforms from taking voluntary remedial action themselves. In short, we would end up worse off than before on the very issues that Haugen, and whistleblowers before her like Sophie Zhang and Yael Eisenstat, risked so much to expose.

Given their unprecedented profits, it comes as little surprise that Big Tech’s powerful lobbying bloc is working around the clock to co-opt the legislative process and water down any DSA rules that might threaten their surveillance-based business model. Yet as Haugen’s disclosures made plain, it is precisely this model that threatens human rights and the public interest. That is why over 80 European organisations have signed a declaration demanding a DSA that protects us from Big Tech’s harms.

On 8 November, MEPs will hear directly from Frances Haugen in the lead-up to the critical vote on the DSA. Let’s hope one former Big Tech insider can convince them of what is at stake if they fail to seize this historic opportunity.
