Former Facebook employee Frances Haugen called on EU lawmakers to “set a gold standard” and take a tough stance in regulating big tech and safeguarding democracy during her testimony before the Parliament on Monday (8 November).
The Facebook whistleblower, who has leaked thousands of internal documents to the press, was invited to provide her testimony in the context of the ongoing negotiations on the Digital Services Act (DSA), a major EU proposal to regulate online content and services.
“The Digital Services Act that is now before this parliament has the potential to be a global gold standard. It can inspire other countries, including my own, to pursue new rules that would safeguard our democracies, but the law has to be strong and its enforcement firm. Otherwise, we will lose this once in a generation opportunity,” Haugen told MEPs.
Haugen praised the content-neutral approach of the DSA, stressing the need to make Facebook more transparent and its data accessible to researchers, NGOs, and journalists. She warned against the exception for trade secrets, arguing it would provide a loophole to refuse access to data.
“Only Facebook gets to look under the hood. Facebook cannot remain the judge, jury, prosecutor, and witness,” Haugen said.
Haugen cited Google and Twitter as examples of companies that have made their data available and are therefore accountable to public scrutiny.
A spokesperson from Meta, known as Facebook until recently, told EURACTIV that the company has “established industry-leading efforts to provide visibility into the impact of our products through our Open Research and Transparency Initiative and Community Standards Enforcement Report.”
For the whistleblower, the risk posed by Facebook is a systemic one, hence the DSA should not be limited to illegal content but should also cover the recommendation of harmful content that violates the platform’s terms and conditions. The argument is that promoting harmful content is at the core of Facebook’s business model, as it generates more engagement and hence more profit.
Facebook argues that if users found the platform full of harmful content, they would not use it in the first place.
“We’ve always had the commercial incentive to remove harmful content from our platform. People don’t want to see it when they use our apps and advertisers don’t want their ads next to it,” the Meta spokesperson added.
Haugen also criticised a potential opt-out from content moderation rules for media outlets, noting that such an exception would undermine the effectiveness of the law.
Asked by Christel Schaldemose, the lead MEP on the DSA, for her view on liability provisions for members of platforms’ senior management, Haugen offered a cautious response.
“If you feel really confident in the law that you’ve passed, then executive liability can be a good thing, or at least makes them take it a lot more seriously. But at the same time, if the law is not good, it can cause counter and side effects,” she noted.
Another argument the former Facebook employee raised is that the company does not provide equitable safety standards across the board, concentrating its efforts in particular on countries that the social media company fears might regulate it.
She stressed that content-based solutions do not scale up to more fragile places and that AI-driven tools fail to understand the context. In her view, the answer would be to make the platform smaller and slower, but that would go against the company’s interest.
“We are on track to spend more than $5 billion this year alone on safety and security — more than any other tech company, even adjusted for scale — and have changed our systems to prioritise posts between family and friends knowing it would hurt time spent on Facebook,” the Meta spokesperson added.
In addition, Haugen expressed ‘extreme concern’ over Facebook’s bid to spearhead the metaverse, a 3D version of the internet based on Virtual Reality (VR), considering it problematic to give Facebook access to users’ homes via VR sensors.
The whistleblower also intervened in the debate around forcing platform users to provide identification that could be used by law enforcement. In her view, it would not be a solution to prevent online abuse, as abusers would use IDs from countries where it is easier to get them and electronically change their geographical location through a Virtual Private Network (VPN).
Haugen also cautioned against interoperability of online platforms, a flagship proposal from the more progressive political groups. She cited the example of email systems, which are interoperable because the content cannot be changed after it is sent.
In her view, making platforms interoperable would mean it would be impossible to take the content down at a later stage, as platforms could be built in a foreign jurisdiction precisely for the purpose of harvesting such data.
The DSA is currently being discussed in the EU Parliament and Council. While the Council might already reach an agreement next week, MEPs are further behind and the different political groups are struggling to agree on key points of the file, EURACTIV has learned.
[Edited by Alice Taylor]