Facebook whistleblower calls on CEO Mark Zuckerberg to resign

Mark Zuckerberg has been the CEO of Facebook since he founded the company in 2004. [Rokas Tenys/Shutterstock]

*This article was updated with comments from Meta, formerly known as Facebook.

Frances Haugen, the whistleblower behind a massive leak of internal documents coming from Facebook, has called on CEO Mark Zuckerberg to resign.

“I think the company will not change if he is CEO – Facebook will be stronger with someone who is willing to focus on safety,” she said at the Web Summit in Lisbon on Monday (1 November).

Haugen, a former product manager at Facebook, accumulated tens of thousands of documents before leaving the company in April following what she presented as a moral crisis prompted by the company’s unethical choices.

She subsequently released the documents to the press, shedding light on controversial decisions by the company that triggered public outrage.

“There has been a pattern of behaviour and things where they have consistently prioritised their profits over general safety. And I feel very grateful that you can take me seriously,” she told the audience.

‘Too big to crash’ and ‘too big to care’, Facebook faces fallout from EU policymakers

Facebook’s service outage on Monday (4 October), combined with a series of revelations about the social network’s management, has prompted policymakers to call for the regulation of online platforms with a shared sense of urgency.

In particular, Haugen pointed to Facebook’s recommender system as prioritising the most polarising and divisive content.

While such content might merely cause fights around the dinner table in the United States, she pointed to the dramatic real-life consequences it can have in more fragile contexts that “almost universally don’t have the basic safety systems from Facebook.”

A Meta spokesperson told EURACTIV the argument “that we deliberately push content that makes people angry for profit is simply untrue. We are on track to spend more than $5 billion on safety and security in 2021 – more than any other tech company – and have 40,000 people to do one job: keep people safe on our apps”.

Haugen cited the case of Ethiopia, where ethnic cleansing has recently taken place. The African country is home to 100 million people, six different languages and 95 dialects, a reality that in her view exposes the inadequacy of Facebook’s content moderation system.

“When the foundation of your security is based on knowing things language by language, it doesn’t scale up to the most fragile houses in the world,” she said.

For Haugen, Facebook presented content moderation as a false choice between censorship and free speech. She noted that non-content-based solutions existed and were more effective, but would require making the platform smaller and slower.

Asked whether she was scared, the whistleblower replied that “there are a million, maybe 10 million lives on the line in the next 20 years. And compared to that, nothing really feels like a real consequence.”

The former Facebook employee also criticised the company’s recent decision to pivot towards the metaverse, or virtual reality.

“There’s a meta-problem at Facebook, which is that over and over again Facebook chooses expansion in new areas over sticking the landing on what they have already done,” Haugen argued.

Facebook changes its name to 'Meta' as part of rebrand

Facebook has officially changed its name to Meta, as the company redirects its efforts towards the development of the “metaverse”, a move critics say is an attempt to distract from recent damaging revelations about its conduct.

“I find it unconscionable that, as you read through the documents it states very clearly that you need more resources on very basic safety systems. And instead of investing in making sure that these platforms are the minimum level of safe, they are about to invest 10,000 engineers on video games,” she said.

The Meta spokesperson dismissed the comparison as ludicrous. “It is not as though a company can only build new technology or invest in keeping people safe. Obviously, we can and must do both of these things at the same time – and we are.”

“These past few weeks have shown how important it is to be clear about what we stand for – and that’s what we are doing by setting out our vision and responsible innovation principles for the metaverse and working with others to build it. There are complex trade-offs involved which is why we need industry regulation,” the spokesperson added.

Haugen noted that Facebook had a different organisational structure than other social networks such as Twitter.

At Facebook, the content moderation team reports to a different branch of the company than the public affairs team, an internal structure that Haugen said has a chilling effect on content moderators when the author of the controversial content is a high-profile politician.

Facebook has previously accused Haugen of releasing only a selection of documents and giving them a negative spin.

For her part, Haugen suggested that all of Facebook’s internal documents be opened to public scrutiny, arguing that companies like Google and Twitter are much more transparent.

Haugen also urged policymakers to oblige Facebook and other platforms to disclose their data.

Transparency obligations and algorithmic accountability are currently being discussed in the Digital Services Act (DSA), an EU legislative proposal on online content and services.

Make online platforms accountable for their algorithms, leading MEP says

EU lawmakers will battle over whether online platforms should be required to open their algorithms to scrutiny, making them accountable for fundamental rights violations, after the European Parliament published its initial revisions to the planned Digital Services Act. The new blueprint also includes stronger opt-in and enforcement measures.

[Edited by Frédéric Simon]
