Facebook to test reduced visibility of political content in Ireland, Spain and Sweden

Facebook has been testing a new policy of limiting the reach of political content in the United States and other countries since February 2021. [Shutterstock]

Facebook is reportedly extending its test of making political content less visible in the News Feed to at least three EU countries, following positive user feedback.

The social media giant’s plans to expand its policy to Ireland, Spain and Sweden were reported by American news website Axios. The policy of reducing the visibility of political stories was launched at the beginning of the year in the United States, Canada, Brazil and Indonesia.

“We plan to keep civic and political groups out of recommendations for the long term, and we plan to expand that policy globally. To be clear, this is a continuation of work we’ve been doing for a while to turn down the temperature and discourage divisive conversations and communities,” said Facebook CEO Mark Zuckerberg when announcing the policy last year.

Facebook’s move seems to be driven by the polarised US political climate, which reached a peak in the Capitol Hill attack on 6 January, when supporters of outgoing President Trump stormed the US Congress. Social media platforms were identified as key factors in inciting the rioters, and are frequently criticised for spreading political disinformation.

The online platform says the positive user feedback is encouraging it to expand the pilot to several other countries, including Ireland, Spain and Sweden. “Before believing what FB’s user surveys say, you would want to see transparency re the methodology,” argues Robin Mansell, professor of new media at the London School of Economics and Political Science (LSE).

Political content accounts for 6% of all content on Facebook, according to the platform’s own assessment.

Depoliticised content

Emma Llansó, Director of the Free Expression Project at the Centre for Democracy & Technology (CDT), welcomed the fact that the social network was testing “how users respond to political content.”

“Engagement-driven ranking of content on social media can be exploited by bad actors seeking to drive disinformation and divisiveness with sensational posts,” Llansó added.

However, for Ralph Schroeder, professor at the Oxford Internet Institute, the policy “is not a long-term strategy for Facebook or other companies, since they also want to be a means for political expression, unless the content is harmful.”

Schroeder notes that “like other digital media, Facebook has problems in dealing with troublesome content,” adding that content moderation for these platforms is an “ongoing dilemma”.

Political content definition

Under the new policy, Facebook users will have to proactively seek out and join political groups, a measure intended to avoid the creation of echo chambers and rabbit holes in which the algorithm increasingly suggests extremist or harmful content.

This approach is seen as an “improvement” by Josephine Ballon, head of legal at the German NGO HateAid. However, Ballon stresses that “Facebook does not provide a definition of ‘political groups’ or ‘political content’ on their website or in the community guidelines or elsewhere.”

“It is not clear if that includes only content and groups set up directly by political parties or also other initiatives/organisations that are related to a party. Without a public definition the criteria that is used to assess this question is completely untransparent, cannot be verified and we also have to assume that it can be evaded or circumvented easily,” Ballon added.

Similarly, for LSE’s Mansell the risk is that “what is defined as political by FB may be different from what is deemed necessary for political deliberation.”

CDT’s Llansó also urged the social media company to empower users to select their own preferences, as “ultimately, Facebook should be working towards giving users more control over what’s in their NewsFeeds.”

Discussions on transparency requirements for online platforms’ algorithms, and on enabling users to modify the preferences set by recommender systems, are currently taking place in the context of the EU’s Digital Services Act.

Political news outlets

Llansó urged the social network to be transparent about how it defines political content, because the policy “could have huge ramifications for journalism, advocacy, and discussions of political topics in general.”

News media reporting on politics might be the big losers from Facebook’s new policy, as reducing the visibility of political news might mean a drop in traffic for online media. Facebook has acknowledged these concerns and committed to a “gradual and methodical rollout.”

Wout van Wijk, executive director at News Media Europe, described Facebook’s approach of suppressing an entire category of content as a “very slippery slope”.

“It seems Facebook cannot get a grip on the proliferation of disinformation and hate speech on its platform and is looking at these rather extreme and damaging measures in an ultimate effort. Once again, it’ll be big tech dictating what type of content is consumed by Europeans, and what not,” van Wijk warned.


[Edited by Benjamin Fox]