Commission questions YouTube, TikTok, and Snapchat over recommender algorithms

The European Commission has sent a request for information on the design and operation of recommender systems to YouTube, Snapchat, and TikTok, a senior Commission official told Euractiv on Wednesday (2 October).

Eliza Gkritsi | Euractiv | 02-10-2024 13:04

The European Commission sent a request for information on the design and operation of recommender systems to YouTube, Snapchat, and TikTok on Wednesday (2 October).

The requests were made under the Digital Services Act (DSA), the EU's landmark content moderation regulation.

Recommender systems, the algorithms that suggest content to users, are at the 'heart' of the systemic risks that platforms pose to users, a senior Commission official said on Wednesday.

Recommendation algorithms are widely seen as a potential source of harm, with digital rights and research organisations pointing to examples ranging from extremism to vaccine disinformation.

The requests are meant to both deepen and broaden the Commission's understanding of these systems, probing in particular their underlying structure and whether they drive users towards harmful content, the official said.

YouTube and Snapchat will answer questions about the parameters used by their recommendation algorithms and the algorithms' role in amplifying systemic risks, including those related to elections, mental health, and the protection of minors, the Commission said in a press release on Wednesday. Under the DSA, users should be given options as to which recommender algorithms are used, an official said.

TikTok will also be asked to provide information on whether its systems can be manipulated by malicious actors, as well as on their effects on civic discourse, the official said. A previously opened investigation looked into risks concerning the protection of minors, such as suicide-related content, they said.

"For Snapchat, we are asking questions on whether the recommended systems whether the recommended systems recommend illicit drugs" and content, they said.

The Commission is not addressing Meta, the parent company of Facebook and Instagram, because the same questions are covered in the formal proceedings opened against the company before the summer, the official said.

The EU executive has put some of its top officials on the case, another official said. "Everybody who has drafted the request has either a PhD in computer science or algorithmic harm," they said.

The companies have to submit the information by 15 November, after which the Commission can assess the next steps.

TikTok and Snapchat have received the request and will cooperate with the Commission, spokespeople told Euractiv separately.

Meanwhile, YouTube has "for years invested in products, policies, and systems" to protect its community from disinformation and support users' mental health, a company spokesperson said. They added that the company will continue these investments and work with the Commission to ensure compliance.

[Edited by Rajnish Singh]
