EU telecoms ministers have urged digital companies to ensure their anti-disinformation capacity is adequate in Central and Eastern Europe, a primary target of Russian hybrid warfare.
EU governments gathered in France for an informal meeting on Tuesday (8 March) to discuss how to counter online disinformation from the Kremlin, following Russia's invasion of Ukraine almost two weeks ago. Representatives from Google, YouTube, Meta and Twitter were invited to the discussion.
“The battles initiated by Russia in the current conflict are raging not only on the ground but also on the Internet”, reads the unanimously adopted joint statement, which urges platforms not to become vendors of disinformation.
Since the beginning of the conflict in Ukraine, online platforms have tried to contain Russian war propaganda by taking down the accounts of the state-controlled media outlets Russia Today and Sputnik, which were eventually hit by an EU ban on 2 March.
At a press conference, French digital minister Cédric O said ministers had a frank discussion with the large content platforms, stressing that even though the platforms had already made some efforts, it was now a question of “putting pressure on the platforms to do even more”.
EU ministers explicitly made two requests. First, platforms should respond more quickly when governments report “fake news” or an account of dubious origin. Second, platforms were asked to expand their moderation teams to cover all languages.
“The major platforms must strengthen their fact-checking capabilities. They need to make sure to have enough native speakers and local contact points on the ground. This must be the case in all European countries, especially the ones in Central and Eastern Europe, which have been prime targets of Russian propaganda in the past,” Slovenia’s digital minister Mark Boris Andrijanič told EURACTIV.
In turn, online platforms called for a clearer regulatory framework specifying how they should operate in such circumstances, a request that is due to be addressed once the Digital Services Act (DSA), the EU’s rulebook on content moderation, is finally adopted.
“If the DSA had been in force a few months ago, the level of preparedness of the platforms would have been higher,” France’s O said.
The DSA is in the final stage of negotiations, and France, which currently holds the rotating EU Council presidency, aims to finalise the legislation by the end of its term. The upcoming regulation follows a risk-based approach and introduces clear rules on dealing with harmful content and disinformation.
“Most platforms were not prepared for what happened, which is not surprising given the scale of the crisis,” Slovenia’s Andrijanič added.
Until now, content moderation rules have been a matter of self-regulation, with all major content platforms adhering to the EU Code of Practice on Disinformation – due to be revised shortly to fill critical gaps identified by the European Commission.
Initially, the informal ministerial meeting was intended to tackle bringing more women into the technology sector, the environmental transition, and challenges related to the metaverse. The French Presidency adjusted the agenda at the last minute to address the critical situation in Ukraine.
[Edited by Nathalie Weatherald]