New signatories to join EU anti-disinformation code amid calls for improvement

“I welcome all new signatories on board”, Commission Vice-President for Values and Transparency Věra Jourová said. [EPA-EFE / JOHANNA GERON / POOL]

While a report published this week found that “significant changes” to monitoring and transparency are needed in the European Commission’s Code of Practice on Disinformation, 16 new prospective signatories have joined the process of drafting it.

The new participants include civil society groups, software companies, and marketing agencies, building on the Code’s September expansion to encompass actors other than the major tech companies who joined in 2018.

Their addition coincides with the publication of a report by the European Regulators Group for Audiovisual Media Services (ERGA) outlining recommendations for improving the code, a self-regulatory tool through which organisations and companies commit to better tackling disinformation on their platforms.

“I welcome all new signatories on board”, Commission Vice-President for Values and Transparency Věra Jourová said.

“We need a revised Code against disinformation, with a strong monitoring framework based on performance indicators. It is now for the current and prospective signatories to deliver. The only code we will accept is a strong and ambitious code implementing our guidance fully, addressing its current shortcomings,” she added.

New participants

Signatories to the original code were limited to major platforms such as Google, Facebook, Twitter and TikTok, but participation has expanded since the Commission assessed the tool as part of the European Democracy Action Plan launched last December. The EU executive has called for its revision by the end of the year.

The guidance published by the Commission following the review said the original code required substantial improvements, including increased monitoring and transparency measures and broader participation.

Accordingly, the potential signatories who joined the drafting process in September included actors other than the largest tech companies. Among them were platforms such as Vimeo and Clubhouse, an ad transparency company, and advocacy group Avaaz.

The latest round of possible subscribers to the code includes Twitch, Adobe, The Bright App, journalist safety organisation Reporters Without Borders, research body The Netherlands Organisation for Applied Scientific Research, and communications company Havas.

The participants will join the drafting of the strengthened Code of Practice, which will eventually function as a co-regulatory tool with the Digital Services Act (DSA). Though currently non-binding, some of the provisions it includes may subsequently become mandatory under the DSA.

However, as EURACTIV reported in October, there are concerns in the Commission that lengthy negotiations over the DSA could draw platforms’ focus away from the code. Even if an update is agreed upon, signatories could use negotiation delays to argue against full compliance.

ERGA’s report identifies several critical issues within the code: insufficient data, a lack of uniform reporting, the absence of a publicly available database of authoritative content sources, and the lack of explicit content definitions and of a common repository for information related to mis- and disinformation policies.

It also notes that the tool contains too few commitments for reporting on the automated and AI systems in place for tackling disinformation and that researchers and national regulators both need to be guaranteed access to data to facilitate monitoring.

Speaking at an event on the report on Monday (15 November), Krisztina Stump, head of media convergence and social media at DG CNECT, the Commission’s digital department, said that improved data access would be crucial in facilitating monitoring and accountability.

She added that improvements in the quality of such data would also be needed, as what is currently being provided is “repetitive and unclear” and should instead be “fit for purpose and digestible”.

Based on its findings, ERGA has offered 10 recommendations for improving the code. Among them are measures to provide more detailed and contextualised data, develop standardised approaches to structuring reporting and verifying implementation across EU countries, and clarify what kinds of content signatories need to take action against.

ERGA said it would be beneficial for the code’s monitoring, accountability, and transparency credentials if a single repository of policy information were established. In addition, a database of trustworthy sources within the Transparency Centre, proposed under the Commission’s guidance, should be considered.

An update should also include a concrete requirement for providing data on the deployment of automated systems, should mandate research access to data, and should strengthen the signatories’ existing commitment to installing an independent auditor to oversee their enforcement of the code, it concludes.

“It is important to stress that in light of the Digital Services Act (DSA) proposal, ERGA sees the strengthened Code as an opportunity to test some of the proposals in the DSA related to access to data, audits, external oversight, or risk-mitigating measures”, the report adds.

[Edited by Luca Bertuzzi/ Alice Taylor]
