Platforms prepare for new anti-disinformation commitments in revamped code of practice

The Code of Practice on Disinformation has been revised with stronger commitments on how to tackle disinformation online. [Feng Yu/Shutterstock]

The new Code of Practice on Disinformation, seen by EURACTIV before its publication on Thursday (16 June), contains a set of commitments related to online advertising, tackling manipulative practices, transparency and access to data.

In May 2021, the European Commission presented its guidance on strengthening the Code of Practice on Disinformation, based on an assessment of the code's weak spots. The new code has been in the making ever since but was repeatedly delayed for various reasons.

First and foremost, online platforms wanted to see where the Digital Services Act (DSA), the EU's rulebook on content moderation, would land, as the soft law instrument is closely tied to the binding regulation.

Secondly, the list of signatories has been significantly extended, from social media platforms like Twitter, TikTok and Facebook to a much broader range of actors, such as advocacy group Avaaz and messaging app WhatsApp.

The new code includes a set of voluntary commitments that will become a code of conduct under the DSA for very large online platforms, namely those with at least 45 million users in the EU. For smaller actors, the approach is more flexible: they can choose which commitments to subscribe to, as appropriate to their services and capacity.

Ad placements

Signatories that offer advertising services commit to putting in place policies to demonetise disinformation and clamp down on advertising containing disinformation, for instance by tightening their control over monetised content and the ad review process.

Each commitment comes with a set of qualitative and quantitative reporting requirements. For instance, the signatories will have to disclose which policies they have enforced and provide the number of enforcement actions taken, broken down by country or language.

The code also mandates cooperation with relevant players in the online monetisation value chain, for instance fact-checkers, advertisers, trade associations, e-commerce websites and crowdfunding services.

Political advertising

In terms of political advertising, the signatories would commit to a definition in line with the one included in the EU’s proposal for regulating political ads. If within one year from the code’s entry into operation there is still no agreement on such a definition, the task force described below will establish a working definition.

The relevant signatories would commit to a consistent approach to political and issue-related ads, providing clear labelling on their nature, providing transparent information to users, putting verification systems in place and publishing their relevant policies.

In addition, the code provides for repositories of political and issue-related ads, which must be easily accessible to users and researchers and offer a set of minimum functionalities.

Integrity of service

Another commitment establishes a common understanding of impermissible manipulative behaviour, such as the creation of fake accounts, hacking, impersonation, malicious deep fakes, fake engagement, non-transparent paid messages by influencers, and coordinated manipulative actions.

Signatories operating AI systems or disseminating AI-generated content, such as deepfakes, would have to comply with the requirements of the Artificial Intelligence Act.

Information exchange is again seen as essential to identifying attempts at cross-platform influence operations.

Users’ empowerment

In terms of empowering users, signatories would enhance their efforts in the area of media literacy, adopt safe design practices in their systems, and open up their recommender systems.

They would also provide users with tools to assess the accuracy of sources via fact-checking organisations, labelling from authoritative sources and the possibility to flag misleading content. Messaging apps, in particular, will have to support users' critical assessment without affecting encryption.

Users whose content has been affected by a moderation decision would be able to appeal, and their complaints would have to be handled promptly and fairly.

Researchers’ empowerment

The relevant signatories would commit to providing researchers with automated access to non-personal and anonymised data. Such access would be provided by an independent body that would vet the researchers and their research proposals.

Fact-checkers’ empowerment

Signatories would put in place a framework for cooperating with the fact-checking community in a transparent, non-discriminatory and financially sustainable way. The fact-checkers’ work would have to be fully integrated into the platform services in all EU member states and languages.

Transparency centre

Information on the code's implementation would be made publicly available via a transparency centre.

Permanent task force

At the heart of the revamped code's governance is a permanent task force that would include all the signatories, alongside representatives of the European Regulators Group for Audiovisual Media Services (ERGA), the European Digital Media Observatory (EDMO) and the European External Action Service.

The European Commission will chair the task force, which will take decisions by consensus.

Monitoring

The signature of the code will be followed by a six-month implementation period. One month after that, the signatories will have to provide a baseline report detailing how they have implemented their commitments.

Signatories will have to come up with a list of structural indicators to measure the code's effectiveness within nine months of its signature. In special situations such as elections or crises, the Commission might request specific reports from the signatories.

As mandated under the DSA, very large online platforms commit to external audits on their application of the code.

[Edited by Nathalie Weatherald]
