The review process for the Code of Practice on Disinformation has gained eight new potential signatories including businesses and civil society groups, but the Commission worries over the slow pace of the process.
The new prospective signatories expressed their willingness to undertake the additional requirements that a strengthened version of the Code would include, at a signatories' assembly meeting on Thursday (30 September).
Also released this week were a series of reports detailing the steps taken by some of the Code’s biggest signatories to tackle COVID-19-related disinformation during July and August.
“I am pleased to see new actors from relevant sectors answering our call and engaging in the overhaul of the Code of Practice”, said Commission Vice-President for Values and Transparency, Věra Jourová.
“I encourage others, including platforms, messaging services and actors in the online advertising ecosystem, to join as soon as possible to co-shape the process.”
Concerns remain within the Commission, however, over both the likelihood of reaching an agreement on the strengthened Code by the end of the year and the extent to which signatories will comply with the guidelines provided.
The new potential signatories join platforms such as Facebook, Google, Twitter, and TikTok, amongst others, that have already signed up to the Code, a self-regulatory instrument designed to commit companies to certain standards when it comes to tackling disinformation.
Video platform Vimeo and social network Clubhouse are among the new additions, but this round also includes a number of organisations with disinformation-specific expertise, such as advocacy organisation Avaaz and WhoTargetsMe, a political ad transparency browser extension.
Luca Nicotra, campaign director at Avaaz, told EURACTIV that the group’s aim was to make sure the Code was a success: “Europe cannot afford a repeat of the previous Code, where platforms marked their own homework, giving themselves high grades for poor performance.”
While describing the Commission’s Guidelines as “strong”, he warned: “Our involvement is not a blank check, quite the opposite. We will only be involved – and sign up to the final text – if the Code delivers what’s needed to fight the problem of disinformation plaguing our societies.”
Launched in 2018, the Code underwent an assessment by the Commission as part of the European Democracy Action Plan announced in December 2020.
A number of significant flaws were identified in the original version, and guidance on its improvement published in May this year called for a strengthened slate of measures including wider participation, better tools for users to recognise false content and increased fact-checking monitoring and transparency.
However, EURACTIV has learned that the Commission is concerned that the attention of platforms is currently focused on negotiations on the Digital Services Act (DSA), to the detriment of the Code. Hence the EU executive's call for a "timely" revision, notably by the end of the year.
Even if it is completed by then, officials worry that its signatories might invoke the short timeframe in which it was agreed as grounds to resist complying with the measures included in the Commission's guidance.
“The strengthened Code cannot fall short of the expectations we have set out in our Guidance from May,” Jourová added.
The EU executive is also looking to further expand the current pool of signatories, specifically to include messaging services such as WhatsApp. Expanding participation might, however, make decision-making more complicated, especially as the nature of the organisations involved becomes more heterogeneous.
The Code is an instrument of soft law, hence it is voluntary and non-binding. However, some of its provisions, especially those anticipated in the Commission’s guidance, may become mandatory once the DSA is adopted.
Related developments in the DSA’s progress, however, could throw the Code’s efforts off course. A spokesperson from EU DisinfoLab told EURACTIV that the Code would not be effective as long as a potential media sector exemption from the DSA was being proposed and attracting the support of MEPs.
Members of the media and publishing industry are pushing for a sector-wide carve-out from the legislation due to concerns that it could lead to big tech companies curbing their freedom of expression in the name of compliance. A number of anti-disinformation organisations in Europe, however, have expressed alarm at the impact this could have on their work.
The Commission is also set to propose legislation on transparency in political advertising by the end of the year.
In line with a pandemic reporting initiative launched by the Commission in 2020, existing signatories of the Code also released data points related to the steps they took in July and August to combat COVID-19 disinformation.
A number of initiatives taken by various platforms focused on the provision and promotion of verified information about vaccinations. Twitter, TikTok, Microsoft, and Facebook all reported specific efforts in France, which has struggled with vaccine hesitancy and misinformation during the pandemic.
On Wednesday (29 September), YouTube announced that it would ban any content that contained disinformation concerning any approved vaccine, an expansion of its previous prohibition on those making false claims about COVID-19 jabs. The switch came, the platform said, after it realised that generalised mistrust of vaccinations contributed to COVID-19 vaccine-specific hesitancy.
In July, Facebook alone reported having removed more than 110,000 pieces of content in the EU for violations of its and Instagram's COVID-19 and vaccination misinformation policies, a figure that climbed by another 40,000 in August.
The results of the reporting project, the Commission said, are being fed back into the Code’s review process, in part to inform the strengthening of its monitoring framework.
[Edited by Luca Bertuzzi/Zoran Radosavljevic]