LEAK: Commission pitching disinformation measures in Digital Services Act

Press conference by Executive Vice-President Margrethe VESTAGER and Commissioner Thierry BRETON, on the Digital Services Act and the Digital Markets Act in Brussels, Belgium on Dec. 15, 2020. [Shutterstock]

The European Commission has been pitching the Digital Services Act’s (DSA) measures to tackle online disinformation to national governments, according to a working paper leaked to EURACTIV.

The working paper illustrates how the European Commission is presenting to EU governments the ways in which the DSA will tackle disinformation. Following the publication of the proposal last December, several EU countries, notably Germany, Austria and Finland, were reported to have requested that the DSA be beefed up to address harmful content.

The Commission presents its approach as risk-based, focused on the systemic level rather than on specific pieces of content. Facebook is mentioned several times as a platform where fake and harmful content was disseminated, together with TikTok.

One of the main legislative initiatives on the Commission’s digital agenda, the DSA is intended to regulate digital services that mediate goods, services and content.

Other initiatives have been proceeding in parallel to address illegal content, notably a regulation on preventing the dissemination of terrorist content online that was adopted by the European Parliament on 28 April.

In its presentation, the Commission argues that the DSA addresses disinformation in four key ways. Firstly, through a co-regulatory approach that involves stakeholders in defining minimum criteria and how they will be designed, in a bid to allow the flexibility needed to address risks in a fast-moving digital environment.

Secondly, by tackling illegal content and systemic risks. The latter applies in particular to very large online platforms, defined as those reaching more than 10% of EU consumers (around 45 million people).

Article 35 of the DSA provides for Codes of Conduct that illustrate how platforms will tackle these risks. The presentation states that the Commission ‘may invite very large online platforms to take part in Codes of Conduct (Article 35) to address specific risks’. In addition, very large platforms will need to define how they intend to manage these systemic risks in their crisis protocols.

The third element mentioned is user empowerment, particularly in relation to advertisements, which have been identified as key amplifiers of disinformation. Users will need to be able to see in real time when content is sponsored, who the advertiser is and why they are being targeted. Under this provision, platforms would be required to keep a repository of ads and advertisers, explaining the targeting criteria.

Finally, the Commission contends that the DSA enforces a diligence-based approach, imposing binding risk management obligations on online platforms. Enforcement includes annual audits and scrutiny by the Digital Services Coordinator, an independent authority to be appointed by each EU member state. According to the DSA draft, the Digital Services Coordinators will form the European Board for Digital Services, which is intended to ensure consistent enforcement at EU level.

[Edited by Benjamin Fox]
