Make online platforms accountable for their algorithms, leading MEP says

MEP Christel Schaldemose, rapporteur for the Digital Services Act (DSA), presented her draft report to the European Parliament's Internal Market and Consumer Protection Committee (IMCO). [Shutterstock]

EU lawmakers are set to battle over whether online platforms should be required to open their algorithms to scrutiny, making them accountable for fundamental rights violations, after the European Parliament published its initial revisions to the planned Digital Services Act. The new blueprint also includes stronger opt-in and enforcement measures.

Christel Schaldemose, the Danish MEP who will pilot the file through the Parliament, has put algorithm accountability among a raft of reforms she wants to make to the bill.

With these rules, the European Commission would scrutinise the platforms’ automated functioning to make sure fundamental rights are respected. In the recently published Guidance on Strengthening the Code of Practice on Disinformation, the EU executive also requested that online platforms open up their algorithms and illustrate how they are fighting disinformation.

The draft report on the Digital Services Act (DSA) was submitted to the European Parliament’s Internal Market and Consumer Protection Committee (IMCO) on 28 May.

It will be debated on 21 June and MEPs will be able to submit their amendments until 1 July.

The European Commission presented the DSA together with the Digital Markets Act (DMA) in December 2020, as the two major legislative proposals to regulate the digital space. The DSA regulates the obligations of digital services that mediate goods, services, and content and is meant to replace the e-Commerce Directive, in place since 2000.

The DSA’s approach is to introduce general obligations for all online intermediary services, as well as cumulative obligations proportionate to their function and size. Platforms considered systemic will also be subject to a new oversight structure.

The rules are intended to enhance the protection of consumer and fundamental rights online, including new procedures to enable faster removal of illegal content.

Commission sets the bar for anti-disinformation measures

The freshly published Guidance on Strengthening the Code of Practice on Disinformation illustrates the European Commission’s expectations on the anti-disinformation measures for online platforms. While the Code is non-binding, the measures are likely to become mandatory following the adoption of the Digital Services Act (DSA).

Advertising is another aspect the DSA seeks to discipline, notably by introducing transparency requirements that would show users who the advertiser is and why they are being targeted. Schaldemose is proposing to switch targeted advertising off by default, so that users would have to actively opt in. Influencers would also need to make explicit when content is promoted.

The Danish MEP also proposes switching off by default the recommender systems that online platforms use to present users with the content they deem most relevant to them. She also proposes to “allow consumers to modify their profiles used by the platform to recommend content.”

The definition of recommender systems has also been extended to include suggested content and the way platforms prioritise and rank content on behalf of users.

In terms of consumer protection, Schaldemose suggests extending liability for illegal or dangerous products to online marketplaces in cases where the seller does not have a legal representative in the EU.

The EU lawmaker also included in the draft report a redress mechanism against platforms’ decisions.

For instance, if content is removed (or not), the concerned users should be able to know how and when the platform will process their concerns. In Schaldemose’s view, a single point of contact should be made available for users as well as national authorities.

On enforcement, meanwhile, Schaldemose suggests “clear time frames for online platforms to assess notifications of illegal content issued by users and to remove illegal content that has a very high impact and may pose a greater threat to society or significant damage to an individual.”

The report also includes stronger provisions for the Digital Services Coordinator, which would be able to restrict access to online platforms that have consistently infringed DSA obligations in terms of harmful or illegal content.

Arba Kokalari, the centre-right EPP spokesperson on the file, believes the new text imposes excessive obligations on small businesses.

“New bureaucracy would be a deadly blow against Europe’s most promising start-ups and innovative companies – our future unicorns. It is essential that we do not drown our SMEs in complex obligations if we want European companies to thrive and to compete globally,” she told EURACTIV.

German Green politician Alexandra Geese meanwhile agrees with the intention to protect users, but sees the proposed mandatory opt-in solution as insufficient.

“Google and other large platforms easily obtain consent, because they offer so many services that they can make themselves indispensable. Users do not have a free choice,” she said.

Similarly, while welcoming the provisions on the recommendation systems, the Green MEP warned that consent might be obtained through “manipulative practices”.

In addition, Geese criticised the lack of clarity in what she called the “Trump paragraph”, a provision added by Schaldemose under which online platforms would need ex-ante approval from the relevant judicial authority to block persons of public interest.

Leftist deputy Martin Schirdewan has called for a total ban on personalised advertising.

“The transparency proposals in the DSA draft report are half-hearted and protect neither our personal data nor our democracy from harmful micro targeting,” he added.

[Edited by Zoran Radosavljevic/Benjamin Fox]