Parliament adopts ambitious stance on EU’s future regulation of digital platforms

The European Parliament has overwhelmingly backed a series of reports which could have a profound impact on the future of the platform economy, supporting measures such as a possible ban on targeted advertising, reporting procedures for illegal content, and better detection of fraudulent vendors.

The three texts backed on Tuesday evening seek to establish Parliament’s position on the future regulation of digital giants ahead of a package of measures that will be unveiled on December 2 by the European Commission.

Although the reports are not legally binding, they deliver the EU executive a clear message on Parliament’s stance on the Commission’s upcoming Digital Services Act (DSA).

The DSA represents the EU’s ambitious plan to regulate online services and will cover areas of the platform economy ranging from liability for content, market dominance, and advertising, to safety, smart contracts, online self-employment, and future governance frameworks.

Targeted advertising ban?

One of the more contentious issues that had divided MEPs earlier on Tuesday was the inclusion of a potential ban on targeted advertising, put forward by Tiemo Wölken, the German socialist lawmaker who drew up the report for the Parliament’s legal affairs committee. The full Parliament eventually backed the proposal.

The report states that targeted advertising must come under tougher regulation than less intrusive forms of advertising that do not rely on the collection of such granular user data.

MEPs also pressed the Commission to examine options for a phase-out of targeted advertising practices, which could eventually lead to a complete ban.

Platforms should also provide the opportunity for users to use digital services anonymously, Wölken’s text states, and in terms of content moderation, avoid “ex-ante control measures based on automated tools or upload-filtering of content.”

Responding to the adoption of his report on Tuesday evening, Wölken said that the Parliament had sent a “strong signal” to the EU executive.

“We want to see a Digital Services Act which protects fundamental rights of users and avoids Upload filters which would block content without human oversight. We also called for a strong European entity to ensure the transparency of social media platforms,” he told EURACTIV.

“And finally, we called on the Commission to consider a phase-out of targeted advertisements and even a total ban within the EU. That’s a very important decision and I am expecting that the Commission looks at this point really carefully.”

Counterfeit and illegal sellers to be targeted in Digital Services Act, Vestager says

A crackdown on the sale of counterfeit and illegal goods across online platforms is likely to feature in the European Commission’s upcoming Digital Services Act plans, the EU’s Vice-President for Digital Affairs, Margrethe Vestager, has said.

Counterfeit goods and the harmful/illegal content distinction

Meanwhile, in the Parliament’s internal market committee, MEPs called on platforms to do more to detect and take down false claims and rogue traders, such as those selling dangerous or counterfeit goods online.

On this point in particular, the Parliament seems to be on common ground with the European Commission. Earlier this year, the EU executive’s vice-president for digital, Margrethe Vestager, revealed that a crackdown on the sale of counterfeit and illegal goods across online platforms was likely to feature in the Digital Services Act.

In addition, MEPs stressed the importance of distinguishing clearly between harmful and illegal content, as part of a report by Maltese socialist lawmaker Alex Agius Saliba adopted by the Parliament’s internal market committee.

Harmful content, hate speech, and disinformation should be subject to enhanced transparency obligations, the text states, alongside efforts to improve digital literacy among citizens.

The report has parallels with industry positions: trade association EDiMA, which counts Google, Amazon, Facebook, and Apple among its members, has likewise said that there should be clear legal distinctions between illegal and harmful content online.

Saliba’s text also calls for a legally enforceable notice-and-action mechanism, which would allow users of online services to notify platforms of illegal content in a timely manner, and references the need for the Commission to introduce an internal market instrument imposing ex-ante obligations on gatekeeper platforms.

“These proposals will give more protection to our consumers, increase transparency, and help in creating a more contestable online ecosystem,” Saliba told EURACTIV on Tuesday evening.

“It was not an easy task to reach a political agreement on more than 1000 amendments but finally today we can say that a cross-party agreement has been achieved on these proposals. Now we only hope that the Commission in December will be as ambitious as we were in this report.”

Digital Services Act should avoid rules on 'harmful' content, Big Tech tells EU

New EU measures regulating the web should avoid, in the first instance, rules on the hosting of online content deemed “harmful” but not “illegal”, a Brussels trade association representing the world’s largest online platforms has said.

Preserving freedom of expression online

Finally, the Parliament’s civil liberties committee was responsible for a report on fundamental rights in the Digital Services Act, drawn up by Kris Peeters, a Belgian Christian democrat.

As part of the non-legislative resolution backed on Tuesday, MEPs said that any future legally-binding content removal procedures should be “diligent, proportionate and non-discriminatory,” in order to preserve freedom of expression and information.

Platforms should also come under new transparency rules on monetisation, while microtargeting “based on characteristics exposing physical or psychological vulnerabilities” is an area that MEPs believe should come under greater oversight in the future.

Responding to the report’s commitment to protecting fundamental rights, lobby groups praised Peeters’s text on Tuesday evening.

“We welcome the message sent by the European Parliament today that fundamental rights are high on its agenda as it finds its position on the Digital Services Act,” said Iverna McGowan, Director of the Center for Democracy & Technology’s (CDT) Europe Office, an NGO which receives funding from some of the world’s largest digital platforms and associated foundations. “The Parliament is lending its support to crucial free expression.”

The reports adopted on Tuesday are likely to feed into the European Commission’s final touches on the Digital Services Act proposals, set to be presented at the beginning of December.

'Content removal' unlikely to be part of EU regulation on digital services, Jourova says  

The European Commission has given its clearest indication yet that obligations on digital platforms to remove content are unlikely to feature in far-reaching EU efforts to regulate the web, due to be presented before the end of the year.

(Edited by Frédéric Simon)

 
