‘Content removal’ unlikely to be part of EU regulation on digital services, Jourova says  

The European Commission has given its clearest indication yet that obligations on digital platforms to remove content are unlikely to feature in far-reaching EU efforts to regulate the web, to be presented before the end of the year.

European Commissioner for Values and Transparency Věra Jourová. [EPA-EFE/FRANCISCO SECO]

Věra Jourová, the European Commission Vice-President for Values and Transparency, held a video call with Twitter CEO Jack Dorsey on Tuesday (22 September), in which she gave an insight into the EU executive’s plans to introduce future obligations under the Digital Services Act and the Democracy Action Plan.

Jourová revealed that during the meeting she highlighted that the Commission does not necessarily intend to introduce future rules forcing platforms to remove harmful online content or disinformation, focusing instead on how such content spreads online.

She particularly noted the Commission’s dedication to preserving free speech online.

“In order to address disinformation and harmful content we should focus on how this content is distributed and shown to people rather than push for removal,” she said, adding that EU efforts in this field would be detailed further in the Commission’s upcoming presentation of the Digital Services Act and the Democracy Action Plan.

The Digital Services Act (DSA) represents the EU’s most ambitious plan to regulate online services, and will cover areas of the platform economy ranging from liability and market dominance to online advertising, safety, smart contracts, online self-employment, and future governance frameworks.

Meanwhile, the Democracy Action Plan will home in on disinformation in the context of external interference and manipulation in elections.

For his part, Dorsey is understood to have highlighted Twitter’s commitment to bolstering how potentially harmful or spurious content is “discovered and labeled”. He also touted the benefits of media literacy in stifling the spread of such material.

Jourová informed Dorsey that the Commission would examine in detail Twitter’s plans to label accounts or individual posts, in a bid to improve transparency in the online political debate.

Moreover, while Jourová stressed the importance of ensuring the transparency of algorithmic processes as part of future rules, she noted that the overall debate on platform regulation should be centered more on the “plurality of debate, openness, and ability of people to have more control and understanding of what they see and why they see it.”

Big names speak out

Meanwhile, in Brussels, public consultations have now closed on both the Digital Services Act and the Democracy Action Plan, both of which attracted heavy lobbying from an array of stakeholders.

In terms of liability, tech giants such as Google believe that the ‘core principles’ of the 2000 eCommerce directive, the predecessor to the Digital Services Act, should be maintained, particularly the country-of-origin principle, though a liability regime for illegal content should be introduced. Facebook, for its part, called for a new framework for dealing with content that is not illegal but harmful.

In addition to weighty feedback from industry, EU member states themselves have been keen to make their positions known. EURACTIV recently obtained the response of Poland’s Ministry of Digital Affairs to the Digital Services Act plans, in which the ministry called for a “forward-looking, technologically neutral and flexible solution, enough to respond to the rapid pace of changes observed in the digital sector.”

In this vein, Poland believes that the DSA should maintain the liability exemptions for online platforms included in the 2000 eCommerce directive and the ban on a general monitoring obligation, but that the EU should review provisions relating to the removal of illegal content.

In terms of ex-ante regulation, Warsaw supports rules targeting large online platforms that act as gatekeepers, but says that a precise definition of those companies will have to be adopted.

On the subject of online fake news, Poland recognises the need to limit the spread of such content, and also highlights the need to distinguish between illegal content and content that is legal but harmful, which includes disinformation.

European Parliament developments

Meanwhile, in the European Parliament, committees are in the process of adopting their respective reports on the Digital Services Act.

A text by EPP MEP Kris Peeters was backed by members of the Civil Liberties Committee on Tuesday (22 September). It calls for maintaining the fundamentals of the eCommerce directive, including the limited liability provisions and the ban on general monitoring obligations.

The report also stresses the importance of consumer rights and forms of redress for actions carried out online by automated technologies, as well as greater transparency for online political advertising.

The Internal Market Committee’s report, led by Maltese socialist MEP Alex Agius Saliba, is set to be put to a vote on Monday (28 September).

(Edited by Frédéric Simon)
