Visegrad Four want to distinguish between ‘illegal’ and ‘harmful’ content in Digital Services Act


According to the Visegrad Four countries, the Commission’s Digital Services Act is necessary, but Europe must avoid censorship and other violations of the right to freedom of expression. One of the most pressing topics is the removal of illegal and harmful content from social platforms.

The Czechs are quite cautious about the new legal duties the future Digital Services Act would impose.

“We need to define what the obligations of digital services are, and also make sure that there is a clear legal framework for how exactly platforms should become aware of illegal content – without removing it by monitoring users’ content. This is the so-called notice and action mechanism,” Czech MEP and Vice-president of the EP, Marcel Kolaja (Greens/EFA), told EURACTIV.cz.

According to him, the DSA should ensure the swift removal of illegal content. However, automated tools have proven to have many unintended side effects.

The Czech government’s position concerning the obligation to remove illegal content is quite clear.

“When it comes to liability of platforms, the current liability exemption for intermediary service providers should be maintained. However, we do believe that time has come to review provisions relating to procedural aspects of removal of illegal content from the internet,” stressed Patrik Tovaryš, head of the Information Society Services Unit at the Czech Ministry of Industry and Trade.

“There is an absolute need to distinguish between the notion of illegal content and legal but harmful content. This is imperative as there usually is a distinction between criminal conduct such as sharing terrorist content and sharing disinformation, which many users share with faith it is true,” Tovaryš pointed out.

David Nosák from the Center for Democracy and Technology shared this view and warned:

“The e-Commerce Directive prohibits member states from imposing obligations on digital services to monitor the content they transmit or store, which is an important principle that should be upheld”.

For the Slovak government, the issue of the responsibility of online platforms for content, including the question of user uploads, is an important part of the legislation in the making.

Slovakia advocates “maintaining the principle of limited liability, as well as the prohibition of the obligation of general monitoring”, said the Ministry of Investment, Regional Development, and Informatization.

Moreover, the government does not think that platforms should be responsible for illegal and harmful content placed there by users without the platforms’ knowledge.

Slovakia expects the new legislation to define and put into practice a “notice and action” mechanism so that the EU does not have to oblige platforms to monitor their networks in general.

According to Deputy Minister of Investments, Regional Development and Informatisation Marek Antál, companies “must act swiftly and effectively if they are informed of illegal and harmful content by the competent authorities of the member states”.

Representatives of companies also agree. According to the Slovak Alliance for Innovation Economy (SAPIE), a non-profit platform with more than 100 members, if platforms become content editors, we could see the widespread deletion of posts through algorithms that would negatively affect users, including businesses.

The Alliance points out that tuning deletion algorithms to be strictly preventive is the least costly option, yet it can cause the most damage. Given the cost of setting up services that check content both quickly and at scale, SAPIE warns against the negative impact such a decision would have on small businesses that cannot afford highly sensitive tools.
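To illustrate the trade-off SAPIE describes, here is a minimal, purely hypothetical sketch: an automated filter with an aggressively “preventive” (low) removal threshold is cheap to operate, but it deletes legitimate posts alongside genuinely illegal ones. The posts and risk scores below are invented for the demonstration and are not taken from any real system.

```python
# Hypothetical illustration of SAPIE's warning: a "strictly preventive"
# filter (low removal threshold) is cheap to run but deletes legitimate
# posts along with illegal ones. All scores below are invented.

posts = [
    ("terrorist propaganda", 0.95, True),   # (text, risk score, actually illegal)
    ("news report quoting propaganda", 0.70, False),
    ("heated political opinion", 0.55, False),
    ("harmless product review", 0.10, False),
]

def removal_stats(threshold):
    """Count illegal posts removed and legitimate posts wrongly removed."""
    removed = [illegal for _, score, illegal in posts if score >= threshold]
    return sum(removed), len(removed) - sum(removed)

for threshold in (0.9, 0.5):  # cautious setting vs. strictly preventive setting
    hits, collateral = removal_stats(threshold)
    print(f"threshold={threshold}: {hits} illegal post(s) removed, "
          f"{collateral} legitimate post(s) lost")
```

At the cautious threshold only the illegal post is removed; at the “preventive” threshold two legitimate posts are lost as well, which is the collateral damage the Alliance warns about.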

Work on the DSA has also been closely observed in Poland by non-governmental organizations dealing with privacy rights in a digitalized society and civil liberties on the Internet, such as the Panoptykon Foundation and the Digital Center Foundation (Fundacja Centrum Cyfrowe).

They recommend no longer treating Internet platforms as ‘hosting providers’ because their business model has changed. They also recognize that the self-regulation model has failed.

Regarding content moderation, the NGOs argue that the issue could be partially solved by transferring control over moderation rules to the community or to public trust entities.

They suggest keeping platforms responsible for infringing content, but only on the basis of actual knowledge of the infringement, established by a clear notification from a user or an authorised body. According to the NGOs, the system for notifying illegal content must be transparent and efficient, and users must have a clear appeal procedure available.
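A minimal sketch of how such a notice-and-action workflow could be modelled in software, assuming the requirements the NGOs list (actual knowledge via a clear notification, transparent decisions, an appeal route). All class and function names here are hypothetical, not drawn from the DSA text or any real platform:

```python
# Hypothetical sketch of a notice-and-action workflow: liability attaches
# only after a concrete notification creates actual knowledge, and every
# removal can be appealed. All names are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable

@dataclass
class Notice:
    post_id: str
    reporter: str          # a user or an authorised body
    legal_ground: str      # which law the content allegedly violates
    received_at: datetime = field(default_factory=datetime.utcnow)

@dataclass
class Decision:
    post_id: str
    removed: bool
    reason: str            # published, for transparency
    appealable: bool = True

def handle_notice(notice: Notice, is_illegal: Callable[[Notice], bool]) -> Decision:
    """Act only on a concrete notice -- no general monitoring of users."""
    if is_illegal(notice):
        return Decision(notice.post_id, removed=True,
                        reason=f"removed after notice citing {notice.legal_ground}")
    return Decision(notice.post_id, removed=False,
                    reason="notice reviewed; no infringement found")

def appeal(decision: Decision) -> Decision:
    """Users must have a clear appeal procedure against removals."""
    if decision.appealable and decision.removed:
        # Escalate to human review or an independent body (not modelled here).
        return Decision(decision.post_id, removed=False,
                        reason="reinstated on appeal")
    return decision
```

The key design choice in this sketch mirrors the e-Commerce Directive principle quoted above: the platform never scans user content proactively, it only reacts to concrete notices.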

“The future framework should consider the new and diverse actors that have emerged in the online space and update the definition of information society service providers and hosting service providers so that it is clear which kind of services are covered by the law,” said Magdalena Piech, chair of the European Tech Alliance (EUTA).

“Any new rules should be technology-neutral to be able to adapt to future developments,” said the chair of EUTA, which includes Allegro, the largest Polish online retail platform.

According to Hungarian MEP Anna Donáth (Renew Europe), the EU must create unified criteria and a joint regulatory framework, and one solution could be an independent EU institution overseeing compliance with rules on transparency, content moderation, and political advertisements.

She also said online platforms must play a role in the fight against disinformation but must do so transparently in terms of decision-making and algorithms, and there must be a body users can turn to for legal remedies.

Hungary’s Institute for Media Studies, the research body of the Media Council of the National Media and Infocommunications Authority (NMHH), believes that the relationship between freedom of speech and disinformation “poses a challenge to all legislators, both on the national and EU level.”

Non-factual statements, it says, are a part of discourses and “as such, cannot be excluded from the scope of the freedom of speech simply due to their lack of factuality,” so their presence in the public discourse can only be restricted based on strict criteria.

Overall, Hungarian actors are generally wary of removing too wide a range of content online, favoring an approach based on multiple methods (e.g., media literacy training, flagging), transparency on the side of online platforms, and the opportunity to seek legal remedies against content moderation decisions.

[Edited by Zoran Radosavljevic/Samuel Stolton]
