The EU should focus on regulating illegal content in its upcoming Digital Services Act, which aims to set out an ambitious new regulatory framework for online services, but platforms should also be allowed to take 'voluntary measures' to remove harmful content, MEP Dita Charanzová says.
Dita Charanzová is a Renew Europe MEP who represented her group in negotiations on the European Parliament's initiative report on the Digital Services Act in the Internal Market Committee.
EURACTIV's Samuel Stolton spoke to her about the plans.
You negotiated for Renew on the Internal Market Committee's initiative report on the Digital Services Act. For you, what were the most important compromises to reach?
It's hard to pick the most important compromise because the report was so long and we touched on many important things. But for me, the most important thing was that all groups agreed that the fundamentals of the E-Commerce Directive should be maintained: the country of origin principle, the limited liability regime, and the ban on general monitoring.
In addition, I was very happy to see the support given to a Good Samaritan clause being included in the future DSA. While I believe that there should be no general monitoring requirement, if we want platforms to take more voluntary measures against unwelcome content then we must give them the legal means to do so.
Reports have recently surfaced that the Commission may be considering a sanctions regime for platforms that host illegal content online. What is your opinion on this prospect?
I think that as long as we respect the current system and rules of the E-Commerce Directive, there is limited need for such a sanctions regime. That said, if a website is blatantly hosting illegal content and is taking no measures to remove it, then it is potentially liable and might need to be subject to sanctions. This, however, must be a measure of last resort.
Before then, we must follow the notice and action mechanism. But nothing should weaken the current limited liability protections under the Directive.
In light of the various terrorist attacks in Europe, do you believe that this will have an impact on not only how ‘illegal’ content is regulated, but also ‘harmful’ content, or is it important to maintain a clear distinction between the two?
Clearly, these attacks will put pressure on Brussels to regulate more on this. Nevertheless, we must maintain a distinction between illegal and harmful content. The DSA must only regulate illegal content. Illegal content is clearly set down in law, and there is a judicial process for deciding if something is illegal or not.
When it comes to harmful content, though, this is very much in the eye of the beholder. Something that is not harmful in the Netherlands may, for instance, be seen as harmful in Poland. That said, websites should exercise their social responsibility and take voluntary measures to remove what they consider harmful content from their sites. But this must not endanger their protections under the E-Commerce Directive. There is a difference between social responsibility and liability.
With regard to the leaked blacklist of prohibited practices that recently emerged, what's your take on it?
My take is that any measures that are potentially included in the Digital Markets Act must primarily be applied on a case-by-case basis. The practices of a video streaming platform are nothing like those of a marketplace, or those that allow user-generated content, or those that are closed ecosystems. To apply blanket regulations to everyone may lead to unnecessary and burdensome regulation of platforms without any benefit for consumers or fairness in the marketplace.
The DSA will also be presented alongside the Digital Markets Act, which aims to rebalance fair competition in the platform economy. Under what conditions do you envisage that its market investigation tool will be put to use? In the platform economy, what conditions have to be met in order to determine if a market is close to ‘tipping’?
I believe that the EU should act primarily when there is a market failure, and the investigation tools should be used to determine whether such a failure exists. I do not think it is the role of the European Commission or anyone else to go looking for problems that are not apparent. If businesses or consumers do not feel that there is a market failure, then we should generally let the market continue to function. Any actions against market tipping must be rare and exceptional and be based upon solid evidence.
To what extent do you believe that the EU should reform some of the core provisions of the E-Commerce Directive, if at all? What are the potential risks or benefits of doing so?
I do not believe that the core provisions should be reformed; they are as valid today as they were 20 years ago. The potential risk of reforming them is that we will break a system that works. Today in Europe we have an open and free internet. Much of the reason for innovation and growth online is that we set some basic rules and got out of the way. If we regulate too much, we risk creating new barriers instead of fixing the limited ones that exist today.
A lot of talk on the Digital Services Act and the Digital Markets Act has centred on the operation of the tech giants, but in what ways could Europe’s SMEs be impacted by the new rules?
A lot of European SMEs rely upon the large platforms to enable their businesses. European SMEs have been able to grow far more because of the platforms than they would have without them.
The key to any regulation is to ensure that the platforms continue to be useful for our SMEs, but also that any rules that apply to the platforms do not trickle down to SME users. While the large platforms have the money and time to implement regulations, SMEs do not have the lawyers or staff to produce transparency reports and comply with many other rules. We therefore have to be careful that any regulation of the large platforms is targeted and proportionate.
[Edited by Sam Morgan]