Thriving digital markets have made our lives easier and supported our economies, but it is high time we put an end to illegal content that harms both consumers and businesses.
Simon Kollerup is Denmark’s Minister for Industry, Business and Financial Affairs.
Christine Lambrecht is Germany’s Federal Minister of Justice and Consumer Protection.
Digital developments over the past decade have shown us that a major challenge lies ahead. We leapt into the digital age so quickly that consumer protection and the regulation of hate speech struggle to keep pace.
This is of particular concern on online marketplaces and social media, where illegal content flourishes: fake shops, illegal products and damaging, punishable statements appear again and again, and consumers and smaller companies across Europe remain wary because they cannot trust platforms to take their fair share of responsibility.
Fundamental change is needed. We must take the first important steps without delay. The Digital Services Act is currently being negotiated in the Council of the European Union and the European Parliament, and while there are still disagreements to overcome, we all know that there is an urgency to address these challenges effectively.
In Germany, young people aged 14 to 24 were asked what they personally consider to be the greatest risks when using the Internet. Sixty-one percent of respondents cited fraud in online shopping and other online transactions as one of their top concerns; nine percent stated they had been personally affected by it. Forty percent cited the risk of insults and hateful comments online, and twenty-seven percent had been personally affected. It is clear: fraud and hatred rank high among the dangers faced on Internet platforms.
A product safety test conducted by European consumer organizations revealed that, out of over 250 products purchased on online platforms such as Wish, Amazon, eBay and AliExpress, sixty-six percent failed to meet European safety standards. Findings included everything from faulty labelling to toxins.
In Denmark, a survey concluded that on the Facebook pages of news media outlets there are issues both with the deletion of too much legal content and with illegal content not being taken down.
These challenges are of course global in nature. Many of the corporations involved have budgets larger than those of many European countries, and their platforms span the entire globe. No country can exercise effective control by itself, which is why it is crucial that we agree on strong and meaningful regulation to set a new benchmark globally.
In order for us to succeed in doing this, we see a need for fundamental changes to the Digital Services Act as it currently stands:
We need platforms to take responsibility, and to prevent and counteract illegal content. Online platforms must take all reasonable technical and organizational steps to prevent illegal content relating to the sale of goods and services from becoming visible to consumers on their platforms. If we are to effectively reduce the amount of illegal content online, we also need a robust notice-and-action mechanism under which illegal content is taken down. The draft Digital Services Act provides a long list of requirements for notices submitted by users, but it does not specify what action the service provider must take in response to illegal content. In fact, the draft neither requires the service provider to remove or block illegal content, nor sets any time frame for doing so. As long as such provisions are missing from the Digital Services Act, it is indispensable that national legislators be authorized to ensure the efficient removal of illegal content and to set corresponding deadlines for such removal. This will make it possible to limit the spread of illegal hate speech and products.
On top of this, we must ensure that platforms and the traders operating on them provide a valid point of contact, thus establishing clear-cut responsibility for responding to inquiries from consumers or authorities. This should also apply to platforms from third countries, where safeguards are potentially incomplete. In addition, it should be possible to communicate with the point of contact in any language in which the provider offers its services.
Moreover, we want online platforms to assume responsibility for their sellers’ obligation to comply with consumer legislation. Online platforms should also be obliged to correct the self-classification of a commercial trader who claims to be a private individual when it is clear that the seller is indeed a commercial trader. The platforms’ due diligence obligations should be tied to the general liability exemption for e-commerce platforms: it seems contradictory for platforms to be able to invoke a civil liability privilege when they breach their duty of care.
It is high time we made a change. We want to support the thriving digital market, and we want technology companies to be able to deliver their services – which, after all, have made our lives so much easier. But we cannot allow this to become an excuse for letting illegal content flourish.