EU lawmakers reached an agreement on the EU rulebook for digital content and services, formalised by its adoption in the lead parliamentary committee.
The compromise text on the Digital Services Act (DSA) received broad support in the Internal Market Committee (IMCO) on Tuesday (14 December), following an agreement brokered by lead negotiator Christel Schaldemose.
“We have been too slow to acknowledge that we also need to protect our online environment from pollution in the form of the spread of illegal products, harmful algorithms and the harvesting of personal data,” Schaldemose said.
The DSA is horizontal legislation introducing obligations on content moderation and illegal products, as well as transparency and accountability requirements for the algorithms used by online platforms.
The DSA introduces due diligence obligations based on an asymmetrical approach, with tougher measures for very large online platforms, defined as those with more than 45 million users, and exemptions for small and micro companies via a waiver system.
“Not everything in this law is needed for our start-ups and would just harm innovation and be a burden. They will be exempt from some parts of the law,” liberal MEP Dita Charanzová told EURACTIV.
SMEs that do not pose “significant systemic risks” will be able to request a waiver from the European Commission, a measure that was the subject of intense debate between the centre-left, keener on consumer protection, and the centre-right, traditionally more business-friendly.
Moreover, MEPs adopted an amendment that would require the European Commission to “pay specific attention to SMEs and the position of new competitors” when periodically revising the regulation.
The new rules will also set out obligations for removing illegal online content. Following a principle established in the e-Commerce Directive, platforms will not be obliged to monitor all the content they host. Still, they will have to take swift action when requested by a competent authority.
Individuals and organisations will also be able to flag suspicious content or appeal a removal decision, which platforms must adequately justify. National authorities might also require a platform to reinstate content that was erroneously taken down.
Platforms will have more discretion over content that is legal but potentially harmful, but they will have to follow their stated terms and conditions.
In a last-minute addition, MEPs voted in favour of obligations on pornographic platforms to verify users uploading content, to put human review in place to detect abusive material, and to allow individuals depicted without their consent to request the content’s immediate removal.
A more controversial point in the content moderation debate was a proposal for exempting the media sector, based on the argument that online platforms should not control editorial content for which publishers are legally liable.
The exemption did not find support in the IMCO committee for fear the media carve-out would open a dangerous loophole for spreading disinformation. However, the associated Committee on Legal Affairs (JURI) can table amendments during the plenary vote in January and is likely to raise the issue again.
“It is frustrating that public service media has to be under the content moderation of a private entity like Facebook,” Schaldemose told reporters, noting there were also divisions within the different political groups.
Algorithms and transparency
Very large online platforms will have to conduct specific risk assessments due to their ‘systemic’ role in society. These measures might have far-reaching implications for the inner workings of platforms, as algorithms have been accused of spreading disinformation or even harming the mental health of teenage girls.
Based on the parliament’s text, the largest platforms would also have to provide an alternative version of their recommender systems not based on profiling and enable the user to see on what criteria the platform is adapting content for them.
More substantial transparency obligations were also introduced, and platforms will have to open their data for external review. However, they will still maintain their ‘trade secrets’.
This exemption was partially removed in the compromise text, as lawmakers feared that online players could use it to withhold key information from the public. Still, green MEP Alexandra Geese told EURACTIV the group will likely ask for a complete removal during the plenary vote, together with other amendments against microtargeting and for including the environment in the risk assessment of big online platforms.
The compromise text also introduces measures against dark patterns, techniques designed to manipulate users into giving consent to the processing of personal data. Platforms would be allowed to reach out to users multiple times but not to prevent them from taking an autonomous decision.
Moreover, a cross-party coalition of MEPs, including Schaldemose, pushed for a total ban on targeted advertising but faced strong opposition. The landing point was a ban on targeting minors only, following a similar compromise reached in the DSA’s sister proposal, the Digital Markets Act.
However, the proposal for a total ban is likely to re-emerge via the civil liberties committee (LIBE), an associated committee that might table amendments during the plenary vote. Asked by EURACTIV what her voting indication would be in that case, Schaldemose said that “it depends on the wording.”
[Edited by Alice Taylor]