Online platforms should remove posts promoting terrorism within one hour after receiving complaints, according to a draft European Commission document that leaked on Tuesday (13 February).
The Commission wants “online marketplaces and social media companies” to take down posts that contain illegal content more quickly. The EU executive has focused its attention on terrorist material, which firms should remove within one hour.
A one-hour timeline for platforms to remove users’ posts is the shortest deadline yet that the Commission has given online firms for taking down illegal material.
The draft, which was published by the NGO European Digital Rights, is dated February 2018, but the exact publication date is marked “XX”, meaning the Commission has not yet given the draft final approval. A Commission spokesperson said there will be an announcement on tackling illegal content in the “coming weeks”.
The leaked document is titled a “recommendation for measures to effectively tackle illegal content online” and outlines steps for platforms to remove content more quickly. It is not a binding legislative proposal.
Platforms should make “swift decisions as regards possible actions with respect to illegal content online without being required to do so on the basis of a court order or administrative decision”, the draft said.
The new recommendation carves out a toughened EU approach to policing online platforms. Separately from the demands on removing terrorist material, six platforms – Facebook, Twitter, YouTube, Google+, Microsoft and Instagram – are part of a voluntary Commission-led agreement to remove posts containing hate speech within 24 hours of being notified.
Julian King, the British EU Commissioner in charge of the European security union, has pushed the new approach through internal negotiations within the Commission, EURACTIV.com has learned.
Comments on the draft from King’s cabinet, seen by EURACTIV, include one objecting to softer language that would delete two of the adjectives describing “effective, appropriate and proportionate measures” to deal with illegal content, because doing so “would be clearly toning down the message” of a separate strategy paper on illegal content that the Commission published in September.
The September communication calls for “the effective removal of illegal content, increased transparency and the protection of fundamental rights online”.
The UK government has been ramping up pressure on social media companies.
Several hours before the recommendation on illegal content leaked on Tuesday, British Home Affairs Minister Amber Rudd announced new government-funded software that can allegedly detect automatically up to 94% of videos posted online by the Islamic State. The tool is voluntary, but Rudd told the BBC she will consider “legislative action if we need to do it”.
The Commission’s recommendation suggests that online platforms remove terrorist content more quickly through “the greater use of technology,” including “automated means for the detection of illegal content”.
YouTube and Facebook already use artificial intelligence to detect illegal material.
The document also suggests platforms “set out in their terms of service that they will not host terrorist content”.
Aides to a handful of other EU Commissioners suggested adding more detail to the draft document, according to language seen by EURACTIV. One comment from the cabinet of EU Home Affairs Commissioner Dimitris Avramopoulos asked “What does this mean?” about a sentence specifying the role of “online market places and social media companies” for removing illegal content.
Another comment from the cabinet of EU Vice-President Andrus Ansip, who oversees digital single market policies, suggested specifying what information platforms should report to national governments about the posts they remove, and what member states should report to the Commission.
There were clearly divisions between different Commission offices over some details in the recommendation. In the leaked draft, one line specifying that platforms should remove illegal content within one hour includes a tracked change with a crossed out suggestion for that time limit to be “two hours”.
The Commission does not generally comment on leaked documents.
A spokesperson referred to a different document that the EU executive published in January on its progress towards creating a security union.
“The Commission is urging online platforms to step up and speed up their efforts to prevent, detect and remove online terrorist content as quickly as possible, and is looking into more specific steps to improve the response to terrorist content online, before deciding whether legislation is needed,” the document from January said.
The leaked recommendation sparked a backlash from civil liberties NGOs and technology sector lobby groups.
European Digital Rights and nine other associations asked Commission President Jean-Claude Juncker and six Commissioners, including King, Ansip and Avramopoulos, on Tuesday “to refrain from proposing any new initiative in this area without first engaging into structured, targeted and multi-stakeholder dialogue”.
The groups’ letter said the plans would be “detrimental to maintaining a sturdy e-Commerce framework for the EU, and risks engendering a regulatory framework that is not properly tuned to sectoral dynamics”.
The e-Commerce Directive is an 18-year-old EU law that sets out when online companies can be held liable for illegal material.