EU lawmakers back one-hour deadline to remove online terrorist content

A frame grab from video released by the Federal Bureau of Investigation on 07 October 2014 shows an alleged Islamic State militant (L) holding a hand gun as he claims to be inside the Syrian 17th Division Military Base just outside Raqqa, Syria. [EPA/FBI]

Lawmakers in the European Parliament’s Justice Committee have backed measures to clamp down on the spread of online terrorist content, including an obligation for platforms to remove offending content within one hour of reporting or face fines of up to 4% of their global turnover.

The decision came following criticism from British Conservative MEP Daniel Dalton, the lead rapporteur on the file, who hit out at the European Commission’s recent comments that Parliamentarians had been dragging their feet in adopting the measures.

EU Security Chief Julian King had said recently that he has “some difficulty understanding the motivation of colleagues in the European Parliament who are seeking to delay and time-out” the plans for regulating online terrorist content.

“No one here is in the pocket of big tech,” Dalton said on Monday, in response to accusations that Parliament had delayed its vote due to opposition from tech giants.

According to Dalton, the Commission thinks that anyone wishing to scrutinise their plans is “wasting time” and that the file had been the subject of a “blatant smear campaign” in the press.


The decision came despite pressure from a coalition of trade associations who wrote to the Parliament’s Justice Committee ahead of the vote urging them to reject the one-hour timeframe.

“This extremely short deadline, coupled with onerous sanctions, would entail over-removal of lawful content which will negatively impact the freedom of expression and related fundamental rights of European users,” the letter read, adding that small and medium-sized enterprises would “not be able to comply” with the orders outside of working hours.

The Greens had attempted to scrap the one-hour time-limit but ultimately failed in the vote on Monday.

Nevertheless, the text supported by MEPs states that the competent authority for issuing removal orders should contact companies that have never breached the rules to provide them with information on procedures and deadlines, at least 12 hours before issuing the first order to remove content they are hosting.


The regulation would see hosting providers forced to remove terrorist content, or disable access to it, within one hour of it being reported.

If in breach of the regulation, service providers could face fines of up to 4% of their global turnover. However, under the plans passed by Parliament, they will not be obliged to monitor the information they transmit or store, nor to actively seek out evidence of illegal activity.

Content disseminated for educational, journalistic or research purposes should be protected, MEPs agreed.

Elsewhere, the UK government has put forward new plans to impose punitive measures on internet sites should they fail to curb the dissemination of ‘online harms’, including terrorist content.

A new independent regulator will be established to ensure companies meet their commitments. The body may be afforded powers to issue substantial fines, block access to sites and to impose liability on individual staff members.

In a statement, UK Prime Minister Theresa May said that “online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”

Digital Secretary Jeremy Wright added that the “era of self-regulation for online companies is over.”

Back on the continent, following the Justice Committee’s decision, the Commission’s regulation against online terrorist content will be voted on by the full Parliament next week. After the EU elections, the newly formed Parliament will then be responsible for negotiating the plans with the Council of Ministers.

[Edited by Frédéric Simon]

