MEPs are seeking to “delay and time-out” the adoption of the regulation against online terrorist content, the EU’s security chief Julian King said on Wednesday (21 March). His comments came as the Commission criticised online platforms for dragging their feet in removing graphic video footage following the Christchurch attacks.
New Zealand Prime Minister Jacinda Ardern has been just as critical, saying that “we cannot simply sit back and accept that these platforms just exist. They are the publisher, not just the postman.”
The EU’s proposed regulation against online terrorist content aims to quell the dissemination of extremist material online within a set timeframe.
The Commission’s original proposal, delivered towards the end of last year, sets a one-hour deadline for removing offending content once it has been flagged by the competent authorities.
If in breach of the regulation, service providers could face fines of up to 4% of their global turnover.
Member states swiftly adopted their position on the plans in December, with Austria’s Interior Minister Herbert Kickl praising the straightforward manner in which EU ministers were able to reach agreement.
“With this agreement, we want to send a strong signal to internet companies about the urgency of addressing this issue,” he said at the time.
However, the progress of the file has not gone so smoothly in the European Parliament, a cause of concern for EU security chief Julian King.
Divisions within the committee have delayed adoption of the draft report until April. A vote was originally due to take place this week, but progress has been stymied by disagreements over the timeframe of the removal orders and the definition of terrorist content itself.
“I have some difficulty understanding the motivation of colleagues in the European Parliament who are seeking to delay and time-out our outstanding legislative proposal on a Regulation on terrorist content online,” King said on Wednesday.
Citing the recent New Zealand attacks, he said the online terrorist content regulation “would make it obligatory to take down such hateful content – whatever the source – as quickly as possible, and to ensure it is not re-uploaded.”
Moreover, Migration Commissioner Dimitris Avramopoulos, who is widely touted to take up the security portfolio if and when King leaves his post in the Commission due to Brexit, said on Wednesday that countering the radicalisation of citizens should be “an existential priority.”
The original live video of last week’s Christchurch attacks had been viewed 4,000 times before it was removed, Facebook revealed earlier this week. Although the video was taken down, “the damage had already been done,” Avramopoulos said.
Within 24 hours, according to its own estimations, Facebook had blocked 1.2 million copies of the footage at the point of upload and deleted another 300,000. “But this footage should have never gone online in the first place,” Avramopoulos told reporters.
“This is why we need legislation with teeth and sanctions in place to force these companies to comply and remove terrorist content,” he added, expressing his hope that Parliament “will decide to help us to be on the right side of history.”
However, while King and Avramopoulos believe the industry’s larger players need to step up their game in swiftly removing graphic material, the MEP responsible for steering the file through the Parliament, Daniel Dalton, said attention should be focused more on the smaller outfits that host terrorist content.
“It’s not necessarily the big platforms that have the problem,” he told EURACTIV in a recent interview. “Many of them are doing their own voluntary and proactive measures as it is now.”
“But there are quite a few smaller platforms who are either inundated with offending content, or they are basically not responding to the authorities’ requests for the removal of content.”
[Edited by Zoran Radosavljevic]