The European Commission’s proposal to fight child sexual abuse material (CSAM) online is still pending, while the EU has become a ‘destination of choice’ when it comes to hosting such content, according to a new report.
The Internet Watch Foundation (IWF)’s annual report, published on Tuesday (April 26), shows that Europe is the “global hub” for hosting such material online.
In 2021, the EU hosted the majority of abusive imagery, with 62% of all CSAM worldwide traced to a member state.
“Depressingly, Europe remains the destination of choice for criminals and predators looking to host some of the internet’s very worst content”, said Susie Hargreaves, Chief Executive of the IWF. “We are talking about some of the most severe content – images and videos showing children and babies suffering rape and sexual torture”, she added.
The IWF is a British NGO focused on child protection.
Of the 156,300 URLs containing CSAM hosted on servers in EU member states, 66% were traced to the Netherlands. The country also accounts for 41% of global hosting.
IWF analysts attribute this to the country’s excellent internet infrastructure and the low cost of its hosting solutions. Moreover, the Dutch legal framework can make content removal more difficult or slower, the IWF told EURACTIV.
“The largest obstacles we are currently facing concern the ongoing shift to private domains such as personal clouds and chat applications like Telegram,” said Expertisebureau Online Kindermisbruik (EOKM), a Dutch hotline for CSAM.
France and Latvia host the next largest shares after the Netherlands, each accounting for about 9% of EU-hosted content and 6% globally.
Experts also highlight that while the EU continued to host the most CSAM in 2021, the share of abuse imagery hosted in the United States rose sharply compared to previous years, from 5% of the global total in 2020 to 21% in 2021.
The Commission’s proposal yet to come
The European Commission has been working on legislation to tackle this issue for some months and, after several postponements, is expected to present its proposal on 11 May.
The ensuing debate between privacy advocates and child protection groups promises to be contentious, as AI-powered monitoring and encryption backdoors could be put forward as ways to detect and remove CSAM at scale.
“New legislation must also address issues associated with end-to-end encryption. If the risk to children cannot be mitigated, companies should pursue other ways of improving privacy before seeking to encrypt”, Hargreaves told EURACTIV.
In April 2021, Google said it backed the need for a temporary derogation from the ePrivacy directive allowing tech companies to scan more proactively for CSAM in electronic communications.
The European Digital Rights (EDRi) association, however, called for the Commission’s proposal to make sure the measures will be “in line with the EU’s fundamental rights obligations” and “lawful and objectively necessary and proportionate to their stated goal”.
The interception of private communications should be limited to “genuine suspects against whom there is reasonable suspicion”, with the necessary safeguards, EDRi argued. The CSAM proposal should also “respect encryption as a vital security measure”.
“We are at a crossroads. It’s a golden opportunity to make a real difference to the lives of children, protecting them from the perils of predators online”, Hargreaves added of the upcoming legislation, stressing that the Commission “must focus its proposals on risk and harm caused and not just simply target the biggest players”.
[Edited by Luca Bertuzzi and Benjamin Fox]