The European Commission is to put forward a generalised scanning obligation for messaging services, according to a draft proposal obtained by EURACTIV.
The text marks a victory for child advocates but a setback for privacy activists. The European executive is to unveil on Wednesday (11 May) its proposal to fight the online circulation of child sexual abuse material – CSAM for short.
“Providers of hosting services and providers of interpersonal communication services that have received a detection order shall execute it by installing and operating technologies to detect” CSAM upon request by the competent judicial authority or independent administrative authority, the draft regulation states.
The text says that the technologies used to this end must be “effective”, “sufficiently reliable”, “state of the art in the industry” and “the least intrusive” so that they won’t “be able to extract any other information from the relevant communications than the information strictly necessary to detect.”
The obligation also requires tech platforms to conduct risk assessments and take “reasonable mitigation measures” that are targeted and proportionate. They will need to report to both the national coordinating authority and the newly established, built-for-purpose EU agency in The Hague – stationed “at the same location as its closest partner, Europol”, the proposal stresses.
These are the reports on which the judicial authorities will base a detection order. The risk assessment obligations also fall on software providers.
This new EU Centre on Child Sexual Abuse (EUCSA) will act as a facilitator for national authorities and platforms. Its purpose will be to provide detection technology options to the companies and to operate databases of CSAM indicators that providers will have to use when carrying out their detection obligations.
The European Commission places the protection of children online above all else, to the displeasure of privacy defenders who feared an indiscriminate and disproportionate intrusion into our personal communications.
“The proposal takes into account the fact that in all actions relating to children, whether taken by public authorities or private institutions, the child’s best interests must be a primary consideration”, reads the text’s preamble.
In other words, the Commission considers that “whilst of great importance, none of [the fundamental rights to respect for privacy, protection of personal data and to freedom of expression and information are] absolute and they must be considered in relation to their function in society”.
This mass monitoring of messages was made possible by the ePrivacy Directive derogation that was adopted last July. It allows platforms to carry out these scans, as long as they are used solely to tackle CSAM.
The derogation has been met with criticism, particularly for lacking safeguards and a legal basis, and because it was only meant to act as an interim measure until new legislation took over or the negotiations on the ePrivacy Regulation were concluded.
But for many privacy advocates, the tools that technology can provide cannot be the only solution to a broader, social problem.
“There are plenty of problems in combating child abuse material, such as an overburdened police force and poor international cooperation. This proposal does not solve those problems, it does not help children and it does harm innocent citizens”, said Rejo Zenger, policy advisor at the foundation Bits of Freedom, a member of the European Digital Rights network.
The proposal was also forcefully rebuked by liberal MEP Moritz Körner, who called it nothing short of a ‘Stasi 2.0’.
“Instead of fighting these heinous crimes by disproportionately giving up the basic rights of all EU citizens, it would be better to invest significantly more in the equipment of the police, the European police authority Europol and in the cross-border cooperation of the relevant authorities”, Körner said.
A fundamental question raised by the Commission’s proposal is the future of encrypted communications. The text refuses to “incentivise or disincentivise” the use of any technology, including end-to-end encryption, as long as it meets the requirements of the regulation.
End-to-end encryption is “an important tool to guarantee the security and confidentiality of the communications of users, including those of children”, the proposal says, simply stressing that “providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than” detecting CSAM.
[Edited by Luca Bertuzzi/Nathalie Weatherald]