Security experts have warned that the tools used by tech giants to detect child abuse online pose serious security and privacy risks, raising concerns around upcoming EU legislation.
The technology in question is so-called client-side scanning (CSS). While other law-enforcement techniques for detecting child abuse and other serious crimes in encrypted messaging services such as WhatsApp or Telegram rely on backdoors to access the encrypted data, CSS analyses personal data directly on the user's device.
The technology is already in use in the US, where Apple announced in August that it would use CSS to scan photo libraries stored on iPhones for images of child abuse.
While Apple later backpedalled in the face of backlash from cyber-activists and said it would postpone the plans, similar techniques are already used by other tech giants, like Microsoft, Google, or Facebook.
As CSS would be installed on all devices, the report, published by leading security experts on Thursday (14 October), warns that the technology is much more invasive than previous proposals to soften encryption and that it could be repurposed in the future as a general mass surveillance tool.
These content-recognition tools are driven by AI and are designed to detect images of child abuse by comparing them to known abuse imagery – so-called “perceptual hashing.”
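The principle behind perceptual hashing can be illustrated with a toy “average hash”: each image is reduced to a bit pattern, and two images are deemed similar when their patterns differ in only a few bits. This is a minimal sketch for illustration only; real systems (such as Microsoft’s PhotoDNA or Apple’s NeuralHash) use far more robust, proprietary algorithms, and the pixel grids below are invented examples.

```python
def average_hash(pixels):
    """Hash a grayscale pixel grid: each bit is 1 if that pixel is
    brighter than the image's mean brightness, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the images
    are perceptually similar (e.g. one is a re-encoded copy)."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 4x4 "images": img_b is a slightly brightened copy of img_a,
# img_c is an unrelated image with an inverted light/dark pattern.
img_a = [[10, 200, 30, 220], [15, 210, 25, 215],
         [12, 205, 28, 225], [11, 198, 33, 230]]
img_b = [[p + 5 for p in row] for row in img_a]
img_c = [[220, 10, 215, 25], [225, 15, 210, 30],
         [218, 12, 222, 20], [230, 14, 208, 26]]

h_a, h_b, h_c = average_hash(img_a), average_hash(img_b), average_hash(img_c)
print(hamming_distance(h_a, h_b))  # 0: the near-duplicate matches
print(hamming_distance(h_a, h_c))  # 16: the unrelated image does not
```

In a CSS deployment, the device would compare hashes of local photos against a database of hashes of known abuse material and flag close matches, rather than uploading the photos themselves for inspection.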
The ePrivacy derogation, adopted by the European Parliament in July, enables service providers to put in place voluntary measures to scan private conversations in encrypted messaging services.
Europe’s stance on encryption
The ePrivacy derogation is only interim legislation that will be replaced by a sector-specific regulation to tackle child abuse, which the European Commission is expected to propose in early December.
The new legislation may introduce provisions that oblige or incentivise companies to scan private encrypted conversations to tackle online child abuse, said Chloé Berthélémy, policy adviser at European Digital Rights.
The European Commission has already made the fight against online child abuse one of its top priorities. In its strategy for a more effective fight against child abuse, the Commission identified end-to-end encryption as a hurdle for tackling online child abuse, arguing that it can be exploited for criminal purposes.
While the Commission says these screening methods would not hamper privacy, the cybersecurity experts’ report suggests otherwise.
“CSS by its nature creates serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic,” the report reads.
As CSS would make personal devices searchable on an industrial scale, it would also affect law-abiding citizens. Furthermore, the report highlights that deploying the technology would create considerable pressure to expand its scope, further intensifying the degree of surveillance.
“The measures required to limit the obvious abuses so constrain the design space that you end up with something that could not be very effective as a policing tool,” said Ross Anderson, one of the authors of the report.
“If the European institutions were to mandate its use, (…) they would open up their citizens to quite a range of avoidable harms,” he added.
[Edited by Luca Bertuzzi/Zoran Radosavljevic]