Pressure ramps up on Apple to back out of its ‘surveillance’ plan

Apple's plan to scan messages and images to detect child sexual abuse material has been strongly criticised. [Shutterstock]

An open letter signed by more than 90 civil society organisations on Thursday (19 August) urged Apple to abandon its recently announced plan to introduce scanning features to detect child sexual abuse material.

The international coalition of policy and rights organisations wants Apple to drop its plan to build ‘surveillance features’ that would scrutinise content and communications on its devices with the aim of preventing the dissemination of child sexual abuse material (CSAM).

The iPhone-maker announced on 5 August that it would introduce child safety features: one scanning children’s iMessage conversations for images of nudity, the other checking photos uploaded to iCloud against known images of child sexual abuse.
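
Apple’s technical summary describes the iCloud component as matching image fingerprints against a database of known CSAM compiled by child-safety organisations, using a perceptual hash (which Apple calls NeuralHash) so that resized or re-encoded copies of a known image still match. As a loose, hypothetical illustration of perceptual hashing in general (not Apple’s algorithm, which is not public), the sketch below computes a classic ‘average hash’ and compares it against a set of known fingerprints:

```python
# Hypothetical illustration of perceptual-hash matching: a classic
# 8x8 "average hash", NOT Apple's NeuralHash, which is not public.
from PIL import Image  # assumes the Pillow imaging library is installed


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size greyscale grid, then emit one bit per
    pixel: 1 if the pixel is brighter than the grid's mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits


def matches_known(path: str, known_hashes: set[int], max_distance: int = 5) -> bool:
    """Report a match if the Hamming distance to any known fingerprint
    is small, so minor edits such as re-compression still hit."""
    h = average_hash(path)
    return any(bin(h ^ k).count("1") <= max_distance for k in known_hashes)
```

The Hamming-distance tolerance is what lets a perceptual hash survive re-encoding, but it is also a source of false positives on visually similar yet unrelated images.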

“We are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter reads, arguing that “algorithms designed to detect sexually explicit material are notoriously unreliable.”

Potential abuses

Apple’s announcement prompted strong criticism from policy groups and experts, who warned that the new features might compromise users’ privacy and could even be exploited by authoritarian regimes to target individuals.

Following the backlash, the Cupertino-based company made some significant changes to its programme, notably adding the requirement that an image be flagged in multiple countries before being reported to the competent law enforcement authorities.
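
As a rough illustration of that safeguard (hypothetical names, data and threshold; Apple has not published its rules in this form), the sketch below flags an image only when its fingerprint appears in hash lists supplied by at least two separate jurisdictions:

```python
# Hypothetical sketch of the multi-jurisdiction safeguard described above.
# The list contents, names and threshold are illustrative assumptions.

HASH_LISTS: dict[str, set[int]] = {
    "jurisdiction_a": {0x9F3A, 0x1B42, 0x77D0},  # list from one child-safety body
    "jurisdiction_b": {0x9F3A, 0x5E21},          # list from another jurisdiction
}

REQUIRED_JURISDICTIONS = 2  # a hash must appear in at least this many lists


def should_flag(image_hash: int) -> bool:
    """Count how many jurisdictions' lists contain the fingerprint and
    flag only if the count meets the threshold."""
    matches = sum(image_hash in hashes for hashes in HASH_LISTS.values())
    return matches >= REQUIRED_JURISDICTIONS


print(should_flag(0x9F3A))  # True: present in both lists
print(should_flag(0x5E21))  # False: present in only one list
```

The intersection requirement is meant to stop a single government from slipping non-CSAM images into its national list, since an entry would only trigger a report if an independent jurisdiction lists it too.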

The coalition is not satisfied with these additional caveats, arguing that building image surveillance tools and a backdoor into the messaging service would open the door to abuse.

“Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” the letter states.

Apple stressed it would oppose any request to detect images unrelated to child protection, but it did not commit to withdrawing from markets where it would be legally obliged to comply.

New EU law allows screening of online messages to detect child abuse

The European Parliament adopted on Tuesday (6 July) the final version of the ePrivacy derogation, a temporary measure enabling providers of electronic communication services to scan and report private online messages containing material depicting child sex abuse. The provisions …


Last month, the European Parliament adopted the final version of the ePrivacy derogation, an interim measure that allows tech companies to voluntarily scan for and report child sexual abuse material online. The temporary measure was similarly criticised for compromising users’ privacy and enabling the indiscriminate monitoring of private communications.

“The right to communicate securely underpins the pillars of democracy, including press freedom, the presumption of innocence, privacy and freedom of expression and association,” said Iverna McGowan, director of the European office of the Centre for Democracy and Technology, the organisation that coordinated the open letter.

However, McGowan noted that the ePrivacy derogation preserved the integrity of end-to-end encryption, whereas Apple’s move to create a backdoor might bolster government demands for special access to encrypted communications.

“As we have seen over the past weeks with the Pegasus scandal, democracy and human rights are in need of more protection from government surveillance — not less,” McGowan added.

Pegasus shows no backdoor will be used only by good guys, Proton CEO says

Commenting on a recent cyber-surveillance scandal, a tech leader said encryption has made mass operations impossible and its integrity should be maintained at all costs, but he also pointed to Big Tech’s data collection practices as a major source of privacy vulnerabilities.


In an interview with the Wall Street Journal, Apple senior vice president Craig Federighi invited researchers to test the company’s child protection features, arguing that external review could hold Apple’s technology accountable.

However, Apple has been locked in a copyright lawsuit since 2019 with Corellium, a start-up that facilitates security research by replicating smartphone software without the physical device. While the two-year lawsuit appeared close to a conclusion last week, Apple decided to appeal after Corellium announced a research grant to scrutinise the iPhone’s software.

“iOS is designed in a way that’s actually very difficult for people to do an inspection of system services,” Matt Tait, chief operating officer at Corellium, told MIT Technology Review.

[Edited by Benjamin Fox]
