The European Commission’s lack of substantial response to concerns over the use of Clearview AI technology by EU law enforcement authorities has drawn the ire of MEPs on the European Parliament’s Civil Liberties committee.
US firm Clearview provides organisations – predominantly police agencies – with a searchable database that matches images of faces against more than three billion facial pictures scraped from social media sites.
It has previously come under fire for its mass-harvesting of facial images from social media.
On Thursday (3 September), Zsuzsanna Felkai Janssen of the European Commission’s DG Home was pressed by MEPs to provide more clarity on concerns over the use of the technology in Europe, after it emerged that certain police forces had been using it.
This included an admission from the Swedish Police Force that they had been using the controversial Clearview AI software, after which the Swedish data protection authority announced an investigation into the company’s practices.
This prompted a series of written questions to the Commission, one from a contingent of Renew MEPs and a separate question from GUE/NGL MEP Stelios Kouloglou. The EU executive responded to both on 17 July, but MEPs deemed those replies inadequate.
“The commission is not aware of the use of the Clearview AI application by law enforcement authorities in the EU and whether the application processes the data of EU citizens,” DG Home’s Felkai Janssen told MEPs on Thursday.
She added that any activity involving law enforcement authorities using facial recognition technologies must be subject to the EU’s General Data Protection Regulation and the Police Directive.
MEPs reacted angrily to this response, saying it failed to address the concerns originally raised.
“We know that police forces in the European Union are using it [Clearview AI]. It’s been confirmed that the Swedish police has used Clearview, and others are using it too,” Renew’s Sophie in ’t Veld said.
“Are we going to accept this attitude by the European Commission? I want there to be a solution to these kinds of issues. This technology is being used on EU citizens and the European Commission is doing bugger all about it.”
The Dutch MEP then pressed Civil Liberties Chair Juan Fernando López Aguilar to write to the Commission and insist that “we get a full political answer” on the matter.
For his part, López Aguilar described the Commission’s response to MEPs’ concerns on Thursday as “absolutely unacceptable in political terms”.
EDPB doubts on Clearview AI legality
After the Parliament’s well-publicised concerns over Clearview AI, the umbrella organisation for data protection authorities in the EU, the European Data Protection Board (EDPB), wrote to MEPs warning that the use of Clearview AI technology by police agencies in the EU may violate EU data protection law.
In a June communication, the EDPB said that it is “of the opinion that the use of a service such as Clearview AI by law enforcement authorities in the European Union would, as it stands, likely not be consistent with the EU data protection regime.”
Earlier this year, it transpired that Clearview AI had scraped more than three billion facial images from social media sites including YouTube, Facebook and Twitter, without obtaining the permission of users.
The company had not, at the time, disclosed whether any of the images had been harvested from EU citizens. If so, the software may violate the GDPR, which defines facial images used for identification as biometric data under Article 4(14) and restricts their processing as a special category of personal data.
Despite the lack of clarity here, an investigation by BuzzFeed News found that the company wants to expand its service to the European market, with nine European countries, including Italy, Greece and the Netherlands, touted as potential markets.
As a result of emerging concerns in Europe related to the scandal, the Commission had informed EURACTIV that it was “following the dossier” and would “remain in close contact with national data protection authorities and the European Data Protection Board” over the issue.
[Edited by Sam Morgan]