New EU rules that will allow online messaging services to scan content for suspected abusive material should ensure that ‘error rates are as low as possible’ in identifying legal adult sexual content transmitted in online communications, an EU source close to the talks told EURACTIV.
Negotiators representing national ministers are holding technical talks with MEPs on new rules to allow online platforms to monitor online communications for examples of child sexual abuse material, in the so-called ‘e-Privacy derogation’ proposed by the Commission in September 2020.
In December, the EU’s telecoms code was widened to afford protections under the bloc’s ePrivacy directive, meaning that online messaging services would be unable to scan content for potentially abusive material.
The Commission’s September proposal was introduced as a means of enforcing an interim derogation from these safeguards, which would again allow online messaging services to monitor online communications.
With talks on the Commission’s plans ongoing this week, negotiators hope to start political discussions soon.
Despite the two sides’ positions being quite far apart, EU member states would like to wrap up an agreement by the end of February, EURACTIV has been informed.
Adult ‘sexting’ dilemma
One of the sticking points in the technical talks so far, an EU source said, was the fact that the European Parliament is insisting on ‘human overview and intervention’ of content that has been flagged as potentially incriminating for containing child abuse material.
However, EU nations would prefer to see an ‘algorithm only’ approach, whereby purely technical means are used to flag potentially incriminating content, which is then transmitted directly to law enforcement agencies for further review.
“The use of algorithms in this process is so important in this case because you don’t have to rely on mandatory human review of content that may be consensual and legal, like adult sexting for example,” the source said. “Such a scenario could result in human moderators regularly looking through our intimate messages.”
“In this respect, mandatory human review makes no sense.”
By contrast, the source said, an algorithm-only approach would be able to draw on a range of data streams known as ‘key indicators.’
Such indicators would cover who has transmitted the content itself and whether the user has been flagged before, as well as whether the content has a high circulation rate as part of a specific messaging group, allegedly an indicator of potentially incriminating material.
And while the algorithm’s ‘key indicators’ can never guarantee a 100% success rate in catching child sexual abuse material, the EU source said that software providers have an interest in ensuring that error rates are as low as possible.
“We have to lower the error rates as much as possible and I think the providers are the most interested in this,” they said.
Anti-grooming tech & data storage deadlines
Elsewhere, the Parliament and the Council hold differing views on so-called ‘anti-grooming’ technologies, used to detect potential examples of grooming being conducted in online communications.
Parliament requests that such technologies should be subject to prior authorisation by the competent data protection authority, while the Council believes this could potentially cause obstacles further down the line.
Moreover, the subject of data storage in cases where child sexual abuse material has been identified also appears to be a sticking point between negotiators, with Parliament demanding that law enforcement agencies be able to store the material for a maximum of three months.
“We would prefer a situation whereby the data can be kept only as long as is necessary for the intended purposes, meaning until the end of any necessary judicial investigation,” the EU source said.
Calls for fast progress
For its part, the Commission has been keen to foster progress on the file.
In a recent speech at the EU Internet Forum, Home Affairs Commissioner Ylva Johansson urged Council and Parliament negotiators to come to an agreement, and revealed that the Commission is working on “permanent legislation” that would make it “obligatory for Internet companies to report and remove child sexual abuse.”
The temporary regulation that the Commission has thus far put forward is limited in time to December 2025.
Meanwhile, earlier this week, a letter from a cross-section of MEPs emerged, calling on negotiators to make progress.
“We are following with deep concerns the development of the current negotiations, given that the lack of an agreement is putting children in severe jeopardy,” the letter said.
“Every day counts, because every day without the adoption of the temporary derogation, means that countless number of children are left unprotected, because tech companies can no longer detect their abuse online and police can no longer save them, with also impunity of perpetrators.”
[Edited by Benjamin Fox]