As the European Commission is due to present a new proposal next month on how to fight child sexual abuse material (CSAM), Beeban Kidron strikes a more cautious tone, warning the terms of the ongoing policy debate are not the right ones.
Kidron is a former film director, member of the UK House of Lords and chair of the 5Rights Foundation, an NGO focused on child protection in the digital sphere.
- Regulation should focus on security by design.
- The CSAM regulation should interact with the DSA, AI Act and GDPR to address the complexity of the issue.
- Children's safety online is a global issue at a tipping point.
The CSAM proposal is expected to address the screening of online messages to detect child abuse, a measure temporarily put in place by the ePrivacy derogation. Do you think these measures should be continued in the new proposal?
There is no yes or no answer. We’re in this argument about one piece of technology, but we’ve taken our eyes off the problem and we need a more nuanced approach. End-to-end encryption is an important part of all our safety, but it is not the issue here. The issue is what drives investments in child safety.
In the UK, this summer we saw incredible investments in child safety thanks to the Age Appropriate Design Code. The regulation came in, money and product teams were put on that issue. In Europe, there are no proof points, no KPIs, no comparison of data. The point is what this regulation is going to demand to ensure children are not at risk.
Which are the platforms where you find more child sexual abuse material?
The vast majority is on the dark web, in hidden corners. That is a fight for law enforcement. However, most problems are due to careless online platforms. 75% of kids are on social media, where a strange adult could contact them directly. There are features that make children feel they must sexualise themselves; as the Facebook Papers showed, one in three teenage girls on Instagram had body comparison issues.
Everybody's talking about what happens down the line, takedown, but there are some basic things we are not insisting upon. We need to go upstream and consider any feature that might facilitate grooming or drive distribution. If YouTube, Facebook, OnlyFans and the like can't work it out, they shouldn't be running their services, because they are dangerous. If they can work it out, then they need to show us how, so we have trust.
How is the online environment not safe for children and what can be done to make it safer?
There are important issues in the attention economy: bombardment, separation, exaggeration, addiction. The problems we have as a society affect children too, but they don’t have other life experiences. That’s why the AI Act is as important as the CSAM Act. The DSA, with its risk assessment, might have a profound impact if it is properly enforced. Same for the GDPR. If we create a demand for safe services, rather than obsessing about the technology, we get to a different place.
Some platforms say that if their services were not safe, people would not use them.
How can they say that when they've been found to have fuelled ethnic unrest in Myanmar and Ethiopia? That's a cynical and ugly thing to say, but I am not surprised. We go in cars knowing we might crash, but we train the driver, we check the car, we have rules of the road. If platforms have no interest in making their services fit for users, then they have to open them up to scrutiny so people know exactly what choice they're making.
Another discussion that spurred controversy is requiring users to be identified, as that would mean the end of anonymity online. How can children be identified?
Platforms are weaponised to know who you are. The issue is that they don't want to know users' age, because then they would have legal responsibility. We did some research showing that, for the purposes of advertising, platforms know how old users are. For the purposes of protection, they don't.
How is the UK’s legislation developing compared to the EU?
Europe has created the weather with the GDPR. The European Commission has recognised that the tech question is as vast as an industrial revolution, and it has taken a moral stance, demanding that technology reflect our values. In the UK, there is a particular recognition of child safety thanks to the Age Appropriate Design Code and the upcoming Online Safety Bill.
However, the issue is much bigger than both Europe and Britain; it's about building a global consensus around what good looks like. We see similar discussions in Australia, Canada, California, the African Union and Brazil. My big hope is that we have stopped arguing about the problem and are all trying to negotiate solutions. We're at a tipping point, and just as the GDPR set a trajectory, we're still in the middle of working out what that means.