One-third of primary-school-age children and half of teenagers fear they are addicted to digital services. Additionally, 10% of 12–13-year-olds believe they are addicted to porn, and 15% of children have stolen money to fund loot box purchases in online games.
Leanda Barrington-Leach is Head of EU Affairs at the 5Rights Foundation.
Parents are struggling, teachers are struggling, children are suffering, and mental health professionals see the impact. Together, they are asking for change to a digital world that was not designed with children’s interests in mind.
The principle is not in dispute: children must be protected, and their rights apply online and offline. The European Parliament and Council agree that tech companies should ensure a high level of safety, privacy, and security for children.
Yet as they negotiate the Digital Services Act (DSA), the devil is in the detail. Last-minute compromises risk directly undermining this aim by creating a proliferation of loopholes that will be exploited by tech companies who prioritise profit over protection.
To deliver for children, the DSA must do four things:
First, it must recognise that a child has specific rights set out in international law – until the age of 18, not the younger ages tech companies specify for opening accounts. A child of 13 or 16 is not an adult, and in many ways older children are at greater risk, since younger children access fewer products and services, have greater adult supervision, and spend less time online. All children deserve protection.
Second, children’s rights must apply wherever they are, not just on social media or platforms aimed specifically at them. As a basic rule, any service accessible to children must be within scope, including online games, smart speakers, ed-tech and augmented or virtual reality worlds.
Third, the DSA must tackle the disease, not just treat the symptoms. The digital world is a system designed to grab and hold users’ attention – including children’s. Tinkering around the edges with prescriptive requirements such as parental controls or screen-time limits does little to reduce the real risk of children encountering harmful content or harmful strangers online. A tech-neutral and future-proof approach is essential, requiring services to ensure safety, privacy and security by design and default for children.
Fourth, it must ensure tech companies can no longer claim ignorance of children on their sites by requiring effective, proportionate and privacy-preserving age assurance systems.
The power tech companies wield is immense. Parents, teachers and children feel powerless. Regulators are not. It’s time to change the equation and lay the foundations for the digital world that children and young people deserve.