Ukraine: The DSA cannot let filters blind us to war crimes 

DISCLAIMER: All opinions in this column reflect the views of the author(s), not of EURACTIV Media network.

"The DSA must protect our rights by including meaningful safeguards to uphold the fundamental rights of Internet users. It must also maintain the prohibition of general monitoring obligations, let that be automated or non-automated." [Shutterstock / ImageFlow]

The Digital Services Act (DSA) must protect our rights by including meaningful safeguards to uphold the fundamental rights of internet users, especially for those engaged in the indispensable work of documenting war crimes. 

Eva Simon is a senior advocacy officer at the Civil Liberties Union for Europe (Liberties), a European network of independent non-governmental organisations which aims to safeguard the human rights of everyone in the European Union. 

Caroline De Cock coordinates the Coalition for Creativity (C4C), which brings together libraries, scientific and research institutions, digital rights groups, technology businesses, and educational and cultural heritage institutions that share a common view on copyright. 

Russia’s invasion of Ukraine has once again brought war to Europe. As the fighting intensifies and drags on, so does the humanitarian crisis, deepened by the shocking and still unfolding evidence of war crimes.

We learn about these war crimes largely from videos, many of them uploaded to social media: citizens, journalists, human rights investigators, and advocates document what they see in the war zone.

The digital devices most of us own, and the access anyone has to big platforms such as YouTube, Twitter, or Meta, mean that almost anyone can share information and collect evidence of war crimes. These videos matter to those who seek the truth, to those on a mission for peace, and to those who seek to ensure justice.

António Guterres, the United Nations Secretary-General, said he was “deeply shocked” by the images of civilian casualties in Ukraine circulating on social media in recent weeks. His reaction underscores the role online platforms play in collecting and preserving such evidence, creating a basis for calls for independent investigations and effective accountability.

Parallels can be drawn with the Syrian Archive. The unrest in Syria began in 2011 as part of the wider Arab Spring protests and soon turned into an armed conflict. From the start, people uploaded videos of the violence to YouTube. As the conflict grew into a full-scale war, YouTube was used ever more extensively, and an organic war archive took shape.

However, in 2018 Google started deleting videos from the Syrian Archive. Videos of war crimes were removed because automated filters based on machine learning flagged them as inappropriate content. As a result, evidence was lost forever. 

A war right at the EU’s border, and the terrible videos from Bucha, Mariupol, and other areas of Ukraine where Russian troops tortured and killed civilians, remind us of what is at stake. We must learn from the Syrian Archive’s failure and ensure that such videos are preserved, now and in the future.

What infrastructure we use and create for such purposes, what we require from platforms big and small, how they fulfil a public service role in archiving and sustaining such videos, and how they are required to deal with inappropriate content are all crucial considerations.

This is where the Digital Services Act (DSA) comes in. It is supposed to regulate Big Tech companies to create a safe digital space where the fundamental rights of users are protected and to establish a level playing field. It also aims to create an environment where lawful content, including user videos documenting evidence of war crimes, can be safely stored without being compromised or deleted. 

In the case of the Syrian Archive, the videos were removed by inaccurate automated tools that lacked any understanding of linguistic and cultural nuance and could not differentiate between journalistic documentation and war propaganda. That contextual blindness persists in today’s automated content moderation tools, which still make completely legitimate content wrongfully inaccessible.

We must learn from these mistakes and ensure that journalists, activists, and anyone else can share their opinions or (video) evidence without the threat of it disappearing simply because online platforms are coerced into deploying poorly performing automation tools.

The DSA requires Big Tech companies to carry out risk assessments to identify and mitigate negative effects on the exercise of fundamental rights, such as privacy, freedom of expression, non-discrimination, and the rights of minors. This should not be interpreted as mandating the use of automation.

Similarly, under no circumstances should co-regulatory measures such as the Code of Practice on Disinformation put pressure on online platforms to remove content so swiftly that it necessitates the intensified deployment of automated tools.

The DSA must protect our rights by including meaningful safeguards to uphold the fundamental rights of internet users. It must also maintain the prohibition of general monitoring obligations, whether automated or non-automated. Nor should it impose, either directly or implicitly, the use of mandatory upload filters or other automated content moderation.

The DSA must preserve users’ privacy online, and the continued prohibition of general monitoring by online platforms is an essential element of that. In parallel, the ability to use the internet anonymously and through encrypted services offers a crucial safeguard against monitoring. Such safeguards are in the text proposed by the European Parliament but now need to be embraced by the Council too.

The DSA is going through the final stages of the trilogue negotiations. As the European Parliament rightfully recognises in its mandate, it is crucial to prohibit the mandatory use of upload filters. Such a prohibition is the only way to avoid disproportionate limitations on access to information, freedom of expression, and personal data protection.

EU legislators must provide sufficient safeguards to minimise the risk to fundamental rights stemming from opaque automated decision-making. This is essential to protecting a safe and free internet, and to offering a safe harbour for those engaged in the indispensable work of documenting war crimes.
