Don’t throw out the Digital Services Act’s key accountability tools

DISCLAIMER: All opinions in this column reflect the views of the author(s), not of EURACTIV Media network.

The Digital Services Act (DSA) is a landmark legislative proposal to regulate online content and services.

The Digital Services Act’s provisions on risk assessment and audit must stay and be improved if we are to hold tech companies accountable for harmful business models and rights violations, write Nienke Palstra, Emma Ruby-Sachs, Claudia Prettner, and Jesse Lehrich.

Nienke Palstra is a senior campaigner at Global Witness.

Emma Ruby-Sachs is the executive director at SumOfUs.

Claudia Prettner is a legal & policy adviser at Amnesty International.

Jesse Lehrich is the founder at Accountable Tech.

As temperatures rise in Brussels this summer, the negotiations for the EU’s Digital Services Act are similarly heating up.

It may have gone unnoticed last month that the draft opinion from the civil liberties committee – which now has joint responsibility for the file in the European Parliament – written by German Pirate MEP Patrick Breyer, makes a radical proposal: to delete two of the regulation’s key accountability provisions for very large online platforms, risk assessments (Article 26) and audits (Article 28).

The rationale Breyer gives for deleting these provisions is twofold: concerns over freedom of expression – the provisions extend the legislation’s scope to content that may be harmful but legal – and the power they hand to platforms and private auditing bodies rather than public authorities.

However, deleting these Articles would be counterproductive to protecting the very right Breyer is concerned about. Risk assessments and audits are needed to hold tech platforms to account for harmful design choices, such as recommender and online advertising systems, and to ensure risks stemming from their services are properly countered.

Platforms’ algorithmic recommender systems are more than neutral sorting tools. All too often they act as manipulation machines, designed to amplify the most toxic content – including violence, harassment and conspiracy theories.

Abusive and hateful content has been shown to particularly target women, people of colour, LGBTI people and other vulnerable groups – with the outcome of silencing people or driving them offline, thereby directly undermining the right to freedom of expression.

The way these Articles are currently drafted is not perfect. The wording leaves too much leeway for platforms to find loopholes and to comply with weak or problematic follow-up action.

Article 26 is currently based on a discrete list of cherry-picked fundamental rights for platforms to assess themselves against, missing some of the most obvious and widespread rights breaches – including data protection and consumer rights.

For audits, there are valid concerns that they will replicate some of the problems we already see with financial audits – lining the pockets of the Big Four accountancy firms while serious questions over conflicts of interest persist.

But to do away with the provisions on risk assessment and audit entirely would be to throw the baby out with the bathwater.

The Digital Services Act proposal attempts to walk a fine and thorny line: protecting free speech and preventing “upload filters” online while still holding tech companies accountable for their harmful business practices and erosion of fundamental rights.

Without risk assessments and audits for Big Tech, how else will regulators be able to hold platforms’ feet to the fire for systematic rights violations?

Designed in the right way, risk assessments and audits are an elegant solution to flexibly and robustly interrogate the impact of big tech companies on our societies and rights and force them to change course at a systems level. In short, risk assessment and audit present a route out of the content moderation whack-a-mole.

The text can and should be improved by making the risk assessments comprehensive, looking at the real and potential impacts of tech companies’ services and their business models on all fundamental rights.

This should be better aligned with international standards on human rights due diligence, such as the UN Guiding Principles on Business and Human Rights. In addition, it is critical that platforms are not merely required to mitigate systemic risks but, where those risks are identified, are also forced to cease them altogether.

Breyer is right that serious consideration must be given to how audits are carried out – do we really want risk assessments to be outsourced to private entities handpicked by the very businesses they are meant to assess?

Audit is a critical tool in the DSA’s arsenal, one that should be carried out by independently appointed experts and not necessarily by traditional accounting firms. For audits to be meaningful, these experts must have expertise in platform design and be given access to the information and data necessary to do their job – including platforms’ algorithms.

This type of auditor might not exist yet, but the EU should strive to encourage its emergence rather than have no audits at all.

The Digital Services Act needs improving, but not in the way that Breyer suggests.
