Trade secrets don’t trump our rights

DISCLAIMER: All opinions in this column reflect the views of the author(s), not of EURACTIV Media network.

The DSA could signal change when it comes to the power of big tech platforms, but this might be jeopardised by proposed language that could enable an overly broad “trade secrets” exemption. [Shutterstock / kenary820]

Europe’s sophisticated draft legislation – the Digital Services Act (DSA) – could finally redress the asymmetry of power and information between Big Tech and citizens by giving regulators and vetted researchers the data access they need to look ‘under the hood.’

Sarah Andrew is the Legal Director at Avaaz.

Claudia Prettner is a Legal Advisor at Amnesty International.

But there is a threat. And it comes from proposed language that could enable an overly broad “trade secrets” exemption. Big Tech will want that language in so they can continue to obfuscate the true societal cost of optimising their algorithms for maximum engagement. That’s why Big Tech’s highest-profile whistleblower, Frances Haugen, warned lawmakers on the floor of the European Parliament: “if you write a broad exemption from transparency for anything classified as a ‘trade secret’ — the companies will say everything is a trade secret.”

Every time we demand data transparency and accountability, they will claim a trade secrets exemption – potentially forcing frequent, lengthy and costly litigation – an attrition battle only the tech giants can afford.

Such broad exemptions should simply not be allowed, as the companies’ commercial interests do not trump the protection of the fundamental rights of European citizens using these platforms. This principle is clearly established in the EU Directive on the Protection of Trade Secrets, but overly broad language in the DSA could undermine it in the platforms’ favour. And as we already know, the platforms have a track record of abusing our rights, routinely withholding information and actively misleading both regulators and the public.

When asked during a congressional hearing in March whether Facebook’s platforms “harm children”, Facebook CEO Mark Zuckerberg said, “I don’t believe so.” However, based on Facebook’s own internal research, revealed by whistleblower Frances Haugen, “13.5% of teen girls on Instagram say the platform makes thoughts of ‘Suicide and Self Injury’ worse” and 17% say the platform, which Facebook owns, makes “Eating Issues” such as anorexia worse. The same research claims Facebook’s platforms “make body image issues worse for 1 in 3 teen girls.” It took a whistleblower to reveal these algorithmic harms.

Facebook misled investors and the public about its role in perpetuating misinformation and violent extremism related to the 2020 US election and January 6 insurrection in Washington DC. The company repeatedly claimed it was taking all necessary measures to address such content. In reality, documents reveal that it knew its algorithms and platforms promoted this type of harmful content and it failed to deploy internally-recommended or lasting counter-measures in favour of promoting growth at all costs.

The EU must also not forget that it has already fined Facebook $122 million for providing “misleading information” about the company’s takeover of WhatsApp. After first claiming that there was no possibility of “reliable automated matching between Facebook users’ accounts and WhatsApp users’ accounts”, the company released new terms of service that raised the possibility of doing exactly that.    

Platforms will of course want to protect the secrets of their algorithms because they are their secret “recipe”. But for all of the reasons above, it is vital that we protect the beating heart of the DSA — which seeks to create a regime that will ensure the largest platform monopolies are forced to anticipate and address systemic risks that pose a threat to our democracies, social cohesion and fundamental rights, including the rights of children. Platforms will be required to identify risks themselves, but the law could finally also give third-party vetted researchers the power to examine the data independently after a request from the Digital Services Coordinator of establishment or the Commission. Indeed, the law should expand the definition of ‘vetted researchers’ to include civil society organisations with a track record of rigorous research and integrity.

Shepherding this complex legislation is a tough job at the best of times, and we admire the long hours and thought that have gone into shaping this law during an unprecedented pandemic. The DSA holds such promise in reversing the asymmetry of knowledge where the platforms hold all the cards – and we applaud the efforts of those legislators who have continued to push for civil society’s right to access data alongside regulators and academic institutions. We urge the staff and political representatives working on this right now not to inadvertently negate their own efforts, and to urgently remove the reference to this trade secrets exemption (in Article 31.6).

Only the removal of such a broad exemption will give us the crucial independent oversight that we as citizens, policy experts and campaigners have been fighting for over many years. Over 100 organisations with a combined membership of 70 million European citizens have signed the People’s Declaration demanding an end to this status quo, where Big Tech “calls the shots as judge and jury”, and to put us, the citizens, back in charge.
