How EU students are being forced into a surveillance nightmare

DISCLAIMER: All opinions in this column reflect the views of the author(s), not of EURACTIV Media network.

Requiring students to submit to biometric surveillance should be illegal, writes Dr. Nakeema Stefflbauer.

Dr. Nakeema Stefflbauer is a Berlin-based digital transformation executive. She is the founder of FrauenLoop.org, a women’s programming non-profit in Berlin, and the Techincolor.eu network of racially diverse tech leaders across Europe.

The past year has been a clash of cultures in the worst of ways: from customer-facing private companies to public schools and universities, sectors ill-equipped for online operation have struggled for survival. Real-time communication shifted first, as in-person meetings morphed into Hopin, Whereby, Hangouts, Slack, or Zoom on-demand sessions.

Some could choose how to access the internet according to their needs. But many did not, and still do not, have such options. My children’s private school pays for a subscription to Microsoft Teams: when remote classes were mandated, they immediately moved lessons online. But many of my friends throughout Europe with children in public schools laughed when I mentioned online learning. Or, as one parent told me, “we got a book list and got wished good luck.”

It’s not hard to empathize with schools that lack both the budget and the technical expertise to adapt to sudden digital disruption. However, switching from public infrastructure to private tech company solutions is usually a fast track to handing over control of service quality, privacy and other rights, and accessibility standards to businesses that are under no obligation to uphold them.

What follows is an expansion of private infrastructure for keeping an eye on people, whether they are suspected terrorists, or refugees, or youthful applicants to institutes of higher education. For criminal suspects, there is Clearview AI (to match photos of faces scraped from the internet with police watch lists); for immigrants and refugees, there is General Atomics (real-time facial recognition software for unmanned border control). For aspiring university students, there is Proctorio, ProctorU, and other software (biometric eye-tracking scans, facial and environment surveillance, and sound monitoring).

As an EU resident, I feel dismayed by the relentless push to sell American surveillance technologies to local law enforcement, to border control forces and, whenever possible, to European public institutions that try to digitally “catch up.”

Inside Europe, we often hear about the ‘global AI race’ as if it’s a matter of who will first invent flying cars and lifelike robots. In reality, we’re facing the rollout of a range of spying tools that may soon create a “perpetual line-up” for us to access schools and hospitals.

This is not an exaggeration: this week, my child received instructions for the entrance exam of a public university in the Netherlands. The mandatory software for all students who wish to apply to the school? Proctorio.

Any online search offers a comprehensive, and frightening, picture of this software. Black students in Canada and the US report being obliged to shine interrogation-style bright lights onto their faces for Proctorio to register them.

North American law students urinate in bottles and buckets for fear of their exams being terminated over a bio break. No solutions are offered for the student whose younger sibling speaking could cause them to be flagged for ‘cheating.’ Proctorio offers no alternatives for neurodiverse students, for whom looking away from the camera, or at their laps, or blinking too much (or not enough) can trigger Proctorio’s ‘abnormal behaviour’ tag.

Civil society, human rights, and privacy advocates have rung the alarm about invasive and discriminatory biometric surveillance via a pan-European ‘Reclaim Your Face’ campaign and a recent 40-page report demanding fundamental rights protection by the EU Commission.

In January, 62 digital and human rights organizations called on the European Commission to include limits on the most harmful uses of AI in legislation, including student surveillance technology. With an EU legislative proposal on AI due to be published only in April, now is a perfect time for EU public administrators in search of quick fixes to ignore the waves of US student petitions that have described Proctorio software as “unsafe and a complete violation of students’ privacy.”

It is possible that many don’t know how fast surveillance technology becomes ubiquitous. Recording, opaquely ranking and digitally stalking anyone is unethical. Requiring students to submit to biometric surveillance should be illegal. More than just a miserable consequence of being young in the age of Coronavirus, lack of regulation on the use of proctoring software is almost a guarantee of discriminatory impact.

Some students may accept having their movements tracked as if they were criminals prowling a bank after hours instead of children taking a test in their homes. But after a year when young people everywhere have suffered massively from COVID-19 disruptions, it is unconscionable to allow private companies to deprive them of the rights to privacy and dignity enshrined in the EU Charter of Fundamental Rights.
