EU Council presidency conclusions on Artificial Intelligence and human rights failed to secure unanimous backing from member states last week after Poland refused to support the text, objecting to the inclusion of the term ‘gender equality,’ EURACTIV has learned.
The German presidency of the EU Council adopted on 21 October conclusions on the charter of fundamental rights in the context of artificial intelligence, but stated that ‘one member state’ objected to the inclusion of ‘gender equality’ on the grounds that “neither the Treaties nor the EU Charter of Fundamental Rights uses the term ‘gender.’”
The other 26 member states, however, insisted that the phrase be included in the conclusions. Multiple EU sources confirmed to EURACTIV that Poland was the member state that opposed the phrase.
EURACTIV followed up with Poland, which confirmed its opposition. In a statement, Ambassador Andrzej Sadoś said the country “strongly supports equality between women and men, as it stands in all EU documents based on terminology adopted in primary law.”
“The Treaties refer to equality between women and men, similar to the Charter of Fundamental Rights. The meaning of ‘gender’ is unclear; the lack of definition and unambiguous understanding for all member states may cause semantic problems. Neither the Treaties nor the Charter of Fundamental Rights use the term ‘gender’,” Sadoś continued.
The ambassador added that his country would work to adopt the conclusions on fundamental rights in the future.
Poland has recently come under heavy criticism in Brussels for a series of measures put in place that some believe encroach on gender rights.
A court ruling in the country recently outlawed the majority of abortions, with a small number of exceptions, including in cases of rape or incest.
This followed Poland’s July decision to withdraw from the Istanbul Convention on Preventing and Combating Violence against Women and Domestic Violence, at a time when dozens of conservative and Catholic regions across the country were adopting resolutions that criticised ‘LGBT ideology.’
‘Clear legal requirements for AI’
Elsewhere in the German Presidency’s conclusions on Artificial Intelligence and fundamental rights, certain nefarious applications of next-generation technologies were cited as a potential cause for concern.
Both the use of mass surveillance technologies and facial recognition applications should be subject to ‘clear legal requirements’, the text said.
“We note the ongoing discussion on whether such systems should be used in principle and on possible bans on their use. To the extent that these systems are to be used, we recognize that clear legal requirements need to be formulated beforehand.”
The European Commission has not entirely discounted the possibility of a ban on the use of facial recognition in public spaces. Speaking to MEPs in the European Parliament’s Internal Market Committee in September, Kilian Gross of the Commission’s DG Connect said that all options were still on the table with regard to a possible ban on the technology.
Ahead of the official publication of the Commission’s White Paper on Artificial Intelligence in February this year, which presented its framework strategy for mitigating the future risks of AI, a leaked working document revealed that the Commission had been weighing up the possibility of introducing a temporary moratorium on facial recognition technologies.
Parliament on fundamental rights and AI
Elsewhere in Brussels, last week MEPs adopted proposals on how the EU can best regulate Artificial Intelligence (AI) as a means to foster innovation, ethical standards, and trust.
One initiative report, led by Spanish Socialist Iban García del Blanco, urged the Commission to develop ethical standards that follow a human-centric approach, while also safeguarding against bias and discrimination.
Del Blanco’s text duly highlighted the importance of preventing bias in AI technologies, noting that discrimination should not be permitted on grounds including race, gender and sexual orientation, among other characteristics.
[Edited by Zoran Radosavljevic]