No progress in UN talks on regulating lethal autonomous weapons

A killer robot. [International Committee of the Red Cross]

Attempts to regulate lethal autonomous weapon systems (LAWs), often dubbed “killer robots”, have once again ended in stalemate as UN talks in November produced few results. Europe, meanwhile, is struggling to define its role in the regulation efforts.

So far, discussions have sought to address the ramifications of autonomous weapon systems for human rights, as well as the ethical and security issues arising from the integration of such systems into modern warfare.

Autonomous weapons are technologies such as drones, tanks and other machinery controlled by computers running artificial intelligence systems and programmed to select and attack targets without human control.

At the UN level, at least 28 governments are demanding a ban on artificial intelligence weapons, while both the US and Russia have blocked any moves to form legally binding agreements on autonomous weaponry.

Other major military powers, including China, Israel, South Korea and the UK, are also racing to develop autonomous weapons systems.

In September, China announced it would join the ban group, saying it would support prohibiting fully autonomous weapons, but clarified that Beijing opposed only their use on the battlefield, not their production and development.

No progress at UN level

At the November meeting of member countries of the Convention on Conventional Weapons (CCW) at the United Nations in Geneva, diplomats could not agree on a binding common approach and decided to continue talks on regulating lethal autonomous weapon systems, or fully autonomous weapons, for the next two years.

UN diplomats expressed disappointment that “the next two years will be spent on non-binding talks instead of concrete legal work”. According to them, it was mainly “Russia watering down the agenda contents and pushing back on all fronts, while developing their robot army until 2025.”

Unsatisfied with the lack of progress in the CCW, the Campaign to Stop Killer Robots, an international NGO, is increasingly urging countries to consider bypassing the convention entirely to negotiate a separate treaty.

“If the CCW cannot produce a credible outcome, alternative pathways must be pursued to avoid a future of autonomous warfare and violence,” the Campaign said.

In a speech to the Paris Peace Forum earlier this month, UN Secretary-General Antonio Guterres once again called for a new international treaty to ban LAWs, saying that “machines that have the power and discretion to kill without human intervention are politically unacceptable and morally despicable”.

“And a new arms race – the cyberarms race – is already underway. The danger is that the next war will be triggered by a massive cyberattack,” Guterres recently warned.

Activists against so-called “killer robots” have pleaded with world leaders to draft regulations for any such weapons heading into battle, fearing the systems could become dangerous through a cyber-attack or a programming error.

EU: Ban or regulate?

Almost three in every four Europeans want their governments to work for an international treaty prohibiting lethal autonomous weapon systems, according to a recent poll conducted across 10 European countries by the International Campaign to Abolish Nuclear Weapons (ICAN).

“Banning killer robots is both politically savvy and morally necessary,” said Mary Wareham, campaign coordinator and the arms division advocacy director at Human Rights Watch.

“European states should take the lead and open ban treaty negotiations if they are serious about protecting the world from this horrific development,” she added.

The EU took a stance against “killer robots” last year when the European Parliament passed a resolution calling for an international ban on the development, production and use of weapons that kill without human involvement.

Experts, however, point out that technological development might already have progressed too far for a full ban to be implemented and argue for regulatory measures instead.

“Autonomy is neither new nor problematic per se, because defensive weapons systems such as missile defence 30 years ago already had such features,” said Frank Sauer, senior researcher at the Bundeswehr University Munich.

According to him, the novelty is using the “select and engage” option without human intervention in all kinds of weapons systems, not only defensive ones.

“If we delegate the decision to kill a combatant on the battlefield, we are infringing on human dignity by leaving the act to an anonymous machine that does not understand what human life is,” Sauer said.

EU lawmakers are currently looking for ways to impose limits and standards, working towards a common definition that would steer the creation of norms, even in the absence of binding legal instruments.

A majority argues that an agreement similar to the one reached in the past on chemical weapons should be established for LAWs, branding their users as pariahs.

EU’s military projects

Asked how the EU will ensure that LAWs do not find their way into its military initiatives or research conducted under the European Defence Fund (EDF), Green MEP Hannah Neumann recently stressed that the main problem with EU projects is transparency.

“I do not understand why the European Parliament has given up the power of oversight,” she said, adding that the EDF statutes include an ethical committee controlling funding proposals.

Nevertheless, Neumann stressed that no full framework is yet in place and that attempts to use the civilian research and development framework for defence research pose a further problem.

“At the moment, LAWs are banned, but there is technology in development that only needs LAWs components added to make the systems autonomous,” the Green MEP said, pointing out that bilateral projects, such as the future European next-generation fighter jet (FCAS), are more difficult to monitor.

Unlike fighter jets currently in use, the FCAS will also include a range of associated systems, such as swarms of unmanned aerial vehicles (drones) interconnected by a cloud, surveillance and command aircraft, cruise missiles, satellites and ground stations.

[Edited by Zoran Radosavljevic]
