UN talks continue this week to try to turn the screw on lethal autonomous weapon systems (LAWS), or what have been termed “killer robots”.
The discussions seek to address the ramifications of autonomous weapons systems for human rights, as well as the ethical and security issues that arise from the integration of such systems into modern warfare.
Ambassador Amandeep Singh Gill of India is chairing talks and has the challenging task of moderating a controversial issue that has attracted broad media attention and vociferous criticism from stakeholders.
While Russia clearly opposes any potential global ban on autonomous weapons systems and the US maintains an ambiguous stance, EU nations will no doubt want to substantiate their positions set out during the first round of talks in April.
As reported by EURACTIV, the EU’s position in April was that “humans should make the decisions with regard to the use of lethal force, exert sufficient control over lethal weapons systems they use, and remain accountable for decisions over life and death.”
Since the first round of talks in April, the EU has taken pragmatic steps to stifle the rise of ‘killer robots’.
In developing a common position on the use of autonomous weapon systems in warfare, the European Parliament adopted a resolution in July calling for “an international ban on weapon systems that lack human control over the use of force”.
Meanwhile, some states have taken it upon themselves to crack down on the development of killer robots.
The Belgian parliament’s defence committee adopted a resolution in July asking the government to support an international consensus against the employment of autonomous weapons systems. In the same month, Italy’s Network for Disarmament organised a conference at the national parliament in Rome to discuss tighter regulation of autonomous weapons systems.
However, other states have bucked the trend. In June, the UK government refused to consider a revision of its definition of autonomous weapons systems that would align more closely to international standards.
The United Nations Association of the UK had previously called on the British government to clamp down on the development of advanced technologies that could lead to robotic warfare.
Britain’s reluctance to cooperate fully with the international community is echoed by France and Germany, both of which have previously stood against a complete ban on autonomous weapons systems.
This week, activists keen to influence the outcome of the talks have voiced concerns. The Campaign to Stop Killer Robots has been pushing for an all-out ban, while Amnesty International has warned of the need for legislation to catch up with technological developments.
Rasha Abdul Rahim, a researcher from Amnesty International, said in a statement on Monday (27 August): “Killer robots are no longer the stuff of science fiction. From artificially intelligent drones to automated guns that can choose their own targets, technological advances in weaponry are far outpacing international law.”
The use of fully autonomous weapons has also come under criticism from Human Rights Watch (HRW). HRW recently published a report stating that the development, production and use of autonomous weapons directly contravenes the Martens Clause, a commonly cited referent in international humanitarian law and disarmament treaties.
The clause states: “civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.”
#Fullyautonomousweapons would lack the human judgement necessary to evaluate the proportionality of an attack, distinguish civilian from combatant, and abide by other core principles of the laws of war. Just one of the reasons we're at the #CCWUN today. #KillerRobots pic.twitter.com/qMmlMnkXHA
— Campaign to Stop Killer Robots (@BanKillerRobots) August 28, 2018
Activists have suggested that the element of ‘public conscience’, which requires the application of moral standards, cannot be replicated in the decision-making processes of machines. As such, they argue, no option other than a ban on these systems should be considered.
“A ban on fully autonomous weapons systems could prevent some truly dystopian scenarios,” Rahim said. “We are calling on states present in Geneva this week to act with the urgency this issue demands, and come up with an ambitious mandate to address the numerous risks posed by autonomous weapons.”
This week’s discussions take place ahead of EU foreign policy chief Federica Mogherini’s statement on autonomous weapon systems at September’s Strasbourg plenary session.