Last month, Senior Clinical Instructor Bonnie Docherty traveled with students to Geneva for the first multilateral meeting of the Convention on Conventional Weapons devoted to fully autonomous weapons, or “killer robots.” Below is her recap of the week’s events, originally published on May 23, 2014, in the online forum Just Security.
“Taking on ‘Killer Robots’”
New weapons that could revolutionize killing are on the horizon. Lethal autonomous weapons systems, also called fully autonomous weapons or “killer robots,” would go beyond today’s armed drones. They would be able to select and fire on targets without meaningful human intervention. In other words, they could determine themselves when to take a human life.
Representatives from 87 countries gathered at the United Nations in Geneva last week to discuss concerns about this technology and possible ways to respond. The conference was the first multilateral meeting dedicated to lethal autonomous weapons systems. It represented a crucial step in a process that should result in a ban on these problematic weapons before it becomes too late to change course.
Human Rights Watch and Harvard Law School’s International Human Rights Clinic are calling for a pre-emptive prohibition on the development, production, and use of these weapons. The Campaign to Stop Killer Robots, a global coalition of 51 nongovernmental organizations coordinated by Human Rights Watch, is making the same call.
Overall, the talks in Geneva were productive and positive. The conference, under the auspices of the Convention on Conventional Weapons (CCW), attracted hundreds of delegates from governments, the United Nations, the International Committee of the Red Cross, and nongovernmental groups, setting a record for a CCW meeting. Participants engaged in four days of substantive discussions about the technical, ethical, legal, and operational concerns raised by fully autonomous weapons.
This “informal meeting of experts” was also noteworthy for its timeliness, unusual for a CCW conference. It took place just a year and a half after Human Rights Watch and the Harvard clinic issued a groundbreaking report on these weapons, Losing Humanity: The Case against Killer Robots, which the UN website credited with bringing the issue to “the international community’s attention.”
The meeting illuminated both areas of emerging agreement and ongoing points of contention. At their next meeting in November, states parties to the Convention on Conventional Weapons should show that they are serious about taking action to deal with fully autonomous weapons and adopt a mandate for even deeper discussions in 2015.
Areas of Emerging Agreement
Four promising themes emerged at the recent meeting. First, there was widespread support for continuing discussions. Countries made clear that they saw last week’s session as merely an initial foray into the issue. Many delegates also explicitly recognized the importance of continuing to involve nongovernmental groups, including the Campaign to Stop Killer Robots and its member organizations.
Second, a significant number of countries expressed particular concern about the ethical problems raised by fully autonomous weapons. The chair’s final report noted that these countries “stressed the fact that the possibility for a robotic system to acquire capacities of ‘moral reasoning’ and ‘judgment’ was highly questionable.” Furthermore, these machines could not understand and respect the value of life, yet they would be given the power to determine when to take it away. Fully autonomous weapons would thus threaten to undermine human dignity.
Third, many countries emphasized that weapons systems should always fall under “meaningful human control.” While the parameters of this concept will require careful definition, obligating nations to maintain that control is vital to averting a watershed in the nature of warfare that could endanger civilians and soldiers alike.
Finally, countries frequently noted in their statements the relevance of international human rights law as well as international humanitarian law. Human rights law applies in peace and war, and it would govern the use of these weapons not only on the battlefield but also in law enforcement operations. In a new report released last week, Shaking the Foundations: The Human Rights Implications of Killer Robots, Human Rights Watch and the Harvard clinic found that fully autonomous weapons could contravene the rights to life and a remedy as well as the principle of dignity.
Legal Debate
The most contentious part of the discussion surrounded the application of international humanitarian law to fully autonomous weapons. The debate echoed many of the points raised in a second paper that Human Rights Watch and the Harvard clinic released at the meeting. “Advancing the Debate on Killer Robots” responds directly to 12 critiques of a ban on the weapons.
The meeting revealed a divergence of views about the adequacy of international humanitarian law to deal with fully autonomous weapons. Critics of a ban argue that problematic use of these weapons would violate existing law and that supplementary law is unnecessary. A new treaty banning the weapons, however, would bring clarity, minimizing the need for case-by-case determinations of lawfulness and facilitating enforcement. It would also increase the stigma against the weapons, which can influence even states not party to a treaty to abide by a ban. In addition, a treaty dedicated to fully autonomous weapons could address proliferation, unlike traditional international humanitarian law, which focuses on use.
The debate about the adequacy of international humanitarian law to deal with fully autonomous weapons is reminiscent of arguments made in earlier Convention on Conventional Weapons meetings about cluster munitions. The adoption of the 2008 Convention on Cluster Munitions by 107 states resolved that dispute. Prohibitions on five other weapons that cause unacceptable humanitarian harm—antipersonnel landmines, blinding lasers, chemical weapons, biological weapons, and poison gas—provide additional precedent for new law. While most states are reserving judgment on the best solution to deal with the problems posed by fully autonomous weapons, five countries called for a ban last week.
Participants in last week’s meeting also disagreed about when action should be taken. Critics of a ban supported a wait-and-see approach, arguing that improvements in technology could address the obstacles to compliance with international humanitarian law. There are serious doubts, however, that robots could ever replicate certain complex human qualities, such as judgment, necessary to comply with the principles of distinction and proportionality. Furthermore, grave ethical concerns, the likelihood of proliferation and a robotic arms race, an accountability gap, and the prospect of premature deployment all suggest a technological fix would not suffice to address the weapons’ problems.
Action should be taken now before countries invest more in the technology and become less willing to give it up. The pre-emptive ban on blinding lasers in Protocol IV to the Convention on Conventional Weapons can serve as a useful model.
Next Steps
Despite some points of disagreement, the meeting advanced efforts to deal with fully autonomous weapons. Nations need to keep up momentum, however, to avoid having such meetings become what some have called a “talk shop.” In the short term, individual countries should establish national moratoria on fully autonomous weapons.
In November, the parties to the Convention on Conventional Weapons should adopt a mandate to study the issue in greater depth in 2015. They should agree to hold three to four weeks of formal meetings, known as a Group of Governmental Experts. They should also be clear that the meetings would be a step toward negotiating a new protocol on fully autonomous weapons. Such intense discussions would move the debate forward. They would show that the treaty members are committed to addressing this issue and that the Convention on Conventional Weapons is re-emerging as an important source of international humanitarian law.