Formalize ‘Killer Robots’ Talks; Aim for Ban

Fully Autonomous Weapons on Disarmament Conference Agenda

(Geneva, December 9, 2016) – Governments should agree at the upcoming multilateral disarmament meeting in Geneva to formalize their talks on fully autonomous weapons, with an eye toward negotiating a preemptive ban, Human Rights Watch said in a report released today.

The 49-page report, “Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban,” rebuts 16 key arguments against a ban on fully autonomous weapons.

Fully autonomous weapons, also known as lethal autonomous weapons systems and "killer robots," would be able to select and attack targets without meaningful human control. These weapons and others will be the subject of the five-year Review Conference of the Convention on Conventional Weapons (CCW) from December 12-16, 2016.

“It’s time for countries to move beyond the talking shop phase and pursue a preemptive ban,” said Bonnie Docherty, senior clinical instructor at Harvard Law School’s International Human Rights Clinic. “Governments should ensure that humans retain control over whom to target with their weapons and when to fire.”

The report is co-published by the Harvard clinic and Human Rights Watch, where Docherty is also a senior arms researcher.

The Clinic and Human Rights Watch examined the legal, moral, security, and other dangers of killer robots. They concluded that a ban is the only option for addressing all of the concerns. Other more incremental measures, such as adopting limited regulations on their use or codifying best practices for the development and acquisition of new weapons systems, have numerous shortcomings.

Countries participating in the Fifth Review Conference of the Convention on Conventional Weapons must decide by consensus on December 16 whether to continue deliberations on lethal autonomous weapons systems in 2017 and what shape those deliberations should take. Countries should establish a formal Group of Governmental Experts to delve more deeply into the problems of the weapons and to work toward new international law prohibiting them, said Human Rights Watch, which coordinates the Campaign to Stop Killer Robots.

Spurred to act by the efforts of the Campaign to Stop Killer Robots, countries that have joined the international treaty on conventional weapons have held three week-long informal meetings on lethal autonomous weapons systems since 2014. The formation of a Group of Governmental Experts at the review conference would compel countries to move beyond talk by formalizing the deliberations and creating the expectation of an outcome.

In past publications, the Clinic and Human Rights Watch have elaborated on the challenges that fully autonomous weapons would present for compliance with international humanitarian law and international human rights law and analyzed the lack of accountability that would exist for the unlawful harm caused by such weapons. The weapons would also cross a moral threshold, and their humanitarian and security risks would outweigh possible military benefits.

Several of the 121 countries that have joined the Convention on Conventional Weapons – including the United States, the United Kingdom, China, Israel, Russia, and South Korea – are developing weapons systems with increasing levels of autonomy. Critics who dismiss concerns about fully autonomous weapons depend on speculative arguments about the future of technology and the false presumption that technological developments could address all of the dangers posed by the weapons, Human Rights Watch and the Harvard clinic said.

Docherty will present the report at a Campaign to Stop Killer Robots briefing at 1:15 p.m. on December 14 in Conference Room XXIV at the United Nations in Geneva.

“The success of past disarmament treaties shows that an absolute prohibition on fully autonomous weapons would be achievable and effective,” Docherty said.

For more information, please contact:
Bonnie Docherty, [email protected]