New Report Identifies More Effective Routes to Protecting Civilians, Avoiding Insecurity
Governments should move the stalled discussions of a treaty on autonomous weapons systems, known as “killer robots,” to a new international forum, Harvard Law School’s International Human Rights Clinic and Human Rights Watch said in a new report. Such weapons systems operate without meaningful human control, delegating life-and-death decisions to machines.
The 40-page report, “An Agenda for Action: Alternative Processes for Negotiating a Killer Robots Treaty,” was released in advance of a UN disarmament meeting that will convene this week and address the topic. The report proposes that countries initiate a treaty-making process elsewhere, based on past humanitarian disarmament models such as the treaty banning cluster munitions.
“A new international treaty that addresses autonomous weapons systems needs a more appropriate forum for negotiations,” said Bonnie Docherty, associate director of armed conflict and civilian protection at the Clinic, senior arms researcher at Human Rights Watch, and lead author of the report. “There’s ample precedent to show that an alternative process to create legal rules on killer robots is viable and desirable, and countries need to act now to keep pace with technological developments.”
More than 70 countries as well as nongovernmental organizations and the International Committee of the Red Cross regard a new treaty with prohibitions and restrictions as necessary, urgent, and achievable. United Nations Secretary-General António Guterres called for “internationally agreed limits” on weapons systems that could, by themselves, target and attack human beings, describing such weapons as “morally repugnant and politically unacceptable.”
Talks on concerns about lethal autonomous weapons systems have been underway under the auspices of the Convention on Conventional Weapons (CCW) since 2014. Countries will reconvene at the UN in Geneva on November 16-18, 2022, for the treaty’s annual meeting, but there is no indication they will agree to negotiate a new legally binding instrument via the CCW in 2023 or in the near future.
The main reason for the lack of progress under the CCW is that its member countries rely on a consensus approach to decision-making, which means a single country can reject a proposal, even if every other country agrees to it. A handful of major military powers have repeatedly blocked proposals to move to negotiations, notably India and Russia over the past year. Both countries also attempted to block nongovernmental organizations from participating in discussions in 2022.
India and Russia, as well as Australia, China, Iran, Israel, South Korea, Turkey, the United Kingdom, and the United States are investing heavily in the military applications of artificial intelligence and related technologies to develop air, land, and sea-based autonomous weapons systems.
Given the shortcomings of the CCW forum, alternative processes for negotiating a new treaty should be explored, the Clinic and Human Rights Watch said. One option is an independent process outside of the UN, as was used for the treaties banning antipersonnel landmines and cluster munitions. Another is via the UN General Assembly, which initiated negotiations of the nuclear weapons ban treaty.
Four characteristics of these alternative processes are particularly conducive to achieving strong treaties in a timely fashion: a common purpose; voting-based decision-making; clear and ambitious deadlines; and a commitment to inclusivity, the Clinic and Human Rights Watch said.
Countries have already expressed broad support for essential elements needed to address concerns over removing human control from the use of force. A new international treaty should prohibit autonomous weapons systems that inherently lack meaningful human control as well as systems that target people. It should contain positive obligations to ensure meaningful human control in other weapons systems with autonomy. “Meaningful human control” is widely understood to require that technology is understandable, predictable, and constrained in space and time.
In October, 70 countries expressed their support for “internationally agreed rules and limits” on autonomous weapons systems in a joint statement to the UN General Assembly’s First Committee on Disarmament and International Security.
There have also been more expressions of support for regulation from industry. In October, Boston Dynamics and five other robotics companies pledged not to weaponize their advanced mobile robots and called on others to “make similar pledges not to build, authorize, support, or enable the attachment of weaponry to such robots.”
Human Rights Watch is a cofounder of Stop Killer Robots, the coalition of more than 190 nongovernmental organizations in 67 countries that advocates for new international law on autonomy in weapons systems.
“The longer the killer robots issue stays stuck in the current forum, the more time developers of autonomous weapons systems have to hone new technologies and achieve commercial viability,” Docherty said. “A new treaty would help stem arms races and avoid proliferation by stigmatizing the removal of human control.”
Bonnie Docherty co-authored and supervised the production of this report. Clinic students Arnaaz Ameer, LLM ’23, Ryen Bani-Hashemi, JD ’22, Madeleine Cavanagh, JD ’23, Alexa Santry, JD ’24, Elliot Serbin, JD ’24, Lisa Wang, JD ’22, and Rosalinn Zahau, LLM ’22, also contributed significantly to the research and writing of this report.