Governments should negotiate a treaty that prohibits fully autonomous weapons and requires meaningful human control over the use of force, the International Human Rights Clinic and the Campaign to Stop Killer Robots said in two publications released last week.
The first paper outlines key elements of the proposed treaty. The second paper expands on the proposal and responds to some of the challenging “frequently asked questions.”
Bonnie Docherty, the Clinic’s Associate Director of Armed Conflict and Civilian Protection, presented the papers on April 2 at the Berlin Forum on Lethal Autonomous Weapons Systems, the first-ever major digital disarmament meeting. While Germany had originally planned to hold an in-person meeting in Berlin, its decision to move it online exemplifies the disarmament community’s efforts to continue work on this key issue during the COVID-19 pandemic.
Fully autonomous weapons, also known as lethal autonomous weapons systems or “killer robots,” are weapons systems that would select and engage targets without meaningful human control. The technological capacity for autonomy in these systems raises a host of moral, legal, and ethical concerns. In light of these concerns, a new treaty is needed to clarify and strengthen existing international law.
Over the past year, the Clinic has worked closely with the Campaign to Stop Killer Robots, a global civil society coalition, to develop the proposal for key elements for a new treaty. In crafting the elements, the Clinic team analyzed government positions, examined legal precedent, and reviewed technical publications. The team also consulted with lawyers, ethicists, technology experts, civil society representatives, and others during UN conferences and the Campaign’s global meeting in Argentina in February.
The proposed treaty covers all weapons systems that select and engage targets on the basis of sensor processing, rather than human input. The broad scope is designed to ensure that systems posing legal and ethical concerns do not escape regulation. While this scope requires examining existing weapons, the proposed restrictions are narrow and directed at future systems.
The heart of the treaty proposed by the Campaign and the Clinic consists of three key elements: (1) a general obligation to maintain meaningful human control over the use of force, (2) prohibitions on specific weapons systems that independently select and engage targets and by their nature pose fundamental moral or legal problems, and (3) specific positive obligations to ensure that meaningful human control is maintained in the use of all other systems that select and engage targets.
The concept of meaningful human control frames the treaty proposal. Retaining meaningful human control over the use of force is essential because many of the problems associated with fully autonomous weapons stem from the lack of such control. The Clinic distilled the concept into three categories of components:
– Decision-making components, which give human operators the information and ability to assess the legality and morality of the use of force;
– Technological components, embedded features of a weapon system that, for example, ensure predictability and reliability or allow a human to intervene after a system is activated; and
– Operational components, which limit when and where a system can operate.
States parties to the Convention on Conventional Weapons (CCW) began discussing lethal autonomous weapons systems in 2014, but progress has been slow. Last November, these states adopted a set of guiding principles and agreed to “consider the development of aspects of the normative and operational framework” in advance of the 2021 CCW Review Conference. States parties were scheduled to meet for ten days in 2020, but concerns over the pandemic could lead to postponement of the June and August meetings. While in-person gatherings remain suspended, states, international organizations, and civil society will have to use new tools, such as digital forums, to advance the issue. Their ultimate goal should be a new legally binding treaty that provides a global solution to the problems raised by fully autonomous weapons.
Harvard Law students Alev Erhan JD’21, Ayako Fujihara JD’21, Matthew Griechen JD’19, Richard Millett JD’20, Daniel Moubayed JD’20, and Shaiba Rather JD’21 contributed to the Campaign’s new papers through their research, analysis, and writing.