Autonomous weapon systems (AWS) are no longer confined to works of fiction. The 2021 UN Panel of Experts on Libya reported the use of an STM Kargu-2 autonomous drone by Libyan Government of National Accord-affiliated forces to attack the logistics convoys of the Haftar Affiliated Forces. More recently, the Russian Federation’s invasion of Ukraine and the conflict in Gaza have accelerated the development of uncrewed technologies with autonomous capabilities. Ukraine’s attack on the Russian Black Sea Fleet using uncrewed surface vessels and Russia’s use of Iranian-made “kamikaze drones” against Ukrainian infrastructure are evidence of this trend. Similarly, Israel is employing artificial intelligence (AI) to enhance its targeting in Gaza.
While international debate on the need for new international law continues in the Group of Governmental Experts, established under the auspices of the Convention on Certain Conventional Weapons, questions concerning the legality of AWS remain unresolved. It is a fundamental rule of international humanitarian law (IHL) that the right of belligerents to choose means and methods of warfare is not unlimited. States must therefore consider how to ensure that their development and use of AWS is consistent with their existing international law obligations.
Legal Review Practice
For States party to the First Additional Protocol to the Geneva Conventions of 1949 (AP I), Article 36 requires:
In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.
Despite the large number of States party to AP I, there is limited evidence of State practice on the review of conventional, human-operated weapons, and even less in relation to AWS. Many States have called for the voluntary exchange of best practices for the weapons review of AWS; however, until recently, no State had publicly disclosed its national approach to the weapons review of AWS.
While international law does not mandate a specific approach to weapons reviews, State practice suggests four broad steps in what may be described as the “traditional” approach. First, a State identifies the need for a weapon review based on a national definition of a weapon, means or method of warfare. Second, if a review is required, the State considers whether the weapon is specifically prohibited or restricted by international law binding on the reviewing State. Third, if no specific prohibition or restriction applies, the State determines whether the general prohibitions against certain weapon effects (superfluous injury or unnecessary suffering, indiscriminate effects) prohibit the weapon or certain uses of it. Finally, the fourth step allows a State to apply national policy, for example limitations on certain weapon uses, that goes beyond its international law obligations.
The focus of the traditional weapon review is a weapon’s normal or expected use. The analysis does not normally extend to the targeting rules governing methods of use, including distinction, proportionality, and precautions in attack, which relate to the use of the weapon in specific circumstances. This reflects the purpose of the traditional weapon review, which is to determine whether a weapon is lawful and capable of lawful use; responsibility for its actual use rests with commanders and weapon users.
Absent new international law specifically prohibiting or regulating AWS, the legal risks posed by autonomy in weapons stem less from the legality of the weapons per se and more from how they are used in armed conflict. To address such risks, a weapon review of an AWS must analyze whether a system designed to perform targeting functions can do so in compliance with IHL targeting rules. This may oblige States to modify their weapon review processes to address the risks associated with the use of AWS in armed conflict. This could occur in several ways.
Functional Review Step
States may adopt a “functional review” step in addition to the traditional weapon review steps. The focus of the functional review step is the AWS software, including any AI, and its ability to perform targeting functions in compliance with IHL. The functional review may precede the traditional weapon review steps or, in the case of an existing weapon enhanced by AI, follow them. It analyzes only those AWS functions that are regulated by targeting rules, including distinction, proportionality, and precautions, and considers the legal risks arising from the performance of these functions in different environments and operational circumstances. It also permits the State to consider national AWS or AI policy addressing legal and ethical risk mitigation, human control, reliability, predictability, and explainability.
While a State can design its own process, the functional review may include the following elements:
- identify the IHL rules that govern the AWS targeting function;
- identify the standards of legal compliance required to determine legality for that function;
- identify the operational context in which the AWS will perform the function;
- obtain the AWS’s technical, functional, and performance data relevant to the function;
- conduct a legal risk analysis for the function in light of the intended use in the different operational contexts;
- determine the appropriate level of human control over the AWS to ensure legality;
- confirm lines of human responsibility;
- identify issues of reliability, predictability, and explainability that raise legal risk.
The functional review step will require multidisciplinary input. In addition, to allow contextual analysis, the AWS’s functions must be tested using hypothetical use cases or scenarios consistent with its normal or expected use. These scenarios should reflect realistic situations and include variations in terrain, weather, and tactical circumstances that will affect the AWS’s operation. Such testing may reveal that the AWS can perform a specific function lawfully in one situation but not in another.
Expanded Focus of the Review
In addition to a functional review step, States may expand the focus of their weapon review process across the AWS lifecycle, from inception, through acquisition, and during in-service use. By expanding the weapon review process in this way, a State can achieve different weapon review outcomes across three broad stages:
- an initial informative stage to advise those responsible for the design and development of the AWS of the State’s international law obligations;
- a determinative stage (including the traditional weapons review steps) to determine the legality of the AWS prior to its use in armed conflict;
- a governance stage to monitor the ongoing legality of the AWS during its use in armed conflict.
The Australian Defence Force has recently revised its internal Guide to the Legal Review of New Weapons, Means or Methods of Warfare. The revised guide consists of three parts. Part I provides definitions, an overview of the weapon review process, and individual responsibilities. The new Part II is written for the legal officers who conduct weapons reviews and explains the traditional weapons review steps and the legal analysis necessary in relation to Australia’s international law obligations. Finally, a new Part III outlines the additional legal analysis, based on the functional review step, required to complete a weapons review of new and emerging technology, including AWS. The functional review step will be trialled over the next 12 to 24 months.
Colonel Damian Copeland is Director, Operations and International Law, in the Australian Defence Force.
This article is republished from Articles of War. Read the original article, which appeared as part of a series on the legal review of autonomous weapon systems. An introductory post by Professors Rain Liivoja and Sean Watts provides an overview of the series.