Policy options on Autonomous Weapon Systems: An international law perspective
Dr Thompson Chengeta
Recently, states gathered at the United Nations in Geneva to discuss – among other things – ‘possible options for addressing the humanitarian and international security challenges posed by emerging technologies in the area of lethal autonomous weapons systems’, also known as killer robots. While there is no agreed definition, many states and the International Committee of the Red Cross [ICRC] define autonomous weapon systems [AWS] as robotic weapons that have autonomy in the critical functions of selecting, targeting and releasing force against humans. Once AWS are activated, they can decide whom to kill or harm without any further human intervention. AWS raise complex legal, ethical, moral and security issues that are not adequately regulated by existing law.
- Proposed policy options
Twenty-eight (28) states – Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China, Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, El Salvador, Ghana, Guatemala, Holy See, Iraq, Mexico, Morocco, Nicaragua, Pakistan, Peru, State of Palestine, Uganda, Venezuela and Zimbabwe – have explicitly endorsed the option of a legally binding instrument that contains both negative and positive obligations, including the prohibition of certain AWS.
The policy option of a legally binding instrument has been supported by the United Nations Secretary General, Antonio Guterres, who, at the opening of the 2019 Meeting of Group of Governmental Experts on Lethal Autonomous Weapon Systems emphasised that ‘machines with the power and discretion to take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law’.
France and Germany have proposed a political declaration on AWS, while some states have suggested guiding principles or standards as preferred options. These options, unlike a legal instrument, are not binding.
A few states – such as the United Kingdom, the United States and Russia – have argued against any of the above-mentioned options, positing either that existing international humanitarian law (IHL) is sufficient to deal with the challenges posed by AWS or that it is too early for the international community to proffer a specific policy response.
The International Panel on the Regulation of Autonomous Weapon Systems has already articulated the advantages and disadvantages of the above-mentioned policy options and noted that inaction is not an option.
In addition to the fact that a legally binding instrument on AWS is the policy option most supported by states, humanitarian organisations and the general public, there are other compelling – indeed, obligatory – reasons why states should choose a legally binding instrument on AWS.
- Lacuna and AWS
The development and potential deployment of AWS present a lacuna – a legal gap. In international weapons law [IWL], lacunae are not cured by ignoring them, by engaging in creative interpretations of existing law, or by putting in place political declarations/guiding principles that lack the legal force to bridge the specific legal gap.
In the context of disarmament, Canada defined a true legal gap or lacuna as “a situation where the absence of a law or legal norm prevents an inherently illegal situation from being addressed, or where the applicable law is incomplete such that it prevents States Parties from fulfilling their obligations”.
Aside from the general principles of international law and basic rules of IHL – the limitations of which are discussed below – there are no specific legal provisions that address the challenges raised by AWS, thereby potentially undermining state obligations on the use of force.
While defining lacunae as the “absence of something that arguably ought to be there”, Kammerhofer notes that it is generally difficult to prove a lacuna “because one has to prove the absence of norms, not their validity or their existence”. Nevertheless, as will be exemplified below, the legal gap presented by AWS is clear.
To begin with, Morita rightly observes ‘the rapid breakdown of many traditional aspects [of international law] under the impact of technological, economic and political change’, while Simpson indicates that the advent of cutting-edge technologies such as AWS presents a number of situations that are not addressed by existing law.
IHL and IWL rules are indeed applicable to AWS. Nevertheless, the ICRC emphasised at the recent 2019 GGE Meeting on AWS that, while there is no doubt that existing IHL rules such as distinction, proportionality and precaution limit autonomy in weapons to a certain degree, they are inadequate. While existing law offers some guidance, it is insufficient or incomplete on questions relating to the standard of human control, acceptable levels of autonomy and the predictability of AWS.
It is important to remember that the ICRC has, for years, been regarded by states and non-state entities as the guardian of IHL. Ordinarily, the ICRC does not support revisions of or additions to IHL rules unless and until strictly necessary. In the case of AWS, the ICRC has indicated that such necessity exists. It would be to the detriment of humanity to insist on the adequacy of the law when it is not adequate.
I have previously argued that the drafters of IWL and IHL treaties did not anticipate artificial intelligence technologies in which machines or robots make the decisions and legal judgments on the use of force against humans. It is for this reason that I have questioned the adequacy of the legal process of reviewing AWS in terms of Article 36 of Additional Protocol I to the Geneva Conventions.
The drafters of Article 36 of Additional Protocol I to the Geneva Conventions concerned themselves with the review of weapons or capabilities regarded purely as tools in the hands of fighters or combatants. They did not anticipate robo-combatants that take up the human obligation and function of making the decision to use force and the legal judgments regarding the use of such force.
The legal inquiry in terms of Article 36 is whether a weapon can be used by humans in compliance with applicable laws, not whether the weapon or capability can, by itself, make lawful decisions on the use of force and carry out the legal judgments associated with such decisions. That duty has, from time immemorial, been the sacred preserve of humans. In this sense, AWS enter uncharted territory that is not fully addressed by existing law.
The United States has argued that under no circumstances can robots make decisions to use force because they only execute pre-programmed human decisions. I have disagreed with the United States’ argument by discussing what human decision-making means in the context of the use of force and the demands of applicable legal rules. To the same end, the ICRC has highlighted that the decision to use force is made and reviewed throughout the targeting cycle until the final release of force – and even afterwards. As such, it is insufficient to focus only on the planning stage of the attack and on pre-programmed human decisions.
Furthermore, while noting the importance of Article 36 reviews, the ICRC has also stated that ‘they are not a substitute for states working towards internationally agreed limits on autonomy in weapon systems’. In other words, Article 36 reviews cannot cure the lacuna created by AWS. After all, Article 36 demands that new weapons be reviewed in terms of applicable laws, and the applicable law in this instance is inadequate. There is no principled reason why such ‘internationally agreed limits’ should be contained in non-binding instruments.
Tied to the above issue is the legal responsibility gap that is created when unpredictable AWS are used, which I have discussed previously. The legal responsibility of humans for the use of force is premised on the assumption that, in all cases, it is a human person who plans and decides on the final release of force against a particular person or persons. It has always been humans. Yet the advent of AWS and their potential use present an unprecedented scenario that is not adequately addressed by legal provisions on human responsibility in the use of force.
Above all, the ICRC correctly notes that the challenges raised by AWS go ‘beyond questions of the compatibility of autonomous weapon systems with our laws to encompass fundamental questions of acceptability to our values’. Rightly so, ‘in the face of new developments not specifically foreseen or not clearly addressed by existing law, contemporary ethical concerns go beyond what is already codified in the law.’ There is no question that ethical concerns and standards have always guided and influenced states in the making, refining and redefining of IWL and IHL. Yet, until the ethical standards relevant to AWS are codified in a binding legal instrument, they cannot cure the lacuna.
- Limitations of general and basic principles of law in the face of a lacuna
The insufficiency of general and basic principles when dealing with new or novel weapons was made apparent in the International Court of Justice’s Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons. While the ICJ noted the applicability of IHL to nuclear weapons and the timelessness of IHL basic principles, the Court admitted that nuclear weapons – having been invented after the codification of IHL basic principles – presented not only a quantitative but also a qualitative difference from other conventional weapons. The Court noted that the existing law at the time neither ‘contain[ed] any specific prescription authorizing the threat or use of nuclear weapons’ nor ‘any principle or rule of international law which would make the legality of the threat or use of nuclear weapons or of any other weapons dependent on a specific authorization’.
While international courts sometimes fill gaps in international law by applying general principles of law, equity or recourse to analogy (as applied, for example, in the Corfu Channel Case, the Atomic Bomb Trial and the Trail Smelter Case), they may only go so far and are not allowed to create law. Neither international nor domestic courts can fill the legal gaps created by the use of AWS as far as individual responsibility is concerned.
In the Nuclear Weapons case, Judge Vereshchetin noted that courts of law must not create law and that where a ‘court finds a lacuna in the law or finds the law to be imperfect, it ought merely to state this without trying to fill the lacuna or improve the law by way of judicial legislation’. Instead, the Court in the Nuclear Weapons Case emphasised the importance of express prohibitions in international law, noting that ‘the illegality of the use of certain weapons as such does not result from an absence of authorization but, on the contrary, is formulated in terms of prohibition’. This is the reason why, in the circumstances explained above, there ought to be a comprehensive legal instrument on AWS.
- Importance of codification of binding & non-binding standards applicable to AWS
It has been noted that a number of basic norms of IHL and IWL are applicable to AWS. Equally, AWS offend certain ethical standards and other human values that are not contained in existing law. As a result, there is a need for codification, not only of the legal norms applicable to AWS, but also of the relevant ethical standards and human values.
The reason why it is possible at present to refer to provisions of IHL such as those on distinction and proportionality is because states made the right decision to codify such norms in legal instruments. Codification makes the law clear and accessible.
Furthermore, discussions on AWS have shown that there are various interpretations as to how current IHL applies to AWS. In the 2019 GGE Meeting, South Africa pointed to the different interpretations of existing law and its applicability to AWS, hence the need for uniformity in interpretation.
There are attempts by other participants in the AWS discussions to push and stretch the boundaries of the law, engaging in creative interpretation of IHL in order to accommodate weapon systems that are otherwise illegal. Such approaches ought to be vigorously opposed, especially where such interpretations undermine IHL. This underlines the importance of codifying existing norms that specifically address the issue of human control and identify those weapon systems that are illegal. No matter the advances in technology, it is the technology or weapons that should comply with the law and remain within accepted human values and standards, not the other way around.
Codification provides a platform for multilateral discussions and hence for building a uniform interpretation of the law as it applies to AWS. Such a process is preferable to states engaging in unilateral interpretations of the law on this complex issue.
As noted above, specific issues that are not covered by existing law require specific regulation, which can be provided through a codification process. It is indeed state practice to draft and adopt legal instruments or additional protocols addressing specific new problematic weapons. It has never been a successful or sufficient argument to say, instead, ‘let us improve and implement the existing laws’.
- Additional prohibitions can only be contained in a legal instrument
If states agree that there is a need to adopt negative and positive obligations on AWS, it should be noted that such obligations can only be contained in a legally binding instrument. They cannot, in a legal sense, be considered obligations if they are non-binding.
Understanding human control over the use of force as a legal requirement means that certain weapon systems with autonomy in their critical functions need to be prohibited. The ICRC has referred to such weapon systems as illegal per se – unpredictable by design and unlawful by their nature. Legally speaking, prohibitions cannot be contained in non-binding instruments.
- Timelessness of the principle of human control over use of force
In relation to the repeated argument by some states that it is too early to have a legally binding instrument on AWS, there is a wrong supposition that regulation of AWS depends on the future technical abilities of the technology. It does not. The need for regulation of AWS is primarily normative: it is driven by the need to codify norms, standards and human values that cannot be transgressed.
One such norm or principle is the maintenance of human control over the use of force. This is a grand norm whose nature does not depend on the future capabilities of the technology. It is a norm that will never change and ought not to change. As such, it is unconvincing – and perhaps a misdirection – to argue that it is too early to act because we do not know the future capabilities of the technology in question. A general idea of those capabilities is sufficient to inform states of what is unacceptable.
Furthermore, there is general agreement among states and other stakeholders that the crux of the AWS debate is maintaining human control over the use of weapon systems. There is no doubt that maintaining human control is fundamental. That being the case, one is yet to hear a principled argument or reason why this important notion of human control should be contained in a non-binding instrument. Placing this important norm in a non-binding instrument is tantamount to pouring milk into a broken calabash.
- The residual negative principle and AWS
Where it is clear that a lacuna exists, inaction is not a viable option. In international law, there is a residual negative principle that provides that ‘what is not prohibited is legally permitted’. It is along these lines that, in the Fisheries Case, the Court noted that ‘the attitude of governments bears witness to the fact that they did not consider it [the conduct in question] to be contrary to international law’. Likewise, in the Lotus Case, it was noted that restrictions on states’ conduct cannot be presumed since ‘the rules of law binding upon States emanate from their own free will as expressed in conventions or by usages generally accepted as expressing principles of law’. That being the case, it is important that states be more specific in the regulation of AWS or face unacceptable uses of AWS, with some states arguing that there is no law that prohibits them.
In conclusion, it is reiterated that where there is a lacuna, the option that cures it is a legally binding instrument. The adoption of a legally binding instrument on AWS is, after all, squarely within the mandate of the UN CCW.