University of New South Wales Law Journal Student Series

Kehl, Jacqueline --- "Lethal Autonomous Weapons In Armed Conflicts: Recommendation Of Development Or Justification Of Prohibition?" [2020] UNSWLawJlStuS 20; (2020) UNSWLJ Student Series No 20-20


LETHAL AUTONOMOUS WEAPONS IN ARMED CONFLICTS: RECOMMENDATION OF DEVELOPMENT OR JUSTIFICATION OF PROHIBITION?

JACQUELINE KEHL

I INTRODUCTION

“Artificial intelligence is poised to play an increasing role in military systems. There is an urgent opportunity and necessity for citizens, policymakers, and leaders to distinguish between acceptable and unacceptable uses of AI.”[1]

The technology of the 21st century has created a new kind of warfare. The modern conduct of war is characterised by intelligent weapon technologies that can replace human soldiers. Over the past decade, militaries have been deploying robots for intelligence or monitoring purposes in risky or tedious situations.[2] Although these smart technologies are already sophisticated, their development is still in its infancy. In view of the technical opportunities offered by robots and artificial intelligence, a worldwide debate has arisen. The focus of the debate is on the potential application of lethal autonomous weapons (‘LAWs’) on the battlefield.

Lethal autonomous weapons can be defined as weapon systems that are capable of selecting and engaging targets without further intervention by a human operator after they have been activated.[3] Following this definition, a distinction must be drawn between so-called ‘Human-on-the-Loop Weapons’ and ‘Human-out-of-the-Loop Weapons’.[4] The first term describes a weapon that can select targets and deliver force under the oversight of a human operator who can override its actions if necessary. The second term comprises weapons that independently select targets and deliver force without any human input or interaction. Both types of weapons are considered fully autonomous weapons.[5] Fully autonomous weapons can further be subdivided into three categories: munitions, platforms and operational systems.[6] The following analysis focuses on platforms. Unmanned aerial vehicles (‘UAVs’), also referred to as drones, fall within this category.
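Schematically, this taxonomy can be captured in a small data model. The following Python sketch is purely illustrative; the class and member names are invented for exposition and are not drawn from the cited definitions:

```python
from dataclasses import dataclass
from enum import Enum, auto

class AutonomyLevel(Enum):
    """Degree of human involvement in the targeting decision."""
    HUMAN_IN_THE_LOOP = auto()      # a human selects and engages targets
    HUMAN_ON_THE_LOOP = auto()      # the system acts under human oversight with override
    HUMAN_OUT_OF_THE_LOOP = auto()  # the system acts without any human input

class WeaponCategory(Enum):
    """Subdivision of fully autonomous weapons discussed in the text."""
    MUNITION = auto()
    PLATFORM = auto()               # e.g. unmanned aerial vehicles (drones)
    OPERATIONAL_SYSTEM = auto()

@dataclass
class WeaponSystem:
    name: str
    autonomy: AutonomyLevel
    category: WeaponCategory

    @property
    def fully_autonomous(self) -> bool:
        # Both on-the-loop and out-of-the-loop systems count as
        # fully autonomous under the definition used above.
        return self.autonomy in (
            AutonomyLevel.HUMAN_ON_THE_LOOP,
            AutonomyLevel.HUMAN_OUT_OF_THE_LOOP,
        )
```

On this model, a remotely piloted drone (human in the loop) falls outside the definition, while a supervised sentry system (human on the loop) falls within it.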

Although no fully autonomous weapon yet exists at the platform level, their precursors are already in circulation. Currently, over ninety militaries and non-state actors possess UAVs, and almost a dozen of these have armed drones at their disposal.[7] Well-known examples are the United States’ MQ-9 Reaper and China’s CH-4 drone. The United States has already deployed its drones during missions in Iraq and Syria. Additionally, several military powers such as the US, China and Russia are actively pursuing novel LAW technologies.[8] It is only a matter of time until technology reaches a level of maturity that facilitates the deployment of autonomous weapons in military missions.

The key characteristic of LAWs is that they can independently decide what or whom to engage.[9] Consequently, if deployed on a battlefield, they will be making autonomous decisions about life and death. This raises a host of new and extremely difficult ethical and legal questions. It is not surprising that autonomous weapons, often designated as ‘killer robots’, have many detractors. Among these detractors are several non-governmental organisations such as Human Rights Watch, individual governments and political leaders, artificial intelligence experts and large sections of the global population. Most of them are calling for an absolute and preventive ban on fully autonomous weapons, including their development, trade and deployment.[10] But is this claim justified?

This paper examines whether an absolute prohibition of fully autonomous weapons is reasonable and legally defensible. To this end, the potential opportunities and threats that autonomous weapons present for humane warfare will be illustrated. The focus will be on the legal status of LAWs within the existing framework of international humanitarian law (‘IHL’) and on selected legal issues.

II OPPORTUNITIES AND THREATS TO HUMANE WARFARE

The most apparent advantage of the deployment of LAWs is that they would minimise casualties among a state’s own troops.[11] Autonomous weapon systems can operate independently without exposing human soldiers to the direct risk of enemy fire. Seen from the opposite perspective, the existence of self-governing weapons could lower the threshold for military engagement. The risk of casualties among a state’s own personnel constitutes the major restraint on state leaders in their decision to commit troops.[12] Although this might be a reasonable concern, it is clearly not reserved for autonomous weapons. The same objection could be made in respect of remotely piloted UAVs and high-altitude bombing, which have already been utilised.[13] Additionally, it must be noted that the legitimate resort to force under international law constitutes the ultima ratio, regardless of the types of weapons deployed, and that decisions to commit troops depend on numerous other factors such as the domestic political environment.

Another argument in favour of LAWs on the battlefield is that they might be more effective and even more ethical than human soldiers.[14] Unlike human beings, such weapons are not subject to stress, fatigue or limited cognitive abilities that could negatively impact military operations. Furthermore, self-directed machines are not driven by emotions such as rage, revenge or mere pugnacity, which have frequently caused massacres and atrocities.[15] In this respect, there is a real chance that LAWs could enhance compliance with the law of armed conflict. The contrary position is held mostly by human rights scholars who fear that the dehumanisation of targets would violate the inviolable right to human dignity.[16] Since all human life has an intrinsic value, decisions over life and death in armed conflicts certainly require compassion and intuition, which a robot cannot supply.[17] In view of the inhumane and degrading conduct of human soldiers, particularly during the Second World War, this objection to autonomous weapons is, however, hardly convincing.

In respect of the civilian population in areas of conflict, the deployment of LAWs could further contribute to more humane warfare in accordance with the rules of war. Autonomous weapon systems could reduce risks to civilians through more precise targeting and more controlled firing decisions.[18] Within this context, it must be noted that current technology does not enable such weapons to reliably discriminate between combatants and civilians.[19] Whether autonomous weapons will ever be capable of sufficiently differentiating between combatants and non-combatants is controversial. This objection raises the key question of whether LAWs are compatible with the fundamental principles of international humanitarian law and therefore deserves closer analysis.

III AUTONOMOUS WEAPONS AND THE LAW OF ARMED CONFLICT

Critics of fully autonomous weapon systems claim that such weapons are inconsistent with the law of armed conflict.[20] This belief rests primarily on certain fundamental rules of international humanitarian law: the principles of distinction and proportionality and their accompanying principle of military necessity. These principles are considered customary international law[21] and constitute the origin of all substantive rules of IHL.

To analyse this proposition, self-directed weapons must be examined in respect of Article 35(2) and Article 51(4) of Additional Protocol I to the Geneva Conventions (‘AP I’)[22] and the law of targeting. Both articles of AP I constitute ‘basic rules’ for means and methods of warfare, regulating the legality of weapons per se. The law of targeting, primarily anchored in AP I, governs the circumstances in which weapons can be utilised legitimately.

A Compliance with the ‘Basic Rules’

Under Article 35(2) AP I, a weapon system is considered unlawful if it is indiscriminate by its nature.[23] Landmines that detonate regardless of their victim’s status are the best-known example. If a fully autonomous weapon system were supplied with sufficiently reliable parameters and were able to strike specific targets such as a human soldier, it would not infringe the ‘indiscriminate by nature’ rule.[24] If this prerequisite were met, Article 35(2) AP I would not prohibit a weapon system solely on account of its autonomy.

Furthermore, a lawful weapon system may not be ‘of a nature’ to cause ‘unnecessary suffering or superfluous injury’.[25] This provision, which aims to protect combatants from needless or inhumane suffering,[26] sets a high threshold. To be rendered illegal, a weapon must significantly increase the suffering inflicted without increasing the military advantage gained.[27] The mere causation of severe or horrendous injury would not fulfil this requirement. If autonomous weapons were capable of shooting with accuracy, as intended, they would regularly not cause superfluous injury or immense suffering pursuant to Article 35(2) AP I.

Ultimately, a weapon system itself can be considered unlawful under Article 51(4)(c) AP I if its harmful effects cannot be limited.[28] This provision was originally intended to address bacteriological means of warfare that could cause the uncontrollable spread of diseases.[29] Although opponents might consider autonomous weapon systems uncontrollable, their effects are not unlimited within the meaning of Art. 51(4)(c) AP I.[30] While several rules of IHL prevent the utilisation of weapons in circumstances that might have uncontrollable effects, the threshold for rendering a weapon illegal per se is high.

In conclusion, the feature of autonomy does not violate the basic rules set out in the Additional Protocol. LAWs cannot be considered unlawful solely on grounds of their independence.

B Compliance with the Law of Targeting

Notwithstanding the above, the issue at hand is not whether LAWs are ‘good or bad’ in themselves. Rather, it must be further examined under which circumstances their deployment can be considered lawful. Given their central relevance, the analysis will be limited to the basic principles of distinction, proportionality and precautions in attack.

1 The Principle of Distinction and Discrimination

As indicated earlier, one of the major concerns regarding LAWs is the fundamental requirement of distinction between combatants and civilians under Article 48 AP I.[31] Critics claim that an autonomous weapon is not capable of making such a determination if combatants are not identifiable by physical markings.[32] This would contradict the prohibition of indiscriminate attacks pursuant to Article 51(4)(b) AP I.[33] Thinking further ahead, terrorists or insurgents could possibly trick the machine by concealing weapons or by exploiting its sensory and behavioural limitations.[34]

In this light, it must be admitted that the legitimate use of LAWs depends on their stage of technological development, which varies among weapon systems. In addition, the context and environment in which the weapon is deployed must be considered. For instance, combat between two autonomous drones above a deserted area or the ocean would not conflict with the principle of distinction, since no civilians would be involved. The deployment of LAWs in this context would not violate the laws of war per se.

By contrast, if these drones were deployed in an urban area, a different standard would apply. Conducting attacks with LAWs in an environment where civilians could be involved requires a high standard of technology and strict regulation. An autonomous weapon must have the capacity to make judgment calls equal to or even better than those of human beings. One attempt to approximate this level of capacity might be the utilisation of sensors gathering combatant data equivalent to or beyond that of humans.[35] Whether such high technical standards could ever be accomplished is difficult to predict.
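The legal logic underlying this high standard can be illustrated as a simple decision gate: absent near-certainty that a target is a combatant, the presumption of civilian status in cases of doubt (cf. Article 50(1) AP I) must prevail and the system must withhold force. The following Python sketch is purely illustrative; the threshold value and all names are assumptions made for exposition, not a proposed implementation:

```python
from dataclasses import dataclass

@dataclass
class TargetAssessment:
    """Hypothetical output of a sensor/classifier pipeline."""
    combatant_probability: float  # confidence in [0, 1] that the target is a combatant

# Illustrative threshold only: near-certainty is required before force may be used.
COMBATANT_THRESHOLD = 0.99

def may_engage(assessment: TargetAssessment) -> bool:
    """Conservative distinction gate: in case of doubt, the target is
    presumed to be a civilian and engagement is refused."""
    return assessment.combatant_probability >= COMBATANT_THRESHOLD
```

The point of the sketch is the asymmetric default: any uncertainty resolves against engagement, mirroring the protective presumption of IHL rather than a mere balance of probabilities.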

2 The Principles of Proportionality and Necessity

The principle of proportionality in attack under Article 51(5)(b) AP I[36] can be described as an inherently subjective determination that demands evaluation on a case-by-case basis.[37] Given the infinite number of possible scenarios, it appears unlikely that an autonomous weapon could ever be pre-programmed to cope with all situations. An ad hoc interpretation of the circumstances might be even more complicated. The response to this objection mirrors the one raised in relation to the principle of distinction: in situations where civilians are not present, weighing military advantage against civilian harm is simply not required.

Notwithstanding the above, even if only military troops were present, LAWs would still face obstacles. The principle of military necessity, found in the 1907 Hague Regulations[38] and the Rome Statute of the International Criminal Court,[39] also requires a subjective analysis of a given situation. Identifying whether an enemy soldier has become hors de combat demands not only an appropriate but also a compassionate judgment. Whether such ‘humane’ judgments could ever be made by a robot remains to be seen.

3 The Precautionary Principle

Ultimately, to comply with the rules of targeting, the deployment of LAWs must meet the precautionary principle anchored in Article 57 AP I.[40] Compliance with the rule of precautions in attack falls upon commanders as they plan how to deploy their combat resources in an operation.[41] In this context, autonomous warfare would not differ from conventional warfare.[42] Self-governing drones might operate independently, but they must be activated and sent to a predetermined destination. The legal analysis would be conducted by human decision-makers who decide whether to use an autonomous weapon in a specific situation. Whether the legal requirements are satisfied will not depend solely on the machine’s own programming and technical capabilities, but on human judgments as well.[43] Visions, evoked by critics, of a war or military mission conducted and observed exclusively by artificial intelligence[44] are unrealistic and misleading. Objections regarding this principle in relation to the foreseeable future are thus unfounded.

Contrary to the basic rules, compliance with the law of targeting is indeed a difficult task. When it comes to complex assessments in conflict areas that involve human beings, whether soldier or civilian, the objections of critics are justified. It is indisputable that a lethal autonomous weapon would violate fundamental principles of IHL if deployed in a mission without an appropriate and assured standard of technology. However, where these basic principles are not at stake, little stands in their way. And if technical progress were to create autonomous weapons capable of complex, moral and ‘humane’ reasoning, what else would stand in their way?

IV AUTONOMOUS WEAPONS AND THE QUESTION OF ACCOUNTABILITY

It is a fundamental condition of belligerence that someone may be held morally and legally accountable for misconduct and crimes committed in a war.[45] Particularly in respect of war crimes, the gravest breaches of the law of armed conflict, prosecution is indispensable. The question arising out of this requirement constitutes one of the major arguments in favour of an absolute prohibition of LAWs: who should be held accountable for the mishap of a fully autonomous weapon that has caused harm or a serious violation of the law of armed conflict?

A Individual Responsibility

Since the aftermath of the Second World War, legal accountability for violations of IHL has predominantly been imposed upon individuals. Regarding self-directed weapons, three parties could be considered. One approach is to attribute a weapon’s failure to the programmer of its system. This might only be legitimate if the mishap verifiably occurred as a result of negligence on the part of the programmer.[46] The same would apply to the human operator activating the robot. An operator acting ‘on the loop’ has the obligation to ensure that the system performs in a moral and appropriate way. However, if the machine acts in an unforeseeable and incomprehensible way, the operator could not bear the responsibility.[47]

Ultimately, the responsible party could be the commander who ordered the activation of the LAW. Commander accountability would create a strong incentive for commanders to deploy self-directed weapons only when they have a high degree of confidence in their situational appropriateness.[48] Nevertheless, command responsibility would only apply if a commander was aware in advance of the potential for mishaps and still recklessly deployed a fully autonomous weapon.[49] Beyond that, a commander might not be able to identify a threat prior to deployment because he or she had not programmed the robot.[50]

In relation to war crimes, primarily anchored in the Rome Statute, a further obstacle arises. Under both international law and most domestic legal regimes, war crimes must be committed ‘wilfully’.[51] In the absence of intentional human action, serious violations of the law of armed conflict will not occur. The only imaginable scenarios might be the reckless deployment of an autonomous weapon system or its utilisation with the direct intention of committing a war crime.

The narrow range of scenarios in which the law would apply is unsatisfying. Beyond that range, it is inequitable to hold persons accountable for the actions of machines which they could not sufficiently control. In most cases, the deployment of autonomous weapon systems would create a ‘responsibility gap’.[52] This speaks in favour of opponents arguing for an absolute ban on LAWs.

B State Responsibility

Given the dominant role of individual criminal liability in the law of armed conflict, the responsibility of states often fades into the background. It has long been established in both treaty law and customary international law that states are accountable for infringements and war crimes committed by soldiers of their armed forces.[53] The law of state responsibility indicates that a state may owe an international legal obligation to individuals, another state, or the international community in its entirety.[54] A state that is responsible for the violation of a legal obligation must ‘make full reparation for the injury caused by the internationally wrongful act’.[55] This responsibility is recognised as existing in conjunction with individual criminal responsibility.[56] Jurisdiction over the law of state responsibility for international crimes lies with the International Court of Justice (ICJ) and regional human rights courts.[57]

In the case concerning the Aerial Incident of 3 July 1988, Iran filed a lawsuit before the ICJ against the United States for shooting down an Iranian airliner over the Persian Gulf.[58] After roughly six years of negotiation, the United States agreed to pay compensation of US$131.8 million, including US$61.8 million in aid of the victims’ families.[59] It is noteworthy that, albeit resulting from human failure, the downing of Flight 655 involved an autonomous weapon system operating in a semi-autonomous mode.[60]

In light of the above, it becomes clear that the case of fully autonomous weapons should be no different. Just as states are held accountable for crimes or ‘accidents’ of their troops, they are responsible for the actions of their autonomous weapon systems as part of their military force. If the conduct of an autonomous weapon is attributable to a state, that state is then obligated to make full reparation for the injury caused by the internationally wrongful act.[61] In conclusion, holding states accountable for the actions of their LAWs requires only a clarification of the applicability of existing law.[62]

C The ‘War Torts’ Regime

The applicability of the law of state responsibility in armed conflicts could be clarified through the establishment of a ‘war torts’ regime.[63] This regime would function similarly to regular tort actions at the national level. Instead of tort actions brought for domestically wrongful acts, the war torts framework would be tailored to internationally wrongful acts. The establishment of such a regulatory framework would ensure that victims of states’ internationally wrongful actions can receive compensation for their injuries.[64] Under a war crimes regime alone, this would not occur. In addition, tort law aims to minimise accidents and deter others from engaging in similar activities, whereas criminal law contemplates moral culpability.[65]

Since state responsibility for breaches of IHL and the right to reparations have been widely neglected in practice,[66] such a framework is desirable. A war torts regime would establish a general right of individual reparation for violations of IHL which, despite its urgent need, has not yet been constituted.[67]

Beyond the aforesaid benefits, the existence of an effective war torts regime would fill the accountability gap arising from the deployment of fully autonomous weapon systems. If international criminal law does not apply due to the absence of a wilfully acting individual, states could be held responsible for war torts. Accordingly, the objection raised against LAWs in terms of legal consequences can be rebutted.

In order to establish a functioning war torts regime, an adequate liability standard must be determined. Given the characteristics of LAWs, a standard of strict liability might be most appropriate. Fully autonomous weapons carry an inherent and significant risk.[68] Deployed autonomous systems are therefore potentially more dangerous than semi-autonomous and non-autonomous weapons. In addition, this standard would be best suited to address the difficulty of tracing the causal chain of injuries.[69] Ultimately, the application of strict liability could curtail excessive utilisation of LAWs and thereby reduce the overall number of disasters.[70]

V CONCLUSION AND REGULATORY APPROACHES

The examination of certain fundamental principles and legal consequences of IHL has illustrated that lethal autonomous weapons are not inherently unlawful or unethical. Critics of self-directed weapons have raised reasonable concerns in relation to their inherent risks, particularly in respect of compliance with the rules of targeting. In view of the examined rules, none of these objections can legally justify an absolute ban on LAWs. They do, however, illustrate the importance of regulating this novel and unsought element of warfare.

For this purpose, an international regulatory framework governing the use of LAWs in armed conflicts along the lines of the Convention on Certain Conventional Weapons[71] should be considered.[72] Such an instrument would clarify that autonomous systems are regulated by the existing rules of warfare and that they are subject to the customary law requirement of legal review of weapons anchored in Article 36 AP I.[73]

A regulatory framework tailored to autonomous weapons should include certain mandatory requirements, such as agreed programming standards and an adequate targeting system.[74] Furthermore, a built-in self-neutralisation mechanism and a human override capability requiring a ‘human on the loop’ for LAWs deployed on a battlefield involving combatants should be considered.[75] In light of the current state of technology and the high risks involved, a prohibition on the utilisation of fully autonomous weapons in urban areas where civilians are present is reasonable. This proscription should also be extended to cases of doubt. Ultimately, the allocation of accountability, including state responsibility, should be stipulated.
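The interplay of the two proposed safeguards, supervisory override and self-neutralisation, can be sketched schematically. The following Python sketch is purely illustrative; the class, method and timeout names are assumptions made for exposition and do not describe any existing system:

```python
import time

LINK_TIMEOUT_SECONDS = 5.0  # illustrative value for detecting a lost supervisory link

class SupervisedWeapon:
    """Sketch of a 'human-on-the-loop' control loop with a fail-safe default."""

    def __init__(self) -> None:
        self.last_heartbeat = time.monotonic()
        self.neutralised = False

    def heartbeat(self) -> None:
        # Called whenever the human supervisor's station is reachable.
        self.last_heartbeat = time.monotonic()

    def self_neutralise(self) -> None:
        # Built-in fallback: disarm and abort rather than continue autonomously.
        self.neutralised = True

    def propose_engagement(self, human_approves: bool) -> bool:
        # If the supervisory link is lost, fail safe instead of acting alone.
        if time.monotonic() - self.last_heartbeat > LINK_TIMEOUT_SECONDS:
            self.self_neutralise()
            return False
        # Human on the loop: the supervisor's veto is always decisive.
        return human_approves and not self.neutralised
```

The design choice embodied here is that silence never authorises force: a lost link or an absent approval both resolve to non-engagement, which is the regulatory intent behind the override requirement.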

The establishment of an international treaty of this magnitude, shaped by continuous technical progress and divergent opinions among states, is indeed a difficult and protracted task. Notwithstanding this tremendous challenge, a blanket ban on autonomous weapons would forgo certain moral advantages that appropriate LAWs could entail.[76] In addition, the implementation of an absolute prohibition carries considerable risks. States that would not support or comply with an absolute proscription could be encouraged to develop and trade their weapons underground. In the field of warfare, non-transparency is a great threat. Consequently, states should be urged to develop self-directed weapons in an open and transparent manner pursuant to the rules of war.

BIBLIOGRAPHY

Articles and books

Air Force Judge Advocate General’s Department, Air Force Operations and the Law: A Guide for Air and Space Forces (International and Operations Law Division, 2002)

Crawford, Emily and Pert, Alison, International Humanitarian Law (Cambridge University Press, 2015)

Crootof, Rebecca, ‘War Torts: Accountability for Autonomous Weapons’ (2016) 164(6) University of Pennsylvania Law Review, 1347 – 1402

Fuzaylova, Elizabeth, ‘War Torts, Autonomous Weapon Systems and Liability: Why a Limited Strict Liability System Should Be Implemented’ (2019) 40(3) Cardozo Law Review, 1327 – 1366

Hamilton, Rebecca J, ‘State-Enabled Crimes’ (2016) 41(2) Yale Journal of International Law, 302 – 346

Henckaerts, Jean-Marie and Doswald-Beck, Louise (eds), International Committee of the Red Cross: Customary International Humanitarian Law (Cambridge University Press, 2005)

Horowitz, Michael C, ‘The Ethics & Morality of Robotic Warfare: Assessing the Debate over Autonomous Weapons’ (2016) 145(4) Daedalus, 25 – 36

Krishnan, Armin, Killer Robots: Legality and Ethicality of Autonomous Weapons (Taylor & Francis Ltd, 17 July 2009)

Matthias, Andreas, ‘The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata’ (2004) 6(3) Ethics and Information Technology, 175 – 183

Sauer, Frank and Schörnig, Niklas, ‘Killer drones: The ‘silver bullet’ of democratic warfare?’ (2012) 43(4) Security Dialogue, 363 – 380

Solis, Gary D, The Law of Armed Conflict (Cambridge University Press, February 2010)

Sparrow, Robert, ‘Killer Robots’ (2007) 24(1) Journal of Applied Philosophy, 62 – 77

Umbrello, Steven, Torres, Phil and De Bellis, Angelo F, ‘The future of war: could lethal autonomous weapons make conflict more ethical?’ (2020) 35(1) AI & Society, 273 – 282

Vöneky, Silja, ‘Implementation and Enforcement of International Humanitarian Law’ in Dieter Fleck (ed) The Handbook of International Humanitarian Law (Oxford University Press, 3rd ed, 2013)

Commentaries

International Committee of the Red Cross, Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949 (Martinus Nijhoff Publishers, 1987)

Legislative materials

Hague Convention (IV) Respecting the Laws and Customs of War on Land (1907), opened for signature 18 October 1907 (entered into force 26 January 1910)

Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of International Armed Conflicts (Protocol I) (1977), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978)

Convention on Certain Conventional Weapons (1980), opened for signature 10 October 1980, 1342 UNTS 137 (entered into force 2 December 1983)

Rome Statute of the International Criminal Court (1998), opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002)

Articles on Responsibility of States for Internationally Wrongful Acts (2001), adopted November 2001, Supplement No. 10 (A/56/10)

Jurisdictional materials

Aerial Incident of 3 July 1988 (Islamic Republic of Iran v United States of America) (Application instituting Proceedings) (International Court of Justice, General List No 79, 17 May 1989)

Aerial Incident of 3 July 1988 (Islamic Republic of Iran v United States of America) (Settlement Agreement) (International Court of Justice, 9 February 1996)

Armed Activities on the Territory of the Congo (New Application: 2002) (Democratic Republic of the Congo v Rwanda) (Judgment) [2006] ICJ Rep 6

‘Memorial submitted by the Islamic Republic of Iran’, Aerial Incident of 3 July 1988 (Islamic Republic of Iran v United States of America) (International Court of Justice, 24 July 1990)

Prosecutor v. Blaškić (Trial Judgement) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber I, Case No. IT-95-14-T, 3 March 2000)

Research papers and reports

Anderson, Kenneth and Waxman, Matthew C, ‘Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can’ (Research Paper No. 2013-11, College of Law, American University Washington, 2013) 1 – 31

Anderson, Kenneth, Reisner, Daniel and Waxman, Matthew C, ‘Adapting the Law of Armed Conflict to Autonomous Weapon Systems’ (Research Paper No. 2014-50, College of Law, American University Washington, 4 September 2014) 386 – 411

Heyns, Christof, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions (UN Doc A/HRC/23/47, 9 April 2013) 1 – 22

Human Rights Watch, Losing Humanity - The Case against Killer Robots (Report, November 2012) 1 – 49

Others

‘The Threat of Fully Autonomous Weapons’ Campaign to Stop Killer Robots (Website) https://www.stopkillerrobots.org/learn/

‘Lethal Autonomous Weapons Pledge’ Future of Life Institute (Website) https://futureoflife.org/lethal-autonomous-weapons-pledge/

Wareham, Mary, ‘It’s Time For a Binding, Absolute Ban on Fully Autonomous Weapons’ Human Rights Watch (Website, 9 November 2017) https://www.hrw.org/news/2017/11/09/its-time-binding-absolute-ban-fully-autonomous-weapons

ICRC, ‘Responsibility for violations of International Humanitarian Law’, IHL Database (Website) https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule149

‘Autonome Drohnen – die besseren Waffen?’ [Autonomous drones – the better weapons?], Netzpolitik [Network Policy] (Website, 10 October 2017) https://netzpolitik.org/2017/autonome-drohnen-die-besseren-waffen/#

US Department of Defense, ‘Autonomy in Weapon Systems’ (Directive No 3000.09, 21 November 2012)


[1] ‘Lethal Autonomous Weapons Pledge’, Future of Life Institute (Website) https://futureoflife.org/lethal-autonomous-weapons-pledge/.

[2] cf. ‘Autonome Drohnen – die besseren Waffen?’ [Autonomous drones – the better weapons?], Netzpolitik [Network Policy] (Website, 10 October 2017) https://netzpolitik.org/2017/autonome-drohnen-die-besseren-waffen/#.

[3] US Department of Defense, ‘Autonomy in Weapon Systems’ (Directive No 3000.09, 21 November 2012) 13 f.

[4] Human Rights Watch, Losing Humanity - The Case against Killer Robots (Report, November 2012) 2.

[5] Ibid.

[6] Michael C Horowitz, ‘The Ethics & Morality of Robotic Warfare: Assessing the Debate over Autonomous Weapons’ (2016) 145(4) Daedalus 25, 27.

[7] Horowitz, 28.

[8] Steven Umbrello, Phil Torres and Angelo F De Bellis, ‘The future of war: could lethal autonomous weapons make conflict more ethical?’ (2020) 35(1) AI & Society 273.

[9] Frank Sauer and Niklas Schörnig, ‘Killer drones: The ‘silver bullet’ of democratic warfare?’ (2012) 43(4) Security Dialogue 363, 374.

[10] cf. Mary Wareham, ‘It’s Time for a Binding, Absolute Ban on Fully Autonomous Weapons’, Human Rights Watch (Website, 9 November 2017) https://www.hrw.org/news/2017/11/09/its-time-binding-absolute-ban-fully-autonomous-weapons.

[11] cf. Sauer and Schörnig, 372-373.

[12] cf. Ibid.

[13] Kenneth Anderson and Matthew C Waxman, ‘Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can’ (Research Paper No. 2013-11, College of Law, American University Washington, 2013) 18.

[14] cf. Horowitz, 29.

[15] cf. e.g. Armed Activities on the Territory of the Congo (New Application: 2002) (Democratic Republic of the Congo v Rwanda) (Judgment) [2006] ICJ Rep 6.

[16] cf. Ibid, 32; Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions (UN Doc A/HRC/23/47, 9 April 2013) paras 55 f.

[17] Heyns, para 55.

[18] Kenneth Anderson, Daniel Reisner and Matthew C Waxman, ‘Adapting the Law of Armed Conflict To Autonomous Weapon Systems’ (Research Paper No. 2014-50, College of Law, American University Washington, 4 September 2014) 393.

[19] Umbrello, Torres and De Bellis, 274.

[20] cf. ‘The Threat of Fully Autonomous Weapons’, Campaign to Stop Killer Robots (Website) https://www.stopkillerrobots.org/learn/; HRW report, 30 f.

[21] cf. Jean-Marie Henckarters and Louise Doswald-Beck (eds), International Committee of the Red Cross: Customary International Humanitarian Law (Cambridge University Press, 2005).

[22] Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of International Armed Conflicts (Protocol I) (1977), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978).

[23] AP I, Art. 35(2).

[24] cf. Anderson, Reisner and Waxman, 400.

[25] AP I, Art. 35(2).

[26] Gary D Solis, The Law of Armed Conflict (Cambridge University Press, February 2010) 270.

[27] Ibid.

[28] AP I, Art. 51(4)(c).

[29] ICRC, Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949 (Martinus Nijhoff Publishers, 1987) para 1965.

[30] cf. Anderson, Reisner and Waxman, 400.

[31] AP I, Art. 48.

[32] cf. HRW report, 31.

[33] AP I, Art. 51(4)(b).

[34] cf. Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Taylor & Francis Ltd, 17 July 2009) 99.

[35] Umbrello, Torres and De Bellis, 278.

[36] AP I, Art. 51(5)(b).

[37] Air Force Judge Advocate General’s Department, Air Force Operations and the Law: A Guide for Air and Space Forces (International and Operations Law Division, 2002) 27.

[38] Hague Convention (IV) Respecting the Laws and Customs of War on Land (1907), opened for signature 18 October 1907 (entered into force 26 January 1910) Art. 23(g).

[39] Rome Statute of the International Criminal Court (1998), opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002) Art. 8.

[40] AP I, Art. 57.

[41] Anderson, Reisner and Waxman, 404.

[42] cf. Ibid, 405.

[43] cf. Ibid.

[44] cf. Campaign to Stop Killer Robots.

[45] cf. Robert Sparrow, ‘Killer Robots’ (2007) 24(1) Journal of Applied Philosophy 62, 67.

[46] cf. Sparrow, 69 f.

[47] cf. Ibid.

[48] Horowitz, 31.

[49] HRW report, 43.

[50] Ibid.

[51] Prosecutor v. Blaškić (Trial Judgement) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber I, Case No. IT-95-14-T, 3 March 2000).

[52] Andreas Matthias, ‘The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata’ (2004) 6(3) Ethics and Information Technology 175, 183.

[53] cf. ICRC, ‘Responsibility for violations of International Humanitarian Law’, IHL Database (Website) https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_rul_rule149, Rule 149.

[54] Rebecca Crootof, ‘War Torts: Accountability for Autonomous Weapons’ (2016) 164(6) University of Pennsylvania Law Review 1347, 1355 f.

[55] Articles on Responsibility of States for Internationally Wrongful Acts (2001), adopted November 2001, Supplement No. 10 (A/56/10), Art. 31(1).

[56] First Geneva Convention, Art. 51.

[57] cf. Rebecca J Hamilton, ‘State-Enabled Crimes’ (2016) 41(2) Yale Journal of International Law 302, 313.

[58] Aerial Incident of 3 July 1988 (Islamic Republic of Iran v United States of America) (Application instituting Proceedings) (International Court of Justice, General List No 79, 17 May 1989) 4.

[59] Aerial Incident of 3 July 1988 (Islamic Republic of Iran v United States of America) (Settlement Agreement) (International Court of Justice, 9 February 1996) 649.

[60] ‘Memorial submitted by the Islamic Republic of Iran’, Aerial Incident of 3 July 1988 (Islamic Republic of Iran v United States of America) (International Court of Justice, 24 July 1990) 42 f.

[61] Articles on Responsibility of States for Internationally Wrongful Acts, Art. 31.

[62] cf. Crootof, 1391.

[63] cf. Ibid, 1388.

[64] Ibid.

[65] Ibid, 1387.

[66] Silja Vöneky, ‘Implementation and Enforcement of International Humanitarian Law’ in Dieter Fleck (ed) The Handbook of International Humanitarian Law (Oxford University Press, 3rd ed, 2013) 648, 683.

[67] cf. Emily Crawford and Alison Pert, International Humanitarian Law (Cambridge University Press, 2015) 253.

[68] Elizabeth Fuzaylova, ‘War Torts, Autonomous Weapon Systems and Liability: Why a Limited Strict Liability System Should Be Implemented’ (2019) 40(3) Cardozo Law Review 1327, 1360.

[69] cf. Crootof, 1395.

[70] cf. Ibid.

[71] Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects (1980), opened for signature 10 October 1980, 1342 UNTS 137 (entered into force 2 December 1983).

[72] cf. Anderson, Reisner and Waxman, 407.

[73] AP I, Art. 36.

[74] Umbrello, Torres and De Bellis, 278.

[75] Anderson, Reisner and Waxman, 407.

[76] cf. Umbrello, Torres and De Bellis, 279.

