
Journal of Law, Information and Science


Wagner, Markus --- "Taking Humans Out of the Loop: Implications for International Humanitarian Law" [2012] JlLawInfoSci 9; (2012) 21(2) Journal of Law, Information and Science 155


Taking Humans Out of the Loop: Implications for International Humanitarian Law

COMMENT BY MARKUS WAGNER[*]

1 Introduction

In their 2008 article entitled The Laws of Man over Vehicles Unmanned: The Legal Response to Robotic Revolution on Sea, Land and Air, Brendan Gogarty and Meredith Hagger point to a wide range of challenges posed by the introduction of unmanned vehicles (UVs).[1] Their multi-faceted analysis of these challenges provides a blueprint for future debate in this area.

Technological advances have allowed armed conflicts to take place over ever greater distances. While early combat took place face to face, inventions made it possible to develop weapon systems that allowed for increasing separation from the actual location of combat. Early inventions permitted only relatively small distances — think bow and arrow — while the introduction of, eg, black powder increased that distance considerably, and inventions such as airplanes, rockets and missiles have driven this development even further. Yet the vast majority of today’s weapon systems continue to be characterised by one common element: human input remains — by and large — the common denominator in some form or another. In the case of rockets the firing decision is still in the hands of humans, while pilots not only make decisions over where to fly and what route to take, but also over whether, and if so, which weapons to deploy.

Keeping humans in the loop has not changed even with the onset of UVs. The current generation of UVs is remotely operated, sometimes from a close distance, sometimes over long distances. And while the use of fully autonomous weapons is still a decade or more away,[2] there has been considerable discussion as to when this goal will be reached. Until a few years ago, it was commonplace for defence officials to consider retaining humans in the loop an essential component of warfare even in the future.[3] In 2009, however, a US Department of Defense (DoD) report predicted that the technological challenges surrounding fully autonomous systems would be overcome by the middle of the century.[4]

Technological development has been particularly rapid in the case of unmanned aerial vehicles (UAVs), and has been accompanied by a vigorous public debate.[5] Focusing largely on the legality of targeted killing,[6] this debate has also brought to light the increasing extent to which UAVs have been used in prosecuting armed conflict in Afghanistan and Pakistan, as well as Iraq. And while the numbers are — as such — inconclusive, the trend is unambiguous. Figures released by the Air Force indicate that Predators and Reapers deployed 219 missiles and bombs in Afghanistan in 2009, compared to 183 in 2008 and 74 in 2007.[7] Congressional hearings confirm the large number of missions carried out by remotely controlled aircraft: at any given moment ‘40 Predator-series aircraft are airborne worldwide’, while the annual number of hours flown by various Air Force UAVs more than tripled between 2006 and 2009, by which time it stood at 295,000 hours per year.[8] Fiscally, there has been a similarly marked increase. In 2010, the US Department of Defense allocated approximately US$5.4 billion to the development, procurement and operation of UAVs.[9] This figure has risen markedly: it stood at US$165 million in 1990 and US$667 million in 2001, after which investment rose considerably.[10]

This comment first addresses the difference between current weapon systems and the next generation of truly autonomous weaponry (Part 2), then provides an overview of the applicable rules of armed conflict (Part 3), before offering some concluding remarks (Part 4).

2 The Next Generation of Weapon Systems: True Robotics

As is the case in many areas of the law, technological advances generally outpace the generation of rules governing particular social phenomena. International humanitarian law is no exception in this regard. From the very beginning, weapons and methods of warfare have challenged fundamental assumptions, eg the extent to which weapons causing excessive injuries may be banned.[11] Similar arguments can be made about targeted killing by UAVs that are not part of the traditional military arsenal, but that are commandeered by non-military governmental institutions.[12]

The prospective introduction of autonomous weapon systems poses a similar challenge in this regard. One of the central questions is whether technological advances in robotics are such that they threaten to render the existing framework of international humanitarian law inadequate.[13] Before conducting this analysis, however, it is important to briefly outline what distinguishes the current generation of machinery, such as aerial drones, from future generations of autonomous weapon systems.

Attempts to produce remotely operated weapons date back to the end of the 19th century, when Nikola Tesla tested a remote-controlled weapon.[14] Further attempts were made in World War I and World War II, some operated by wire and some by radio.[15] Development continued throughout the 20th century, but UAVs did not gain prominence until shortly before the millennium.[16]

While it is not possible to describe the debate about the use of UAVs in great detail here, their usage appears uncontroversial as long as a person remains in the loop. Notwithstanding the debate over whether or not the amount of information relayed by remotely-operated drones leads to better targeting decisions,[17] the use of such weapon systems appears generally unproblematic under international humanitarian law. This remains presumptively the case even in scenarios of more advanced autonomy, where an operator no longer actively manages detection and targeting but must instead actively intervene in order to stop an attack. Such situations are not characterised by full autonomy, as an operator remains in the loop. Arguably, however, the control that an operator exercises in these situations is far less detailed than is the case today: instead of actively operating a UV, the operator manages UVs through oversight, intervening only when necessary.

Future systems are predicted to be able to function fully autonomously, which differs considerably from the current generation of remotely-operated vehicles. Autonomy in this context can thus be understood as an unmanned system that prosecutes an attack based on code enabling an independent (ie not pre-determined) decision-making process without direct human input. This includes detection and targeting as well as the firing decision,[18] wholly independent of immediate human intervention. These characteristics differentiate autonomous systems not only from today’s remotely-commanded systems, but also from weapons that have been pre-programmed to follow a certain flight path and attack one or more targets without making independent decisions.
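
To make this taxonomy concrete, the following sketch (in Python, with purely illustrative names and decision labels that are not drawn from any actual system) captures the spectrum just described: remote operation, supervised autonomy in which the operator retains only a veto over an attack, and full autonomy in which no decision requires direct human input.

```python
from enum import Enum, auto

class ControlMode(Enum):
    """Illustrative control modes along the autonomy spectrum described above."""
    REMOTE_OPERATION = auto()  # operator actively navigates and fires (current UVs)
    SUPERVISED = auto()        # system detects and targets; operator may only veto
    FULL_AUTONOMY = auto()     # detection, targeting and firing without human input

def requires_human_input(mode: ControlMode, decision: str) -> bool:
    """Which decisions still need a human under each mode.

    'decision' is one of 'navigate', 'target' or 'fire' (hypothetical labels).
    """
    if mode is ControlMode.REMOTE_OPERATION:
        return True                    # every decision is made by the operator
    if mode is ControlMode.SUPERVISED:
        return decision == "fire"      # operator oversees and can stop an attack
    return False                       # full autonomy: no direct human input
```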

3 Autonomy and International Humanitarian Law: The Need to Ensure Correct Quantitative and Qualitative Assessments

Based on the foregoing, this section will sketch some of the legal problems that a move towards a higher degree of autonomy will bring about. This brief comment will deal only cursorily with two cornerstones of international humanitarian law: distinction and proportionality.[19] Other legal challenges that autonomous weapon systems pose, such as individual criminal responsibility, state responsibility and compliance with the testing requirement in Article 36 of Additional Protocol I, quite apart from ethical and political challenges, will have to be dealt with in other forums.

At the outset, it should be borne in mind that the requirements for autonomous weapon systems are considerable. Such systems must not only be able to assess quantitatively whether a particular human being or object is a military target; once the decision to engage a target has been made, UVs must also be able to determine qualitatively whether the requirements of precaution have been met. Good arguments can be made that the former is possible; there are, however, strong indications that qualitative assessments (or a combination of qualitative and quantitative assessments) are — at least as of now and technically speaking — difficult if not impossible for computers to perform.[20]

International humanitarian law is characterised by a constant tension between two competing elements: military necessity on the one hand, and the requirement to carry out combat in a humane fashion on the other.[21] As essential elements influencing this tension, the interpretations of both the principle of distinction and the principle of proportionality shape the outcome of any legal analysis. Needless to say, there is considerable disagreement over the degree to which humane behaviour in combat may trump military necessity or vice versa.[22]

3.1 Principle of distinction

In a simplified form, the principle of distinction — laid down in general terms in Article 48 of Additional Protocol I — requires that an attacker distinguish between combatants and civilians and, in the case of objects, between those that possess a military quality and those that are of a civilian nature.[23] Difficulties arise, for obvious reasons, where a target can be classified as both civilian in nature and possessing a military purpose. Classic examples include infrastructure such as bridges by which an army could be supplied. More problematic still is the targeting of objects that are useful not only for military purposes, but that also have fundamental value for the civilian population.

Subsequent rules in Additional Protocol I refine this general concept, namely by prohibiting the targeting of individual civilians[24] (unless they take a direct part in hostilities[25]) as well as historic monuments, works of art or places of worship.[26] Furthermore, Additional Protocol I contains prohibitions against certain types of attacks that have an indirect effect on the civilian population. Specifically, it forbids attacks targeting objects ‘indispensable to the survival of the civilian population’, the natural environment and ‘installations containing dangerous forces’.[27] Moreover, certain methods of attack are also prohibited, namely those that are indiscriminate.[28] This means that, in addition to properly distinguishing between legitimate targets and those that are not, the principle of distinction requires that an attack be carried out with weapons capable of prosecuting the attack in a discriminate manner. A pilot left only with a large piece of ordnance whose kill radius would not allow for a distinction between combatants and civilians could therefore not attack a target, because the use of that particular weapon would not satisfy the rule of distinction.

UVs will have to be able to distinguish between civilian and military targets. As pointed out above, while the textual basis for the distinction appears clear, realities on the ground often leave it ambiguous whether a target is legitimate. In the case of UVs this means that the underlying software would have to be able to determine whether a particular target is civilian or military in nature.[29] Moreover, the UV would have to be programmed to take account of the requirement that, in cases of uncertainty, it abort the attack.[30] A number of weapons today are capable of determining a target’s military nature based on pre-programmed characteristics such as shape and dimensions.[31] Once a sufficient number of the target’s characteristics have been reconciled with the pre-programmed version, the weapon system can initiate an attack. This type of matching is mechanical and based on quantitative data, and even if one were to argue that there is still an unacceptable amount of ambiguity, recent advances suggest that such systems will be able to function with the required accuracy in the near future.[32]
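
A minimal sketch of the quantitative matching described in the preceding paragraph might look as follows; the signature fields, tolerances and threshold are invented for illustration, and real systems would rely on far richer sensor data. The point is that the logic is mechanical: observed characteristics are reconciled against a pre-programmed reference, and doubt defaults to aborting the attack, mirroring the presumption in cases of uncertainty noted above.

```python
from dataclasses import dataclass

@dataclass
class Signature:
    """Hypothetical pre-programmed target signature (shape and dimensions)."""
    length_m: float
    width_m: float
    has_turret: bool

def match_score(observed: Signature, reference: Signature) -> float:
    """Toy quantitative match: the fraction of characteristics reconciled."""
    checks = [
        abs(observed.length_m - reference.length_m) < 0.5,  # tolerance in metres
        abs(observed.width_m - reference.width_m) < 0.5,
        observed.has_turret == reference.has_turret,
    ]
    return sum(checks) / len(checks)

def engagement_decision(observed: Signature, reference: Signature,
                        threshold: float = 0.9) -> str:
    """Engage only when enough characteristics match; otherwise abort."""
    if match_score(observed, reference) >= threshold:
        return "engage"
    return "abort"  # uncertainty defaults to no attack
```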

3.2 Principle of proportionality

More problematic for present purposes is the principle of proportionality. Laid out in various provisions throughout Additional Protocol I, proportionality requires that damage to civilian objects not be ‘excessive in relation to the concrete and direct military advantage anticipated’.[33] These provisions — which do not use the term proportionality — are designed to protect the civilian population, yet their application is made more difficult by the use of the term ‘excessive’. This choice of wording is a result of the tension mentioned above between the competing interests during armed conflict: gaining military advantage while protecting the civilian population.[34] The tension was pointed out in a 2000 report to the International Criminal Tribunal for the Former Yugoslavia (ICTY) Prosecutor, which addressed the difficulty of applying the principle of proportionality and observed that ‘[o]ne cannot easily assess the value of innocent human lives as opposed to capturing a particular military objective’.[35]

It is thus impossible to find bright-line rules that determine a priori what is permissible and what is prohibited.[36] In order to minimise the legal exposure of commanding officers, Additional Protocol I Article 57(2) refers to certain precautions that must be taken, again using the same terminology, ie ‘excessive’. A legal evaluation will therefore have to take place on a case-by-case basis.[37] Given these difficulties, a further question arises: does a singular set of proportionality assessments actually exist which could be programmed? The answer is obviously negative. A military commander may arrive at different conclusions in different situations, and would almost certainly differ in that assessment from a human rights lawyer.

It is readily apparent that this type of analysis is no longer open to purely quantitative assessment; rather, it requires the evaluation of a variety of data as well as the weighing of each aspect’s relative importance in the specific circumstances that exist when an attack is about to be launched. In the context of designing UVs, this would require addressing at least the following areas. With respect to target selection, the program would have to be designed to anticipate all potential decisions in an abstract manner. It would have to be able to determine how many civilian casualties would be acceptable under the circumstances at the time.[38] It would also have to be able to determine which type of weapon could be used under which circumstances, eg whether it is permissible to fire a high-impact weapon despite the presence of civilians because of the military advantage that could be gained by doing so. And since UVs would be fully autonomous, these systems would have to be able to react to changing circumstances: while the use of a particularly large weapon may have been permissible at one point, circumstances may change so as to make the use of that weapon illegal.[39]
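
The difficulty can be illustrated with a deliberately incomplete sketch (the function name and inputs are hypothetical). Any attempt to encode the ‘excessive’ standard presupposes a fixed common scale between civilian harm and military advantage; as the ICTY report quoted above suggests, no such scale exists, so the decisive step cannot be written down as a rule in advance.

```python
def proportionality_check(expected_civilian_harm: float,
                          anticipated_military_advantage: float) -> bool:
    """Sketch of the Article 51(5)(b) test: is the expected civilian harm
    'excessive' in relation to the concrete and direct military advantage
    anticipated?

    The comparison presupposes a common metric between incommensurable
    values; the assessment is qualitative, relational and case-by-case.
    """
    raise NotImplementedError(
        "No bright-line exchange rate between civilian harm and military "
        "advantage can be pre-programmed; a context-dependent judgment is "
        "required for each attack."
    )
```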

The inability of software to confront these challenges — leaving aside the ability to prosecute humans for transgressions, which may act as an additional deterrent — may render the use of UVs almost useless except in the narrowest of circumstances.[40] Even if one were to accept that certain elements of the principle of distinction are amenable to quantitative assessment, this is clearly not the case for the principle of proportionality, with its requirement of highly relational and context-dependent assessments.

One could argue that the move towards greater degrees of autonomy requires a new legal framework.[41] Arguments to this effect have been made, based on the view that current international humanitarian law is inadequate to deal with a decreased level of human participation in military operations.[42] While it is true that international humanitarian law has been rooted in an anthropocentric paradigm, armed conflict has moved considerably away from its origins: weapons have been designed to reach their targets over ever longer distances and new categories of weapons have been invented, yet the most fundamental principles of international humanitarian law have remained in place. This is why others argue that the existing framework provides the necessary rules for autonomous weapon systems.[43] There is a danger that, by moving away from the legal principles that have governed armed conflict to date, the principles underlying current international humanitarian law will become more diffuse, or that the lack of consensus over the actual meaning of new legal rules will allow for exploitation — at least for some time, and possibly not only at the margins — particularly since early movers can have a disproportionate impact.

4 Conclusion

UVs will continue to proliferate in the future. Their technical utility is simply too great, and there are many tasks that humans simply cannot carry out in which UVs can undoubtedly take on an important role, such as rescue and fact-finding missions after natural or man-made disasters. The recent nuclear disaster in Fukushima is but one example.[44] This being the case, it is all the more important that their introduction be accompanied by a legal structure designed to take on the challenges that this development inevitably poses. Gogarty and Hagger have posited that the proliferation of UVs in all areas of life will create pressure to form a regulatory framework into which UVs fall.[45]

This is true not only for the civilian realm, but equally — and potentially even more so — for the military use of genuine robotic technology. After all, it is in military use that the impact of autonomous systems on human life and liberty is, with the exception of only a handful of examples in the non-military world, at its strongest. This brief comment has addressed only two areas in which autonomous weapon systems confront potential legal limitations.

Arguments can be made that these challenges can be overcome with respect to the principle of distinction, which requires that combat be carried out without directly targeting the civilian population. Advanced technologies can already, or will in the future be able to, distinguish between civilian and military targets because of the largely quantitative nature of this assessment, although, as mentioned above, not all such assessments are necessarily quantitative. The same cannot be said of the principle of proportionality. That principle is, by its relational nature, not amenable to quantification; it requires instead a qualitative weighing of various factors. And while great strides have been made in advancing the former kind of assessment, the same is not true of the latter. This means that the use of autonomous weapon systems is legally indefensible. It is, moreover, highly problematic both ethically and politically.


[*] Associate Professor of Law, University of Miami School of Law. I would like to thank the editors for inviting me as a commentator in this special issue of the Journal of Law, Information and Science.

[1] Brendan Gogarty and Meredith Hagger, ‘The Laws of Man over Vehicles Unmanned: The Legal Response to Robotic Revolution on Sea, Land and Air’ (2008) 19 Journal of Law, Information and Science 73.

[2] Elizabeth Quintana, The Ethics and Legal Implications of Military Unmanned Vehicles (2008) Royal United Services Institute for Defence and Security Studies, 5 <http://www.rusi.org/downloads/assets/RUSI_ethics.pdf>. A recent White Paper designated only guard, support and medical duties as being feasible for unmanned systems (UMS) at this point and precluded any combat-related tasks. See Army Capabilities Integration Center – Tank-Automotive Research and Development Engineering Center, Robotics Strategy White Paper (2009) 28-33 <http://futurefastforward.com/images/stories/military/RoboticsStrategyWhitePaper_19Mar09.pdf>.

[3] See Department of Defense, Unmanned Systems Safety Guide for DoD Acquisition (1st ed (Version .96), 27 June 2007) <https://acc.dau.mil/adl/en-US/269574/file/41532/Unmanned_Guide_DOD_Acq_2007.pdf>; P W Singer, Wired for War: The Robotics Revolution and Conflict in the Twenty-first Century (Penguin Press, 2009) 123-124.

[4] United States Air Force, United States Air Force Unmanned Aircraft Systems Flight Plan 2009-2047 (18 May 2009) <http://www.aviationweek.com/media/pdf/UnmannedHorizons/17312080-United-States-Air-Force-Unmanned-Aircraft-Systems-Flight-Plan-20092047-Unclassified.pdf>.

[5] See John Markoff, ‘War Machines: Recruiting Robots for Combat’, New York Times (New York), 28 November 2010, A1.

[6] Nils Melzer, Targeted Killing in International Law (Oxford University Press, 2008); David Kretzmer, ‘Targeted Killing of Suspected Terrorists: Extra-Judicial Executions or Legitimate Means of Defence?’ (2005) 16 European Journal of International Law 171; Orna Ben-Naftali and Keren R Michaeli, ‘”We Must Not Make a Scarecrow of the Law”: A Legal Analysis of the Israeli Policy of Targeted Killings’ (2003) 36 Cornell International Law Journal 233. For a philosophical inquiry arguing that targeted killing cannot be objected to if one accepts large-scale killing in war, see Daniel Statman, ‘Targeted Killing’ (2004) 5 Theoretical Inquiries in Law 179. For a similar view from a legal and policy perspective, see Kenneth Anderson, ‘Targeted Killing in US Counterterrorism Strategy and Law’ (Working Paper of the Series on Counterterrorism and American Statutory Law, Brookings Institution, Georgetown University Law Center and the Hoover Institution, 11 May 2009) <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1415070>. For an analysis under US constitutional law, see Richard Murphy and Afsheen John Radsan, ‘Due Process and Targeted Killing of Terrorists’ (2009-2010) 31 Cardozo Law Review 405.

[7] Christopher Drew, ‘Drones Are Playing a Growing Role in Afghanistan’ The New York Times (New York) 19 February 2010, A6. See also Gogarty and Hagger, above n 1, 85 et seq.

[8] See eg report during a Congressional hearing by Michael S Fagan, Rise of the Drones: Unmanned Systems and the Future of War, US House of Representatives, Committee on Oversight and Government Reform, Subcommittee on National Security and Foreign Affairs, 111th Congress, 2nd Sess., 49 (23 March 2010).

[9] See John Keller, ‘Unmanned Vehicle Spending in the 2010 DOD Budget to Reach $5.4 Billion’ (28 May 2009) Military and Aerospace Electronics <http://www.militaryaerospace.com/index/display/article-display/363553/articles/military-aerospace-electronics/executive-watch/unmanned-vehicle-spending-in-the-2010-dod-budget-to-reach-54-billion.html>. The number is a combination of various line items from the fiscal year 2010 budget. See National Defense Authorization Act for Fiscal Year 2010, PL 111-84, 123 Stat. 2190 (2009). More recent numbers are available at Department of Defense, Office of the Under Secretary of Defense (Comptroller) <http://comptroller.defense.gov/Budget2012.html>.

[10] Harlan Geer and Christopher Bolkcom, Unmanned Aerial Vehicles: Background and Issues for Congress (21 November 2005) Congressional Research Service, 11 <http://www.fas.org/irp/crs/RL31872.pdf>.

[11] Declaration Renouncing the Use, in Time of War, of Explosive Projectiles Under 400 Grammes Weight, Saint Petersburg, 29 November (11 December) 1868.

[12] See Philip Alston, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Addendum – Study on Targeted Killings, UN Doc A/HRC/14/24/Add.6; Mary Ellen O’Connell, ‘Unlawful Killing with Combat Drones: A Case Study of Pakistan, 2004-2009’ (Research Paper No 09-43, Notre Dame Law School Legal Studies, 2009) in Simon Bronitt (ed), Shooting to Kill: The Law Governing Lethal Force in Context (forthcoming) 1 <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1501144>.

[13] For a closer analysis, see Part 3, Autonomy and International Humanitarian Law, below.

[14] Mark Edward Peterson, ‘The UAV and the Current and Future Regulatory Construct for Integration into the National Airspace System’ (2006) 71 Journal of Air Law and Commerce 521, 537.

[15] Nick T Spark, ‘Unmanned Precision Weapons Aren’t New’ (2005) 131 US Naval Institute Proceedings 66.

[16] For a brief overview, see Peterson, above n 14, 535 et seq.

[17] Jack M Beard, ‘Law and War in the Virtual Era’ (2009) 103 American Journal of International Law 409.

[18] Where several weapons are at its disposal, an autonomous system may also be programmed to choose among them.

[19] An overwhelming majority of countries have ratified Additional Protocol I to the Geneva Conventions (Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978) (‘Additional Protocol I’)). Both principles form part of customary international law. Prosecutor v Kupreskic et al (Trial Judgement) (International Criminal Tribunal for the former Yugoslavia (ICTY) Case No IT-95-16-T, 14 January 2000) [524]. See also William H Taft, ‘The Law of Armed Conflict after 9/11: Some Salient Features’ (2003) 28 Yale Journal of International Law 319, 323; Michael J Matheson, ‘Session One: The United States Position on the Relation of Customary International Law to the 1977 Protocols Additional to the 1949 Geneva Conventions’ (1987) 2 American University Journal of International Law and Policy 419, 426.

[20] Tony Gillespie and Robin West, ‘Requirements for Autonomous Unmanned Air Systems set by Legal Issues’ (2010) 4(2) The International C2 Journal 1, 4. Whether, as the authors claim, there ‘will always need to be human intervention’ is far from clear (at 23).

[21] Michael N Schmitt, ‘Military Necessity and Humanity in International Humanitarian Law: Preserving the Delicate Balance’ [2010] VirgJlIntLaw 15; (2010) 50 Virginia Journal of International Law 795, 795.

[22] Ibid; Theodor Meron, ‘The Martens Clause, Principles of Humanity, and Dictates of Public Conscience’ (2000) 94 American Journal of International Law 78.

[23] Additional Protocol I, Article 48 states:

‘In order to ensure respect for and protection of the civilian population and civilian objects, the Parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly shall direct their operations only against military objectives.’

[24] Ibid Article 51(2).

[25] Ibid Article 51(3).

[26] Ibid Article 53.

[27] See ibid, Articles 54, 55 and 56, respectively.

[28] Ibid Article 51(4) states:

‘Indiscriminate attacks are prohibited. Indiscriminate attacks are:

(a) those which are not directed at a specific military objective;

(b) those which employ a method or means of combat which cannot be directed at a specific military objective; or

(c) those which employ a method or means of combat the effects of which cannot be limited as required by this Protocol.’

[29] One interesting proposal is to mandate that UVs target not humans but only weapon systems. See John S Canning, A Concept for the Operation of Armed Autonomous Systems on the Battlefield (2006) 3rd Annual Disruptive Technology Conference <http://www.dtic.mil/ndia/2006disruptive_tech/canning.pdf>. While this may minimise the danger somewhat, it is unclear how it would alleviate the problem of, for example, someone carrying a rifle for safety reasons or for hunting purposes.

[30] With respect to civilians, see Additional Protocol I Article 50(1), with respect to civilian objects, see Additional Protocol I Article 52(3).

[31] Robert Sparrow, ‘Killer Robots’ (2007) 24 Journal of Applied Philosophy 62, 63. More recently, see Michael Lewis et al, ‘Scaling Up Wide-Area-Search Munition Teams’ (May-June 2009) 24 IEEE Intelligent Systems 10.

[32] Note however that specifically with respect to Additional Protocol I Article 51(4)(c) there has been considerable controversy since it arguably contains elements of proportionality and thus may not be merely a quantitative assessment. See generally Stefan Oeter, ‘Methods and Means of Combat’, in Dieter Fleck (ed) The Handbook of International Humanitarian Law (Oxford University Press, 2nd ed, 2008) 119, 201 et seq.

[33] See eg Additional Protocol I Articles 51(5)(b) and 57(2)(a)(iii).

[34] This has led some authors to claim that the principle of proportionality is too vague a concept and proportionality would only be implicated when ‘acts have occurred that are tantamount to the direct attack of the civilian population’. W Hays Parks, ‘Air War and the Law of War’ (1990) 32 Air Force Law Review 1, 173; Michael N Schmitt, ‘Targeting and International Humanitarian Law in Afghanistan’ (2009) 39 Israel Yearbook on Human Rights 307, 312. For an opposing view, see Yoram Dinstein, The Conduct of Hostilities Under the Law of International Armed Conflict (Cambridge University Press, 2nd ed, 2010) 120-121. Problems relating to proportionality assessments in the context of targeted killings have been pointed out by Noel Sharkey, ‘Death Strikes From the Sky: The Calculus of Proportionality’ (Spring 2009) IEEE Technology and Society Magazine 17, 19. The idea that the principle of proportionality applies in armed conflict has been affirmed strongly by the Supreme Court of Israel. See HCJ 769/02 Public Committee against Torture in Israel et al v Government of Israel et al, [2006] especially 30-33, <http://elyon1.court.gov.il/Files_ENG/02/690/007/a34/02007690.a34.pdf> .

[35] International Criminal Tribunal for the Former Yugoslavia (ICTY), Final Report to the Prosecutor by the Committee Established to Review the NATO Bombing Campaign Against the Federal Republic of Yugoslavia (8 June 2000) 39 International Legal Materials 1257, [48].

[36] Gary D Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press, 2010) 273.

[37] ICTY, above n 35, [50].

[38] Sharkey, above n 34, 18-19.

[39] For a different view, arguing that the increased precision inherent in emerging UV technology makes attacks more proportionate by reducing the likelihood of collateral casualties, see Andy Myers, ‘The Legal and Moral Challenges Facing the 21st Century Air Commander’ (2007) 10 Royal Air Force Air Power Review 76, 89.

[40] These circumstances, though imaginable, can be expected to occur only infrequently, such as a lone military installation far from civilian objects or a military target similarly far removed from objects that could conflict with the requirements of international humanitarian law. Additional arguments militating against the use of UVs that could not be addressed here are based on ethical and political considerations. For an opposing view regarding the former, see Ronald C Arkin, Governing Lethal Behavior in Autonomous Robots (CRC Press, 2009). For a view from an operational perspective, see Brian Burridge, ‘Post-Modern Warfighting with Unmanned Vehicle Systems: Esoteric Chimera or Essential Capability?’ (2005) 150 Royal United Services Institute Journal 20.

[41] Note that there are no prohibitions under the current framework of international humanitarian law, as the approach has been not to prohibit a particular invention as such but rather its specific use. This point has also been made by Philip Alston, above n 12, [79].

[42] See eg Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Ashgate, 2009).

[43] See eg Sparrow, above n 31; Arkin, above n 40, 72.

[44] Indeed, the lack of UVs in this regard is striking. See eg the debate after the meltdown at the Fukushima nuclear plant: John M Glionna and Yuriko Nagano, ‘Japan’s Technology Not Up to this Task – Despite Its Expertise, the Country Must Go Abroad for Robots to Work in Its Damaged Nuclear Plant’, Los Angeles Times (Los Angeles), 8 May 2011, 3.

[45] Gogarty and Hagger, above n 1, 122.

