Published in: Ethics and Information Technology 4/2019

23.07.2018 | Original Paper

Just research into killer robots

Author: Patrick Taylor Smith


Abstract

This paper argues that it is permissible for computer scientists and engineers—working with advanced militaries that are making good faith efforts to follow the laws of war—to engage in the research and development of lethal autonomous weapons systems (LAWS). Research and development into a new weapons system is permissible if and only if the new weapons system can plausibly generate a superior risk profile for all morally relevant classes and it is not intrinsically wrong. The paper then suggests that these conditions are satisfied by at least some potential LAWS development programs. More specifically, since LAWS will lead to greater force protection, warfighters are free to become more risk-acceptant in protecting civilian lives and property. Further, various malicious motivations that lead to war crimes will not apply to LAWS or will apply to no greater extent than with human warfighters. Finally, intrinsic objections—such as the claims that LAWS violate human dignity or that they create ‘responsibility gaps’—are rejected on the basis that they rely upon implausibly idealized and atomized understandings of human decision-making in combat.


Footnotes
1
Devices that decide whether and when to explode, in the broadest possible sense, have been around since the nineteenth century in the form of naval mines, while homing missiles and proximity tracking explosives have existed since WWII. Before that, military use of trained animals—who have their own will and attack on their own volition in some cases—is almost as old as warfare itself. For a discussion of dogs in particular, see https://thestrategybridge.org/the-bridge/2017/12/9/autonomous-weapons-mans-best-friend.
 
3
See http://www.stopkillerrobots.org/ as well as a large scale petition signed by many information technology experts demanding a ban on LAWS research (https://futureoflife.org/open-letter-autonomous-weapons/).
 
4
The Campaign to Stop Killer Robots only calls for a ban on research into ‘fully’ autonomous weapons systems, which excludes the Phalanx and other systems already in widespread use.
 
5
For this reason, this paper will not be concerned with scenarios that involve robot uprisings as described in movies like The Terminator or books like Robopocalypse. Such scenarios are implausible and, even if we thought such possibilities were realistic, it is unlikely that a ban on researching LAWS would be relevant to preventing the ‘x-risk’ of human extinction or enslavement by robot overlords. First, AI research will continue outside the military context and, second, no one is arguing for bans on technology—military or otherwise—that can be controlled through a computer network and used to kill humans. So, the ban will neither prevent the rise of a true AI nor will it prevent that AI from having access to weapons.
 
7
If one judged that one’s military was not engaged in a good faith effort to fight just wars and fight them justly, then presumably any research program would be deeply suspect.
 
8
Consider the worry that LAWS will be hacked, as in the 2017 Open Letter (https://futureoflife.org/autonomous-weapons-open-letter-2017/). It is surely true that LAWS will be vulnerable to hacking. But human soldiers are vulnerable to similar issues. Undercover militants joining the police are a significant problem in counter-insurgency warfare, as are betrayals due to ideological sympathy, blackmail, or bribery. Setting intrinsic objections to the side temporarily, the question is not, “Can LAWS be hacked?” but “Will replacing or supplementing this particular job, post, or position with LAWS increase the risk of harmful behavior, given equivalent ideal and non-ideal assumptions?”
 
9
Again, if we reached a judgment that a particular advanced military was indifferent to the principle of just war, then that would give us good reasons not to continue LAWS research for that particular military.
 
10
In setting up the issue this way, I am deeply indebted to Simpson and Müller (2016). They argue that we should understand just war principles in terms of fair distributions of risk. My paper is different in at least three ways. It sets out the dynamics of the modern military system in order to more fully describe the appropriate baseline, it uses that baseline to undermine intrinsic objections, and it is about the morality of research rather than the morality of deployment.
 
11
Ibid. For the canonical statement of just war principles, see Walzer (2006). For international humanitarian law, see the Geneva Conventions, especially Article 51 of the First Protocol Relating to the Victims of International Armed Conflicts, which concerns discrimination and proportionality.
 
12
The heroic actions of Hugh Thompson and his helicopter crew at the My Lai Massacre during the Vietnam War are instructive (Angers 2014). Upon observing a massacre of Vietnamese civilians by American soldiers, as ordered by Lieutenant Calley, they saved many innocent people by physically blocking soldiers from approaching parts of the village and then reporting the massacre to superiors. Yet, despite the fact that Lieutenant Calley was acting in direct contravention of his orders, he had no difficulty getting the vast majority of the soldiers under his command to obey. What’s more, it is worth noting that Thompson and his crew were not under Calley’s command.
 
13
My account of the ‘modern military system’ is based on the analysis of Chap. 3 of Biddle (2006). I’ve also benefitted from Pollack (2002), especially Chap. 1. For a good illustration of these concepts in military practice: see Department of the Army (1998).
 
14
Swinton (1986) is a commonly used illustration of these principles in military academies.
 
15
According to the Iraqi Body Count, only 13% of civilian casualties in Iraq could be attributed to coalition direct fire; the majority of casualties were the result of indirect combat between insurgents and coalition forces or committed by insurgent forces alone: https://www.theguardian.com/news/datablog/2012/jan/03/iraq-body-count-report-data.
 
16
A veteran and student in my “Ethics of War and Peace” course suggested this case based on his experience in Fallujah.
 
17
There is considerable controversy as to why force protection is morally important. Walzer (2006) argues that force protection is only relevant due to military necessity: dead soldiers cannot achieve the military objective of winning a just war. Others (Kasher and Yadlin 2005), however, have argued that soldiers need not take on risks to themselves to spare civilian lives. I remain agnostic; my paper only requires that military behavior is driven by force protection and that this is at least sometimes justified.
 
18
Singer (2009) makes the point that LAWS can be more ‘conservative’ than human warfighters. My view moves beyond his in several ways. First, it bases the claim on an understanding of the modern military system. Second, it offers the normative basis for the civilian-risk-acceptant behavior of human warfighters. Third, it describes the knock-on dynamics of this advantage for LAWS. Fourth, it uses these insights to engage with intrinsic objections as well. Fifth, Singer does not describe the issue in terms of the fair distribution of risk and so does not offer adequate normative guidance on the question.
 
19
This article aptly summarizes the available evidence that drones are more discriminate than other tactics: http://www.slate.com/articles/news_and_politics/foreigners/2015/04/u_s_drone_strikes_civilian_casualties_would_be_much_higher_without_them.html. A few other studies purport to show that drone attacks are not more accurate, but they often rely on problematic comparisons. For example, some studies show that fighter-bomber attacks on ISIS military positions kill fewer civilians per sortie than drone assaults in Pakistan or Yemen. But this is not an apples-to-apples comparison since ISIS is engaged in conventional, full spectrum operations with operational lines and distinct warfighters. Attacks in Yemen and Pakistan are aimed at insurgents who deliberately hide amongst civilians. Conventional tactics by the Pakistani military in these areas generate civilian casualties that are orders of magnitude higher than drone attacks.
 
20
As reported by Edward Porter Alexander (1907).
 
21
For example, neither Larson (1996) nor Gartner and Segura (1998) assert a straightforward relationship between casualties and support for a war. First, war support always erodes over time but it can be revived through victories even in the face of high casualties; the way the war is fought and perceived plays a key role. Second, casualties cannot explain the level of support for war even in those cases where casualties can explain a reduction in war support.
 
22
On the (possibly problematic) difference between responsibility as attributability and responsibility as accountability, see Watson (1996) and Smith (2012).
 
23
Asaro, Johnson, and Axinn do not seem to apply these arguments to self-driving cars, which also involve AI systems making life or death decisions.
 
24
Streetlights and self-driving cars are not, however, designed to take life even if they make choices that take it. Perhaps these human dignity concerns only apply to killings that are done by autonomous systems that are intentionally created to take life. Notice, however, that we are no longer talking about particular actions or decisions made by some autonomous system but are focusing on the telos or purpose of the system. Yet, what constitutes the purpose of a particular technology is notoriously difficult to pin down. We can, of course, describe the streetlight’s purpose as ‘directing traffic’ and a LAWS’ purpose as ‘killing’ but we could also describe the streetlight’s purpose as “the provision of transportation-related goods at an acceptable level of traffic-related deaths” and LAWS as “the provision of physical security goods through deterrence at an acceptable level of casualties.” This view would also have the perverse implication that commercial autonomous systems that were misused in order to kill would be subject to lesser moral constraints than weapon systems that could be deliberately designed to minimize civilian casualties. At any rate, much would need to be done to make this objection work. The relevant level of description would need to be set, an account of how to precisely determine the function of a weapon system would need to be created, and one would need to show why the function of a system is morally relevant. All of these steps are fraught with difficulty, and this strategy seems to be ultimately contrary to contemporary developments in warfare. In a world where insurgents increasingly use civilian technologies and the line between warfare and law enforcement is increasingly blurred, it seems like we need to develop concepts that do not assume a sharp rupture between military systems and everything else.
 
References

Alexander, E. P. (1907). Military memoirs of a Confederate. New York: Skyhorse Publishing.

Angers, T. (2014). The forgotten hero of My Lai. New York: Acadian Publishing.

Asaro, P. (2012). On banning autonomous weapon systems: Human rights, automation, and the dehumanization of lethal decision-making. International Review of the Red Cross, 94, 687–709.

Barstow, A. (2000). War’s dirty little secret: Rape, prostitution, and crimes against women. Pilgrim Press.

Biddle, S. (2006). Military power: Explaining victory and defeat in modern battle. Princeton: Princeton University Press.

Boland, R. (2007). Developing reasoning robots for today and tomorrow. Signal, 61, 43.

Brooks, F. Jr. (1987). No silver bullet: Essence and accidents of software engineering. IEEE Computer, 20, 10–19.

Chenoweth, E., & Stephan, M. J. (2011). Why civil resistance works. New York: Columbia University Press.

Department of the Army. (1998). Field manual 71-1: Tank and mechanized infantry company team.

Gartner, S., & Segura, G. (1998). War, casualties, and public opinion. The Journal of Conflict Resolution, 42(3), 278–300.

Horowitz, M., & Scharre, P. (2015). Meaningful human control in weapon systems: A primer. Center for a New American Security Working Paper (Project Ethical Autonomy).

Human Rights Watch. (2016, December). Making the case: The dangers of killer robots and the need for a preemptive ban.

International Committee of the Red Cross. Autonomous weapon systems: Implications of increasing autonomy in the critical functions of weapons (ref. 4283-ebook).

Johnson, A. M., & Axinn, S. (2013). The morality of autonomous robots. Journal of Military Ethics, 12(2), 129–141.

Kasher, A., & Yadlin, A. (2005). Assassination and preventive killing. SAIS Review of International Affairs, 25, 41–57.

Larson, E. (1996). Casualties and consensus: The historical role of casualties in support of U.S. military operations. Washington, D.C.: RAND Corporation.

Mackinnon, C. (1994). Rape, genocide, and women’s rights. Harvard Women’s Law Journal, 17(5), 5.

Mill, J. S. (2008). Principles of political economy and chapters on socialism (J. Riley, Ed.). Oxford: Oxford University Press.

Office of the Department of Defense. (2005). Development and utilization of robotics and unmanned ground vehicles.

Pollack, K. (2002). Arabs at war: Military effectiveness 1948–1991. Lincoln: University of Nebraska Press.

Rousseva, V. (2004). Rape and sexual assault in Chechnya. Culture, Society, and Praxis, 3(1), 64–67.

Schwartz, S. (1994). Rape as a weapon of war in former Yugoslavia. Hastings Women’s Law Journal, 5(1), 69–74.

Sikkink, K. (2011). The justice cascade. New York: W. W. Norton and Company.

Simpson, T. W., & Müller, V. C. (2016). Just war and robots’ killings. The Philosophical Quarterly, 66, 302–322.

Singer, P. W. (2009). Wired for war: The robotics revolution and conflict in the 21st century. Penguin Books.

Smith, A. M. (2012). Attributability, answerability, and accountability: In defense of a unified account. Ethics, 122, 575–589.

Sparrow, R. (2007). Killer robots. Journal of Applied Philosophy, 24(1), 62–77.

Swinton, E. D. (1986). The defense of Duffer’s Drift. Wayne, NJ: Avery Publishing Group.

Walzer, M. (2006). Just and unjust wars (4th ed.). New York: Basic Books.

Watson, G. (1996). Two faces of responsibility. Philosophical Topics, 24, 227–248.
Metadata
Title
Just research into killer robots
Author
Patrick Taylor Smith
Publication date
23.07.2018
Publisher
Springer Netherlands
Published in
Ethics and Information Technology / Issue 4/2019
Print ISSN: 1388-1957
Electronic ISSN: 1572-8439
DOI
https://doi.org/10.1007/s10676-018-9472-6
