
2019 | Original Paper | Book Chapter

Autonomous Weapon Systems – Dangers and Need for an International Prohibition

Author: Jürgen Altmann

Published in: KI 2019: Advances in Artificial Intelligence

Publisher: Springer International Publishing


Abstract

Advances in ICT, robotics and sensors bring autonomous weapon systems (AWS) within reach. Shooting without control by a human operator has military advantages, but also disadvantages: human understanding of the situation and control of events would suffer. Beyond this, compliance with the law of armed conflict is in question. Would it be ethical to allow a machine to take a human life? The increased pace of battle may overburden human understanding and decision making and lead to uncontrolled escalation. An international campaign as well as IT, robotics and AI professionals and enterprises are calling for an international ban on AWS. States have discussed limitations in the UN context, but no consensus has evolved so far. Germany has argued for a ban on fully autonomous weapons, but has not joined the countries proposing an AWS ban, and is using a problematic definition.
An international ban could comprise a prohibition of AWS and a requirement that each use of force must be under meaningful human control (with very few exceptions). If remotely controlled uninhabited weapon systems remain allowed, a priori verification that they cannot attack under computer control is virtually impossible. Compliance could be proved after the fact by secure records of all communication and sensor data and of the actions of the human operator.
The AI and robotics communities could make significant contributions in teaching and by engaging the public and decision makers. Specific research projects could be directed, e.g., at dual use, proliferation risks and scenarios of interaction between two fleets of AWS. Because of high military, political and economic interests in AWS, a ban needs support from an alert public as well as from the AI and robotics communities.
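The "secure records" idea above, proving compliance after the fact from stored communication, sensor and operator data, presupposes tamper-evident logging. As a minimal sketch (not part of the chapter; the record fields and function names are invented for illustration), a hash chain makes any later alteration of a stored record detectable:

```python
import hashlib
import json


def append_record(log, record):
    """Append a record, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    log.append({
        "record": record,
        "prev": prev_hash,
        "hash": hashlib.sha256(body.encode()).hexdigest(),
    })
    return log


def verify_log(log):
    """Recompute the whole chain; any modified entry breaks verification."""
    prev = "0" * 64
    for entry in log:
        body = json.dumps({"record": entry["record"], "prev": prev},
                          sort_keys=True)
        if (entry["prev"] != prev or
                entry["hash"] != hashlib.sha256(body.encode()).hexdigest()):
            return False
        prev = entry["hash"]
    return True


log = []
append_record(log, {"t": 1, "operator": "release authorized",
                    "sensor": "target confirmed"})
append_record(log, {"t": 2, "comms": "uplink active"})
print(verify_log(log))   # True: chain intact
log[0]["record"]["operator"] = "no authorization"
print(verify_log(log))   # False: tampering detected
```

A real verification regime would additionally need trusted hardware or third-party notarization so the chain itself cannot be regenerated by the record keeper; the sketch only shows why chained hashes make selective after-the-fact edits evident.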


Footnotes
1
Fratricides by US Patriot missiles in the 2003 war against Iraq [53].
 
2
The US-DoD definition of “semi-autonomous weapon”, “a weapon system that is intended to only engage individual targets or specific target groups that have been selected by a human operator” [5] is more problematic in that it does not specify how targets or target groups are to be selected.
 
3
For a more differentiated autonomy scale with six steps see [55].
 
4
“The transformation of International Protocols and battlefield ethics into machine-usable representations …”, “Mechanisms to ensure that the design of intelligent behaviors only provides responses within rigorously defined ethical boundaries”, “The development of effective perceptual algorithms capable of superior target discrimination capabilities …”, “The creation of techniques to permit the adaptation of an ethical constraint set and underlying behavioral control parameters that will ensure moral performance …”, “A means to make responsibility assignment clear and explicit for all concerned parties …” [41, p. 211f.].
 
5
“The work … is, in fact, merely a suggestion for a computer software system for the ethical governance of robot ‘behaviour’. This is what is known as a ‘back-end system’. Its operation relies entirely on information from systems yet ‘to be developed’ by others sometime in the future. It has no direct access to the real world through sensors or a vision system and it has no means to discriminate between combatant and non-combatant, between a baby and a wounded soldier, or a granny in a wheelchair and a tank. It has no inference engine and certainly cannot negotiate the types of common sense reasoning and battlefield awareness necessary for discrimination or proportionality decisions. There is neither a method for interpreting how the precepts of the laws of war apply in particular contexts nor is there any method for resolving the ambiguities of conflicting laws in novel situations.” [50].
 
6
Note that the CFE Treaty in its preamble calls for “establishing a secure and stable balance of conventional forces at lower levels” and for “eliminating disparities detrimental to stability and security”. [46] Unfortunately the Treaty is no longer operating with respect to Russia.
 
7
Similar unpredictable, but probably escalatory, interactions can be foreseen if offensive cyber operations were conducted under automatic/autonomous/AI control. Combined with AWS operations, the problems could intensify each other.
 
8
The author was one of the founders. In the meantime the number of members has grown to 33 [52].
 
9
The full name is “Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects”. This framework convention was concluded in 1980 and has five specific protocols, the most relevant in the present context being Protocol IV that prohibits blinding laser weapons [45]. There are 125 member states, including practically all states with relevant militaries [51].
 
10
What this can mean in detail is explained in [54].
 
11
The Swiss Federal Office for Defence Procurement – armasuisse – has re-enacted the scene and shown that a shaped charge of 3 g explosive can penetrate a skull emulator [47].
 
12
In military parlance, “lethal” is mostly understood as “destructive”, not explicitly as killing people, as e.g. in the military notions of “target kill” or “mission kill”. The use of the term LAWS for the CCW expert meetings was not intended to exclude weapons directed against matériel or non-lethal weapons (personal communication from Ambassador Jean-Hugues Simon-Michel of France, first chair of the expert meetings).
 
13
See the respective discussion in the life sciences (e.g. [49]) and the wider German Leopoldina-DFG “Joint Committee for the Handling of Security-Relevant Research” [48].
 
14
The Trump administration no longer mentions the offset strategy explicitly, but continues emphasising the need to maintain “decisive and sustained U.S. military advantages” or “overmatch” [56, p. 4], [43, p. 28].
 
15
Russia: “Whoever becomes the leader in this sphere [AI] will become the ruler of the world.” (Putin) [42] China: “[T]he PLA intends to ‘seize the advantage in military competition and the initiative in future warfare,’ seeking the capability to win in not only today’s informatized warfare but also future intelligentized warfare, in which AI and related technologies will be a cornerstone of military power.” [57, p. 13] The USA is more circumspect: “The Trump Administration’s National Security Strategy recognizes the need to lead in artificial intelligence, and the Department of Defense is investing accordingly.” [44].
 
16
As in the case of the Anti-personnel Land Mine Convention (1997), led by Canada, and the Cluster Munitions Convention (2008), led by Norway.
 
References
1.
Bhuta, N., Beck, S., Geiß, R., Liu, H.-Y., Kreß, C. (eds.): Autonomous Weapons Systems. Law, Ethics, Policy. Cambridge University Press, Cambridge (2016)
2.
Scharre, P.: Army of None: Autonomous Weapons and the Future of War. Norton, New York (2018)
8.
Sauer, F., Schörnig, N.: Killer drones – the silver bullet of democratic warfare? Secur. Dialogue 43(4), 353–370 (2012)
15.
Altmann, J.: Präventive Rüstungskontrolle. Die Friedens-Warte 83(2–3), 105–126 (2008)
17.
Altmann, J.: Arms control for armed uninhabited vehicles: an ethical issue. Ethics Inf. Technol. 15(2), 137–152 (2013)
18.
Altmann, J., Sauer, F.: Autonomous weapon systems. Survival 59(5), 117–142 (2017)
32.
Bundesministerium der Verteidigung, Pol II 5: Definitionsentwurf deutsch/englisch: Letales Autonomes Waffensystem. Personal communication (2014)
41.
Arkin, R.C.: Governing Lethal Behavior in Autonomous Robots. Chapman & Hall/CRC, Boca Raton (2009)
50.
Sharkey, N.E.: The evitability of autonomous robot warfare. Int. Rev. Red Cross 94, 787–799 (2012)
54.
Sharkey, N.: Staying in the loop. Human supervisory control of weapons. In: Bhuta, N., Beck, S., Geiß, R., Liu, H., Kreß, C. (eds.) Autonomous Weapons Systems. Law, Ethics, Policy, pp. 23–28. Cambridge University Press, Cambridge (2016)
Metadata
Title
Autonomous Weapon Systems – Dangers and Need for an International Prohibition
Author
Jürgen Altmann
Copyright Year
2019
DOI
https://doi.org/10.1007/978-3-030-30179-8_1
