Published in: Ethics and Information Technology 3/2010

01.09.2010

Moral appearances: emotions, robots, and human morality

Author: Mark Coeckelbergh



Abstract

Can we build ‘moral robots’? If morality depends on emotions, the answer seems negative. Current robots do not meet standard necessary conditions for having emotions: they lack consciousness, mental states, and feelings. Moreover, it is not even clear how we might ever establish whether robots satisfy these conditions. Thus, at most, robots could be programmed to follow rules, but it would seem that such ‘psychopathic’ robots would be dangerous since they would lack full moral agency. However, I will argue that in the future we might nevertheless be able to build quasi-moral robots that can learn to create the appearance of emotions and the appearance of being fully moral. I will also argue that this way of drawing robots into our social-moral world is less problematic than it might first seem, since human morality also relies on such appearances.


Footnotes
1
For instance, the Laws seem to limit the range of possible human-robot relations to the master–slave model.
 
2
For contemporary examples of such rules and arguments see Peter Singer’s work.
 
3
I do not agree with this interpretation of Kant. The categorical imperative is not a rule but at best a meta-rule asking us to reason from the moral point of view when we make rules (when we, as autonomous beings, give the rule to ourselves). But as I argued in my book […] this leaves open a lot of space for types of moral reasoning that require the exercise of imaginative and emotional capacities.
 
4
Note that there are tensions between the theoretical traditions mentioned here, for instance between a Humean and a virtue ethics approach (see for instance Foot’s criticism of Hume, Foot 2002), but Nussbaum has managed to reconcile them in an attractive way.
 
5
Influenced by the Stoics, Nussbaum writes that emotions are not just ‘unthinking forces that have no connection with our thoughts, evaluations, or plans’ like ‘the invading currents of some ocean’ (Nussbaum 2001, pp. 26–27) but, by contrast, more like ‘forms of judgment’ that ‘ascribe to certain things and persons outside a person’s own control great importance for the person’s own flourishing.’ This renders emotions acknowledgments of vulnerability and lack of self-sufficiency (Nussbaum 2001, p. 22). Note also that this view is not Stoic but neo-Stoic, since Nussbaum rejects the Stoics’ normative view of the role emotions should have (they evaluated emotions negatively) and revises their account of cognition.
 
6
Note that emotional moral reasoning does not exclude taking into account rules, laws and conventions.
 
7
Given the role of emotions in making moral discriminations, we would not even want ‘psychopathic’ military robots.
 
8
The authors argue that trying to build robots according to the rule-based model (that is, turning the rules into algorithms and building them into robots) cannot succeed, since such ‘commandment’ models face the problem of conflicting rules. Overriding principles based on our moral intuitions do not solve this problem, since those intuitions might not even be universally shared within one culture (Wallach and Allen 2008, p. 84). Moreover, applying deontological and consequentialist theories requires one to gather an enormous amount of information in order to describe the situation and to predict outcomes, which may be hard for computers—and indeed for humans (p. 86). They give further reasons why morality is hard to compute, which is particularly problematic for Bentham-type utilitarian approaches to ethics (pp. 86–91). They also explicitly discuss problems with Asimov’s laws (pp. 91–95) and, more generally, problems with abstract deontological rules, which run into difficulties similar to those of consequentialist theories, since this approach also requires us to predict consequences (pp. 95–97). These problems do not only get roboticists into trouble; they cast doubt on the ambitions of much normative moral theory: they show that (top–down) theory is valuable but has significant limitations.
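The conflicting-rules problem can be sketched in code. The toy rules, names, and scenario below are purely illustrative assumptions of mine, not an implementation from Wallach and Allen; they merely show how a ‘commandment’ model with no overriding principle can issue contradictory verdicts:

```python
# Illustrative sketch: a toy rule-based "moral" agent facing conflicting rules.

def rule_protect_human(action):
    # Forbid actions that harm a human.
    return "forbid" if action["harms_human"] else "permit"

def rule_obey_order(action):
    # Require actions that a human has ordered.
    return "require" if action["ordered"] else "permit"

def evaluate(action, rules):
    verdicts = {rule.__name__: rule(action) for rule in rules}
    # Without an overriding principle, "forbid" and "require" can co-occur,
    # leaving the agent with no decision procedure.
    conflict = "forbid" in verdicts.values() and "require" in verdicts.values()
    return verdicts, conflict

# An ordered action that would harm a human triggers both rules at once.
verdicts, conflict = evaluate(
    {"harms_human": True, "ordered": True},
    [rule_protect_human, rule_obey_order],
)
print(conflict)  # True: the commandment model yields no verdict here
```

Adding a priority ordering over the rules would resolve this particular case, but, as the footnote notes, such overriding principles rest on moral intuitions that are themselves contested.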
 
9
Today there are already robots that have some capacity to learn in and from social interaction, for instance the robot Kismet developed by Cynthia Breazeal at MIT. In a sense, she has ‘raised’ the robot. However, these developments do not come close to human moral and emotional learning.
 
10
The Turing test was proposed by Alan Turing as a test of whether an entity is human or not (Turing 1950).
 
11
These conditions were already proposed by Aristotle and are endorsed by many contemporary writers on freedom and responsibility.
 
12
More generally, there is a kind of virtual intentionality (understood in a phenomenological sense): it appears as if the other is conscious and as if that consciousness is directed to objects.
 
13
Perhaps this helps to explain why Japanese designers are more advanced in making humanoid robots: they tend to understand themselves as imitators of nature rather than creators (‘playing God’), the latter being a more Western idea.
 
14
In animal ethics this demand for consistency is known as ‘the argument from marginal cases’.
 
15
Note that these moral categories constituted (and arguably still constitute) a kind of moral life that is fundamentally asymmetrical. An alternative, symmetrical moral framework would accommodate perceptions and treatment of robots as companions or co-workers. One might also apply other ‘human’ categories to them. However, I will not further discuss this issue here.
 
References
Asimov, I. (1942). Runaround. Astounding Science Fiction, 94–103.
Damasio, A. (1994). Descartes’ error: Emotion, reason, and the human brain. New York: G.P. Putnam’s Sons.
De Sousa, R. (1987). The rationality of emotion. Cambridge, MA: MIT Press.
Foot, P. (2002). Hume on moral judgment. In Virtues and vices. Oxford/New York: Oxford University Press.
Goldie, P. (2000). The emotions: A philosophical exploration. Oxford: Oxford University Press.
Greene, J. D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293(5537), 2105–2108.
Kennett, J. (2002). Autism, empathy and moral agency. The Philosophical Quarterly, 52(208), 340–357.
Merleau-Ponty, M. (1945). Phénoménologie de la perception. Paris: Gallimard.
Nussbaum, M. C. (1990). Love’s knowledge. Oxford: Oxford University Press.
Nussbaum, M. C. (1994). The therapy of desire: Theory and practice in Hellenistic ethics. Princeton: Princeton University Press.
Nussbaum, M. C. (1995). Poetic justice: Literary imagination and public life. Boston: Beacon Press.
Nussbaum, M. C. (2001). Upheavals of thought: The intelligence of emotions. Cambridge: Cambridge University Press.
Prinz, J. (2004). Gut reactions: A perceptual theory of emotion. Oxford: Oxford University Press.
Solomon, R. (1980). Emotions and choice. In A. Rorty (Ed.), Explaining emotions (pp. 81–251). Los Angeles: University of California Press.
Wallach, W., & Allen, C. (2008). Moral machines: Teaching robots right from wrong. Oxford: Oxford University Press.
Metadata
Title
Moral appearances: emotions, robots, and human morality
Author
Mark Coeckelbergh
Publication date
01.09.2010
Publisher
Springer Netherlands
Published in
Ethics and Information Technology / Issue 3/2010
Print ISSN: 1388-1957
Electronic ISSN: 1572-8439
DOI
https://doi.org/10.1007/s10676-010-9221-y
