
24.09.2018 | Original Paper

Why robots should not be treated like animals

Authors: Deborah G. Johnson, Mario Verdicchio

Published in: Ethics and Information Technology | Issue 4/2018

Abstract

Responsible Robotics is about developing robots in ways that take their social implications into account, which includes conceptually framing robots and their role in the world accurately. We are now in the process of incorporating robots into our world and we are trying to figure out what to make of them and where to put them in our conceptual, physical, economic, legal, emotional and moral world. How humans think about robots, especially humanoid social robots, which elicit complex and sometimes disconcerting reactions, is not predetermined. The animal–robot analogy is one of the most commonly used in attempting to frame interactions between humans and robots and it also tends to push in the direction of blurring the distinction between humans and machines. We argue that, despite some shared characteristics, when it comes to thinking about the moral status of humanoid robots, legal liability, and the impact of treatment of humanoid robots on how humans treat one another, analogies with animals are misleading.

Footnotes
1
Many roboticists talk about robots “feeling” or “sensing” the environment because these machines are endowed with sensors, but such talk is metaphorical.
 
2
Some scholars, like Solaiman, turn to animals as a model to deny that robots should be granted personhood (Solaiman 2017). Solaiman uses a case in which a judge denied personhood to chimpanzees to argue against conferring legal personhood on robots. This all-or-nothing approach to personhood (either animals and robots have all the rights and duties connected with personhood or they have none) may be too coarse-grained for our analysis, since it leaves unexplained why there are laws against animal cruelty even though animals are not considered persons.
 
3
Strict liability means liability that does not depend on intent to do harm or on negligence: one is liable even if one had no ill intent and took precautions to prevent the harm. Negligence, by contrast, involves failure to take proper care in doing something.
 
4
In a later paper, Kelley et al. (2010) further modify the Robots as Animals framework by specifying that the important distinction is between robots that are dangerous and robots that are safe. Using an analogy with dangerous dogs, they suggest that bans or restrictions might be appropriate for dangerous robots.
 
5
Currently there are a number of codes or standards for robotics, such as the EPSRC Principles of Robotics, that point in this direction but are not specific. For example, the fourth rule in EPSRC’s Principles for Designers, Builders, and Users of Robots states: “Robots are manufactured artefacts. They should not be designed in a deceptive way to exploit vulnerable users; instead their machine nature should be transparent.” (EPSRC 2010).
 
References
Anderson, C. A. (1997). Effects of violent movies and trait hostility on hostile feelings and aggressive thoughts. Aggressive Behavior, 23, 161–178.
Anderson, M., & Anderson, S. L. (Eds.). (2011). Machine ethics. Cambridge: Cambridge University Press.
Asaro, P. M. (2012). A body to kick, but still no soul to damn: Legal perspectives on robotics. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics. Cambridge: MIT Press.
Asaro, P. M. (2016). The liability problem for autonomous artificial agents. In Ethical and Moral Considerations in Non-Human Agents, 2016 AAAI Spring Symposium Series.
Ashrafian, H. (2015). Artificial intelligence and robot responsibilities: Innovating beyond rights. Science and Engineering Ethics, 21(2), 317–326.
Asimov, I. (1993). Forward the foundation. London: Doubleday.
Borenstein, J., & Pearson, Y. (2010). Robot caregivers: Harbingers of expanded freedom for all? Ethics and Information Technology, 12(3), 277–288.
Bryson, J. J., Diamantis, M. E., & Grant, T. D. (2017). Of, for, and by the people: The legal lacuna of synthetic persons. Artificial Intelligence and Law, 25, 273–291.
Bushman, B. J., & Anderson, C. A. (2009). Comfortably numb: Desensitizing effects of violent media on helping others. Psychological Science, 20(3), 273–277.
Calverley, D. J. (2006). Android science and animal rights: Does an analogy exist? Connection Science, 18(4), 403–417.
Calverley, D. J. (2005). Android science and the animal rights movement: Are there analogies? In Cognitive Sciences Society Workshop, Stresa, Italy, pp. 127–136.
Chilvers, J. (2013). Reflexive engagement? Actors, learning, and reflexivity in public dialogue on science and technology. Science Communication, 35(3), 283–310.
Chin, M., Sims, V., Clark, B., & Lopez, G. (2004). Measuring individual differences in anthropomorphism toward machines and animals. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 48, pp. 1252–1255.
Coeckelbergh, M. (2010). Robot rights? Towards a social-relational justification of moral consideration. Ethics and Information Technology, 12(3), 209–221.
Darling, K. (2016). Extending legal protection to social robots: The effects of anthropomorphism, empathy, and violent behavior towards robotic objects. In R. Calo, A. M. Froomkin & I. Kerr (Eds.), Robot law. Cheltenham: Edward Elgar.
Delvaux, M. (2016). Draft report with recommendations to the Commission on Civil Law Rules on Robotics. European Parliament Committee on Legal Affairs Report 2015/2103 (INL).
Dick, P. K. (1968). Do androids dream of electric sheep? London: Doubleday.
Elbogen, E. B., Johnson, S. C., Wagner, H. R., Sullivan, C., Taft, C. T., & Beckham, J. C. (2014). Violent behaviour and post-traumatic stress disorder in US Iraq and Afghanistan veterans. The British Journal of Psychiatry, 204(5), 368–375.
Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114(4), 864–886.
Eyssel, F., Kuchenbrandt, D., Bobinger, S., De Ruiter, L., & Hegel, F. (2012). “If you sound like me, you must be more human”: On the interplay of robot and user features on human–robot acceptance and anthropomorphism. In Proceedings of the 7th Annual ACM/IEEE International Conference on Human–Robot Interaction (HRI’12), pp. 125–126.
Ford, M. (2015). The rise of the robots: Technology and the threat of mass unemployment. London: Oneworld Publications.
Fussell, S. R., Kiesler, S., Setlock, L. D., & Yew, V. (2008). How people anthropomorphize robots. In Proceedings of the 3rd ACM/IEEE International Conference on Human–Robot Interaction (HRI 2008), pp. 145–152.
Garland, A. (2015). Ex Machina [Motion Picture]. Universal City: Universal Pictures.
Gentner, D., & Forbus, K. D. (2011). Computational models of analogy. WIREs Cognitive Science, 2, 266–276.
Gibson, W. (1996). Idoru. New York: Viking Press.
Glas, D. F., Minato, T., Ishi, C. T., Kawahara, T., & Ishiguro, H. (2016). ERICA: The ERATO Intelligent Conversational Android. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 22–29.
Grodzinsky, F. S., Miller, K. W., & Wolf, M. J. (2015). Developing automated deceptions and the impact on trust. Philosophy & Technology, 28(1), 91–105.
Gunkel, D. J. (2012). The machine question. Cambridge: MIT Press.
Gunkel, D. J. (2014). A vindication of the rights of machines. Philosophy & Technology, 27(1), 113–132.
Hauskeller, M. (2016). Mythologies of transhumanism. Basingstoke: Palgrave Macmillan.
Hogan, K. (2017). Is the machine question the same question as the animal question? Ethics and Information Technology, 19, 29–38.
Holyoak, K. J., & Koh, K. (1987). Surface and structural similarity in analogical transfer. Memory & Cognition, 15, 332–340.
Johnson, D. G., & Verdicchio, M. (2017). AI anxiety. Journal of the Association for Information Science and Technology, 68(9), 2267–2270.
Jonze, S. (2013). Her [Motion Picture]. Burbank: Warner Bros.
Kant, I. (1997). Lectures on ethics (P. Heath & J. B. Schneewind, Eds., P. Heath, Trans.). Cambridge: Cambridge University Press.
Kelley, R., Schaerer, E., Gomez, M., & Nicolescu, M. (2010). Liability in robotics: An international perspective on robots as animals. Advanced Robotics, 24(13), 1861–1871.
Kuehn, J., & Haddadin, S. (2017). An artificial robot nervous system to teach robots how to feel pain and reflexively react to potentially damaging contacts. IEEE Robotics and Automation Letters, 2(1), 72–79.
Kurzweil, R. (2005). The Singularity is near: When humans transcend biology. London: Penguin Books.
Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge: Harvard University Press.
Levy, D. (2008). Love and sex with robots. New York: Harper Perennial.
Levy, D. (2009). The ethical treatment of artificially conscious robots. International Journal of Social Robotics, 1(3), 209–216.
Lin, P., Abney, K., & Bekey, G. A. (2011). Robot ethics: The ethical and social implications of robotics. Cambridge: MIT Press.
MacLennan, B. (2013). Cruelty to robots? The hard problem of robot suffering. In Proceedings of the 2013 Meeting of the International Association for Computing and Philosophy (IACAP).
MacManus, D., Rona, R., Dickson, H., Somaini, G., Fear, N., & Wessely, S. (2015). Aggressive and violent behavior among military personnel deployed to Iraq and Afghanistan: Prevalence and link with deployment and combat exposure. Epidemiologic Reviews, 37(1), 196–212.
Markey, P. M., French, J. E., & Markey, C. N. (2014). Violent movies and severe acts of violence: Sensationalism versus science. Human Communication Research, 41(2), 155–173.
McNally, P., & Inayatullah, S. (1988). The rights of robots: Technology, culture and law in the 21st century. Futures, 20(2), 119–136.
Metzinger, T. (2013). Two principles for robot ethics. In E. Hilgendorf & J.-P. Günther (Eds.), Robotik und Gesetzgebung (pp. 247–286). Baden-Baden: Nomos.
Moore, A. (1989). V for Vendetta. Burbank: DC Comics.
Mori, M. (1970). The uncanny valley. Energy, 7(4), 33–35.
Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19(2), 98–100.
Novaco, R. W., & Chemtob, C. M. (2015). Violence associated with combat-related posttraumatic stress disorder: The importance of anger. Psychological Trauma: Theory, Research, Practice, and Policy, 7(5), 485.
Owen, R., Stilgoe, J., Macnaghten, P., Gorman, M., Fisher, E., & Guston, D. (2013). A framework for responsible innovation. In R. Owen, J. Bessant & M. Heintz (Eds.), Responsible innovation: Managing the responsible emergence of science and innovation in society. Chichester: Wiley.
Parisi, D. (2014). Future robots: Towards a robotic science of human beings. Amsterdam: John Benjamins Publishing.
Perkowitz, S. (2004). Digital people: From bionic humans to androids. Washington: Joseph Henry Press.
Ramey, C. H. (2005). “For the sake of others”: The “personal” ethics of human–android interaction. In Toward Social Mechanisms of Android Science: A CogSci 2005 Workshop, July 25–26, Stresa, Italy, pp. 137–148.
Robertson, J. (2014). Human rights vs. robot rights: Forecasts from Japan. Critical Asian Studies, 46(4), 571–598.
Ross, B. H. (1989). Distinguishing types of superficial similarities: Different effects on the access and use of earlier problems. Journal of Experimental Psychology: Learning, Memory and Cognition, 5, 456–468.
Schaerer, E., Kelley, R., & Nicolescu, M. (2009). Robots as animals: A framework for liability and responsibility in human–robot interactions. In RO-MAN 2009: The 18th IEEE International Symposium on Robot and Human Interactive Communication, pp. 72–77. IEEE.
Schmidt, C. T. A. (2008). Redesigning man? In P. E. Vermaas, P. Kroes, A. Light & S. A. Moore (Eds.), Philosophy and design: From engineering to architecture (pp. 209–216). New York: Springer.
Sharkey, A., & Sharkey, N. (2012). Granny and the robots: Ethical issues in robot care for the elderly. Ethics and Information Technology, 14(1), 27–40.
Sharkey, N., & Sharkey, A. (2010). The crying shame of robot nannies: An ethical appraisal. Interaction Studies, 11(2), 161–190.
Solaiman, S. M. (2017). Legal personality of robots, corporations, idols and chimpanzees: A quest for legitimacy. Artificial Intelligence and Law, 25, 155–179.
Spellman, B. A., & Holyoak, K. J. (1996). Pragmatics in analogical mapping. Cognitive Psychology, 31, 307–346.
Spennemann, D. H. R. (2007). Of great apes and robots: Considering the future(s) of cultural heritage. Futures, 39(7), 861–877.
Sullins, J. P. (2006). When is a robot a moral agent? International Review of Information Ethics, 6(12), 23–30.
Sullins, J. P. (2011). When is a robot a moral agent? In M. Anderson & S. L. Anderson (Eds.), Machine ethics. Cambridge: Cambridge University Press.
van Rysewyk, S. (2014). Robot pain. International Journal of Synthetic Emotions, 4(2), 22–33.
Metadata
Title
Why robots should not be treated like animals
Authors
Deborah G. Johnson
Mario Verdicchio
Publication date
24.09.2018
Publisher
Springer Netherlands
Published in
Ethics and Information Technology / Issue 4/2018
Print ISSN: 1388-1957
Electronic ISSN: 1572-8439
DOI
https://doi.org/10.1007/s10676-018-9481-5
