Published in: Ethics and Information Technology 4/2020

19.07.2017 | Original Paper

Mind the gap: responsible robotics and the problem of responsibility

Author: David J. Gunkel


Abstract

The task of this essay is to respond to the question concerning robots and responsibility—to answer for the way that we understand, debate, and decide who or what is able to answer for decisions and actions undertaken by increasingly interactive, autonomous, and sociable mechanisms. The analysis proceeds through three steps or movements. (1) It begins by critically examining the instrumental theory of technology, which determines the way one typically deals with and responds to the question of responsibility when it involves technology. (2) It then considers three instances where recent innovations in robotics challenge this standard operating procedure by opening gaps in the usual way of assigning responsibility. The innovations considered in this section include: autonomous technology, machine learning, and social robots. (3) The essay concludes by evaluating the three different responses—instrumentalism 2.0, machine ethics, and hybrid responsibility—that have been made in the face of these difficulties in an effort to map out the opportunities and challenges of and for responsible robotics.


Footnotes
1
This effort is informed by and consistent with the overall purpose and aim of philosophy, strictly speaking. Philosophers as different (and, at times, even antagonistic, especially to each other) as Heidegger (1962), Dennett (1996), Moore (2005), and Žižek (2006), have all, at one time or another, described philosophy as a critical endeavor that is more interested in developing questions than in providing definitive answers. “There are,” as Žižek (2006, p. 137) describes it, “not only true or false solutions, there are also false questions. The task of philosophy is not to provide answers or solutions, but to submit to critical analysis the questions themselves, to make us see how the very way we perceive a problem is an obstacle to its solution.” This is the task and objective of the essay—to identify the range of questions regarding responsibility that can and should be asked in the face of recent technological innovation. If, in the end, readers emerge from the experience with more questions—“more” not only in quantity but also (and more importantly) in terms of the quality of inquiry—then it will have been successful and achieved its end.
 
2
Because of the recent proliferation of and popularity surrounding connectionist architecture, neural networks, and machine learning, there are numerous examples from which one could select, including natural language generation (NLG) algorithms, black box trading, computational creativity, self-driving vehicles, and autonomous weapons. In fact, one might have expected this essay to have focused on the latter—autonomous weapons—mainly because of the way the responsibility gap, or what has also been called “the accountability gap,” has been positioned, addressed, and documented in the literature on this subject (Arkin 2009; Asaro 2012; Beard 2014; Hammond 2015; Krishnan 2009; Lokhorst and van den Hoven 2012; Schulzke 2013; Sharkey 2012; Sparrow 2007; Sullins 2010). I have, however, made the deliberate decision to employ other, perhaps more mundane, examples like AlphaGo and Tay.ai. And I have done so for two reasons. First, questions concerning machine autonomy and responsibility, although important for and well-documented in the literature concerning autonomous weapons, are not (and should not be) limited to weapon systems. Recognizing this fact requires that we explicitly identify and consider other domains where these questions appear and are relevant—domains where the issues might be less dramatic but no less significant. Second, and more importantly, I wanted to deal with technologies that are actually in operation and not under development. Despite their popularity in investigations of machine agency and responsibility, autonomous weapons are still somewhat speculative and in development. Rather than address what might happen with technologies that could be developed and deployed, I wanted to address what has happened with technologies that are already here and in operation.
 
3
Just to be clear, the problem with social robots is not that they are or might be capable of becoming moral subjects. The problem is that they are neither instruments nor moral subjects. They occupy an in-between position that effectively blurs the boundary that had typically separated the one from the other. The problem, then, is not that social robots might achieve moral status equal to or on par with human beings. That remains a topic of and for science fiction. The problem is that social robots complicate the way one decides who has moral status and what does not, which is a more difficult/interesting philosophical question. For more on this subject, see Coeckelbergh (2012), Gunkel (2012), and Floridi (2013).
 
4
There is some debate concerning this matter. What Coeckelbergh (2010, p. 236) calls “psychopathy”—e.g. “follow rules but act without fear, compassion, care, and love”—Arkin (2009) celebrates as a considerable improvement in moral processing and decision making. Here is how Sharkey (2012, p. 121) characterizes Arkin’s efforts to develop an “artificial conscience” for robotic soldiers: “It turns out that the plan for this conscience is to create a mathematical decision space consisting of constraints, represented as prohibitions and obligations derived from the laws of war and rules of engagement (Arkin 2009). Essentially this consists of a bunch of complex conditionals (if-then statements)…. Arkin believes that a robot could be more ethical than a human because its ethics are strictly programmed into it, and it has no emotional involvement with the action.” For more on this debate and the effect it has on moral consideration, see Gunkel (2012).
 
References
Anderson, M., & Anderson, S. L. (2007). The status of machine ethics: A report from the AAAI symposium. Minds & Machines, 17(1), 1–10.
Anderson, M., & Anderson, S. L. (2011). Machine ethics. Cambridge: Cambridge University Press.
Arkin, R. C. (2009). Governing lethal behavior in autonomous robots. Boca Raton: CRC Press.
Asaro, P. (2012). On banning autonomous weapon systems: Human rights, automation, and the dehumanization of lethal decision-making. International Review of the Red Cross, 94(886), 687–709.
Beard, J. M. (2014). Autonomous weapons and human responsibilities. Georgetown Journal of International Law, 45(1), 617–681.
Breazeal, C. L. (2004). Designing sociable robots. Cambridge, MA: MIT Press.
Bringsjord, S. (2007). Ethical robots: The future can heed us. AI & Society, 22(4), 539–550.
Brooks, R. A. (2002). Flesh and machines: How robots will change us. New York: Pantheon Books.
Bryson, J. J. (2010). Robots should be slaves. In Y. Wilks (Ed.), Close engagements with artificial companions: Key social, psychological, ethical and design issues (pp. 63–74). Amsterdam: John Benjamins.
Calverley, D. J. (2008). Imagining a non-biological machine as a legal person. AI & Society, 22(4), 523–537.
Coeckelbergh, M. (2010). Moral appearances: Emotions, robots, and human morality. Ethics and Information Technology, 12(3), 235–241.
Coeckelbergh, M. (2012). Growing moral relations: Critique of moral status ascription. New York: Palgrave Macmillan.
Datteri, E. (2013). Predicting the long-term effects of human-robot interaction: A reflection on responsibility in medical robotics. Science and Engineering Ethics, 19(1), 139–160.
Dennett, D. C. (1996). Kinds of minds: Toward an understanding of consciousness. New York: Perseus Books.
Derrida, J. (2005). Paper machine (trans. by R. Bowlby). Stanford, CA: Stanford University Press.
Feenberg, A. (1991). Critical theory of technology. New York: Oxford University Press.
Floridi, L. (2013). The ethics of information. Oxford: Oxford University Press.
French, P. (1979). The corporation as a moral person. American Philosophical Quarterly, 16(3), 207–215.
Gladden, M. E. (2016). The diffuse intelligent other: An ontology of nonlocalizable robots as moral and legal actors. In M. Nørskov (Ed.), Social robots: Boundaries, potential, challenges (pp. 177–198). Burlington, VT: Ashgate.
Gunkel, D. J. (2007). Thinking otherwise: Ethics, technology and other subjects. Ethics and Information Technology, 9(3), 165–177.
Gunkel, D. J. (2012). The machine question: Critical perspectives on AI, robots, and ethics. Cambridge, MA: MIT Press.
Hammond, D. N. (2015). Autonomous weapons and the problem of state accountability. Chicago Journal of International Law, 15(2), 652–687.
Hanson, F. A. (2009). Beyond the skin bag: On the moral responsibility of extended agencies. Ethics and Information Technology, 11(1), 91–99.
Heidegger, M. (1962). Being and time (trans. by John Macquarrie and Edward Robinson). New York: Harper and Row.
Heidegger, M. (1977). The question concerning technology and other essays (trans. by William Lovitt). New York: Harper and Row.
Johnson, D. G. (1985). Computer ethics. Upper Saddle River, NJ: Prentice Hall.
Johnson, D. G. (2006). Computer systems: Moral entities but not moral agents. Ethics and Information Technology, 8(4), 195–204.
Johnson, D. G., & Miller, K. W. (2008). Un-making artificial moral agents. Ethics and Information Technology, 10(2–3), 123–133.
Kant, I. (1963). Duties to animals and spirits. In Lectures on ethics (trans. by L. Infield) (pp. 239–241). New York: Harper and Row.
Keynes, J. M. (2010). Economic possibilities for our grandchildren. In Essays in persuasion (pp. 321–334). New York: Palgrave Macmillan.
Krishnan, A. (2009). Killer robots: Legality and ethicality of autonomous weapons. Burlington: Ashgate.
Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford: Oxford University Press.
Lokhorst, G. J., & van den Hoven, J. (2012). Responsibility for military robots. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robots (pp. 145–155). Cambridge, MA: MIT Press.
Lyotard, J. F. (1993). The postmodern condition: A report on knowledge (trans. by Geoff Bennington and Brian Massumi). Minneapolis, MN: University of Minnesota Press.
Marx, K. (1977). Capital (trans. by Ben Fowkes). New York: Vintage Books.
Matthias, A. (2004). The responsibility gap: Ascribing responsibility for the actions of learning automata. Ethics and Information Technology, 6(3), 175–183.
Moore, G. E. (2005). Principia ethica. New York: Barnes & Noble Books.
Mowshowitz, A. (2008). Technology as excuse for questionable ethics. AI & Society, 22(3), 271–282.
Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2(1), 25–42.
Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. Cambridge: Cambridge University Press.
Ricoeur, P. (2007). Reflections on the just (trans. by David Pellauer). Chicago: University of Chicago Press.
Rosenthal-von der Pütten, A. M., Krämer, N. C., Hoffmann, L., Sobieraj, S., & Eimler, S. C. (2013). An experimental study on emotional reactions towards a robot. International Journal of Social Robotics, 5(1), 17–34.
Schulzke, M. (2013). Autonomous weapons and distributed responsibility. Philosophy & Technology, 26(2), 203–219.
Sharkey, N. (2012). Killing made easy: From joysticks to politics. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robots (pp. 111–128). Cambridge, MA: MIT Press.
Singer, P. (1975). Animal liberation: A new ethics for our treatment of animals. New York: New York Review Book.
Singer, P. W. (2009). Wired for war: The robotics revolution and conflict in the twenty-first century. New York: Penguin Books.
Siponen, M. (2004). A pragmatic evaluation of the theory of information ethics. Ethics and Information Technology, 6(4), 279–290.
Sparrow, R. (2007). Killer robots. Journal of Applied Philosophy, 24(1), 62–77.
Stahl, B. C. (2006). Responsible computers? A case for ascribing quasi-responsibility to computers independent of personhood or agency. Ethics and Information Technology, 8(4), 205–213.
Sullins, J. P. (2006). When is a robot a moral agent? International Review of Information Ethics, 6(12), 23–30.
Sullins, J. P. (2010). Robowarfare: Can robots be more ethical than humans on the battlefield? Ethics and Information Technology, 12(3), 263–275.
Suzuki, Y., Galli, L., Ikeda, A., Itakura, S., & Kitazaki, M. (2015). Measuring empathy for human and robot hand pain using electroencephalography. Scientific Reports, 5(1), 15924. doi:10.1038/srep15924.
Turing, A. (1999). Computing machinery and intelligence. In P. A. Meyer (Ed.), Computer media and communication: A reader (pp. 37–58). Oxford: Oxford University Press.
van de Poel, I., Nihlén Fahlquist, J., Doorn, N., Zwart, S., & Royakkers, L. (2012). The problem of many hands: Climate change as an example. Science and Engineering Ethics, 18(1), 49–67.
Verbeek, P. P. (2011). Moralizing technology: Understanding and designing the morality of things. Chicago: University of Chicago Press.
Wagenaar, W. A., & Groenewegen, J. (1987). Accidents at sea: Multiple causes and impossible consequences. International Journal of Man-Machine Studies, 27, 587–598.
Wallach, W. (2015). A dangerous master: How to keep technology from slipping beyond our control. New York: Basic Books.
Wallach, W., & Allen, C. (2009). Moral machines: Teaching robots right from wrong. Oxford: Oxford University Press.
Wiener, N. (1988). The human use of human beings: Cybernetics and society. Boston: Da Capo Press.
Winner, L. (1977). Autonomous technology: Technics-out-of-control as a theme in political thought. Cambridge, MA: MIT Press.
Winograd, T. (1990). Thinking machines: Can there be? Are we? In D. Partridge & Y. Wilks (Eds.), The foundations of artificial intelligence: A sourcebook (pp. 167–189). Cambridge: Cambridge University Press.
Žižek, S. (2006). Philosophy, the “Unknown Knowns,” and the public use of reason. Topoi, 25(1–2), 137–142.
Metadata
Title
Mind the gap: responsible robotics and the problem of responsibility
Author
David J. Gunkel
Publication date
19.07.2017
Publisher
Springer Netherlands
Published in
Ethics and Information Technology / Issue 4/2020
Print ISSN: 1388-1957
Elektronische ISSN: 1572-8439
DOI
https://doi.org/10.1007/s10676-017-9428-2
