Artefactual Agency and Artefactual Moral Agency

Part of the book series: Philosophy of Engineering and Technology (POET, volume 17)

Abstract

This chapter takes as its starting point that artefacts, in combination with humans, constitute human action and social practices, including moral actions and practices. Our concern is with what is regarded as a moral agent in these actions and practices. Ideas about artefactual ontology, artefactual agency, and artefactual moral agency are intertwined. Discourse on artefactual agency and artefactual moral agency seems to draw on three different conceptions of agency. The first has to do with the causal efficacy of artefacts in the production of events and states of affairs. The second can be thought of as acting for or on behalf of another entity; agents are those who perform tasks for others and/or represent others. The third conception of agency has to do with autonomy and is often used to ground discourse on morality and what it means to be human. The causal efficacy and 'acting for' conceptions of agency are used to ground intelligible accounts of artefactual moral agency. Accounts of artefactual moral agency that draw on the autonomy conception of agency, however, are problematic when they use an analogy between human moral autonomy and some aspect of artefacts as the basis for attributing to artefacts the status associated with moral autonomy.


Notes

  1.

    This material is based upon work supported by the National Science Foundation under Grant No. 1058457. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

  2.

    To be sure, refrigerators are more complex than, say, forks, bowls, or hammers, but a complete typology of artefacts would take too long to introduce here. The distinction between computational and non-computational artefacts is addressed later.

  3.

    Johnson and Powers (2008) used the metaphor of computer systems as surrogate agents to tease out the possibility of a form of responsibility for computer systems.

  4.

    Because human agents ‘acting for’ have decision-making latitude and the possibility of renegotiation, trust is an important aspect of human-to-human delegations. Clients must trust that their agents will use their decision-making latitude in accordance with specified constraints. Successful delegation relationships generally build trust, that is, the more an agent acts successfully on a client’s behalf, the more the client is likely to trust the agent in the future. Arguably, trust is involved in person-to-artefact delegations. We trust refrigerators to keep our food cold, search engines to bring us relevant results, etc. We speak not just of trustworthy accountants but also of trustworthy (reliable) computing. Of course, what it means to trust a human agent to act on one’s behalf and what it means to trust an artefact or a technological system to act on one’s behalf are quite different.

References

  • Barad, K. (1996). Meeting the universe halfway: Realism and social constructivism without contradiction. In L. H. Nelson & J. Nelson (Eds.), Feminism, science and the philosophy of science (pp. 161–194). Dordrecht: Kluwer Academic Publishers.

  • Collins, H., & Kusch, M. (1998). The shape of actions. What humans and machines can do. Cambridge, MA: The MIT Press.

  • Heath, J. (2009). The uses and abuses of agency theory. Business Ethics Quarterly, 19(4), 497–528.

  • Johnson, D., & Miller, K. (2008). Un-making artificial moral agents. Ethics and Information Technology, 10(2–3), 123–133.

  • Johnson, D. G., & Powers, T. M. (2008). Computers as surrogate agents. In J. van den Hoven & J. Weckert (Eds.), Information technology and moral philosophy (pp. 251–269). Cambridge: Cambridge University Press.

  • Latour, B. (1992). Where are the missing masses? The sociology of a few mundane artifacts. In W. E. Bijker & J. Law (Eds.), Shaping technology/building society. Studies in sociotechnical change (pp. 225–258). Cambridge, MA: The MIT Press.

  • Law, J., & Hassard, J. (1999). Actor network theory and after. Oxford/Malden: Blackwell Publishers/The Sociological Review.

  • Lee, N., & Brown, S. (1994). Otherness and the actor network. American Behavioral Scientist, 37(6), 772–790.

  • Noorman, M. (2009). Mind the gap: A critique of human/technology analogies in artificial agent discourse. Maastricht: Universitaire Pers Maastricht.

  • Suchman, L. (1998). Human/machine reconsidered. Cognitive Studies, 5(1), 5–13.

  • Suchman, L. (2001). Human/machine reconsidered. Published by the Department of Sociology, Lancaster University at: http://www.comp.lancs.ac.uk/sociology/soc040ls.html


Author information


Correspondence to Deborah G. Johnson.


Copyright information

© 2014 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

Johnson, D.G., Noorman, M. (2014). Artefactual Agency and Artefactual Moral Agency. In: Kroes, P., Verbeek, PP. (eds) The Moral Status of Technical Artefacts. Philosophy of Engineering and Technology, vol 17. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-7914-3_9
