
Part of the book series: Philosophical Studies Series (PSSP, volume 122)

Abstract

Understanding others’ intentions, and representing them as able to understand intentions, are relevant factors in cooperation, as is the ability to represent shared goals and coordinated action plans (joint intentions). To endow artificial systems with cooperative functionality, they must be able to adopt the goals of another individual and act together with that individual to achieve them. Such systems may be embodied as robotic agents or as humanoid agents projected in virtual reality (“embodied cooperative systems”). A central question is how the processes involved interact and how their interplay can be modeled. For example, inter-agent cooperation relies heavily on common ground, i.e. the mutually shared knowledge of the interlocutors. Nonverbal behaviors such as gaze and gestures are important means of coordinating attention between interlocutors (joint attention) in the pursuit of goals. In cooperative settings, the view of humans as users of a “tool” has shifted toward a “partnership” with artificial agents, insofar as these agents can be considered able to take initiative as autonomous entities. This chapter outlines these ideas, taking the virtual humanoid agent “Max” as an example.
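To make the notion of goal adoption more concrete, the following minimal Python sketch illustrates how a BDI-style agent (cf. Note 2 and Rao and Georgeff 1991) might record a partner’s goal in common ground and adopt it as a joint intention. This is an illustration under simplifying assumptions, not the architecture of Max; the class and method names (CooperativeAgent, adopt_goal, and so on) are hypothetical.

```python
# Illustrative BDI-style sketch of cooperative goal adoption.
# Hypothetical names and deliberately simplified logic; this is NOT
# the implementation of the "Max" agent described in the chapter.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Goal:
    description: str


@dataclass
class CooperativeAgent:
    name: str
    beliefs: set = field(default_factory=set)        # facts the agent takes to be true
    common_ground: set = field(default_factory=set)  # facts assumed to be mutually known
    intentions: list = field(default_factory=list)   # goals the agent is committed to

    def perceive_partner_goal(self, partner: str, goal: Goal) -> None:
        """Record a partner's goal both as a private belief and as common ground."""
        fact = (partner, "has_goal", goal)
        self.beliefs.add(fact)
        self.common_ground.add(fact)

    def adopt_goal(self, partner: str, goal: Goal) -> bool:
        """Adopt the partner's goal as a (joint) intention if it is believed and not yet adopted."""
        if (partner, "has_goal", goal) in self.beliefs and goal not in self.intentions:
            self.intentions.append(goal)
            return True
        return False


if __name__ == "__main__":
    max_agent = CooperativeAgent("Max")
    assemble = Goal("assemble the propeller of a toy airplane")
    max_agent.perceive_partner_goal("human", assemble)
    if max_agent.adopt_goal("human", assemble):
        print(f"{max_agent.name} adopts the joint goal: {assemble.description}")
```

In a full BDI interpreter, an adopted goal would additionally be weighed against the agent’s own desires and existing commitments before becoming an intention; the sketch omits that deliberation step.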


Notes

  1. The aspect of “mutual benefit” often included in definitions of cooperation is not taken up here because artificial systems do not seem to have genuine interests that could benefit from cooperation; see Stephan et al. (2008).

  2. The BDI approach comes from Michael Bratman (Bratman 1987); one of its fundamentals can be traced back to the work of Daniel Dennett (Dennett 1987) on the behavior of intentional systems.

  3. Dialogue translated from German to English.

  4. On how to configure an artificial agent so as to enable it to adopt a first-person perspective, see Wachsmuth (2008).

  5. For further detail, the formal definition of joint attention, and the specification of epistemic actions that lead to the respective beliefs and goals, see Pfeiffer-Leßmann and Wachsmuth (2009).

  6. Note that, although we have attempted to build a coherent, comprehensive system, not all aspects described in this chapter have been integrated into a single system; rather, different versions of “Max” were used to explore the above ideas in implemented systems.

References

  • Becker, Christian, Stefan Kopp, and Ipke Wachsmuth. 2004. Simulating the emotion dynamics of a multimodal conversational agent. In Affective dialogue systems, ed. E. André, L. Dybkjaer, W. Minker, and P. Heisterkamp, 154–165. Berlin: Springer.


  • Boukricha, Hana, Nhung Nguyen, and Ipke Wachsmuth. 2011. Sharing emotions and space – Empathy as a basis for cooperative spatial interaction. In Proceedings of the 11th international conference on intelligent virtual agents (IVA 2011), ed. S. Kopp, S. Marsella, K. Thorisson, and H.H. Vilhjalmsson, 350–362. Berlin: Springer.


  • Bratman, Michael E. 1987. Intention, plans, and practical reason. Cambridge, MA: Harvard University Press.


  • Bratman, Michael E. 1992. Shared cooperative activity. Philosophical Review 101(2): 327–341.


  • Breazeal, Cynthia, Andrew Brooks, David Chilongo, Jesse Gray, Guy Hoffman, Cory Kidd, Hans Lee, Jeff Lieberman, and Andrea Lockerd. 2004. Working collaboratively with humanoid robots. In Proceedings of humanoids 2004, Los Angeles.


  • Brinck, Ingar. 2003. The objects of attention. In Proceedings of ESPP 2003 (European Society of Philosophy and Psychology), Torino, 9–12 July 2003, 1–4.


  • Cassell, Justine, J. Sullivan, S. Prevost, and E. Churchill (eds.). 2000. Embodied conversational agents. Cambridge, MA: MIT Press.


  • Deák, Gedeon O., Ian Fasel, and Javier Movellan. 2001. The emergence of shared attention: Using robots to test developmental theories. In Proceedings of the first international workshop on epigenetic robotics, Lund University Cognitive Studies, vol. 85, 95–104.


  • Dennett, Daniel C. 1987. The intentional stance. Cambridge, MA: MIT Press.


  • Dretske, Fred I. 2006. Minimal rationality. In Rational animals? ed. Susan L. Hurley and Matthew Nudds, 107–116. Oxford: Oxford University Press.


  • Kaplan, Frédéric, and Verena V. Hafner. 2006. The challenges of joint attention. Interaction Studies 7(2): 135–169.


  • Kopp, Stefan, and Ipke Wachsmuth. 2004. Synthesizing multimodal utterances for conversational agents. Computer Animation and Virtual Worlds 15: 39–52.


  • Kopp, Stefan, Lars Gesellensetter, Nicole C. Krämer, and Ipke Wachsmuth. 2005. A conversational agent as museum guide – Design and evaluation of a real-world application. In Intelligent virtual agents, ed. Themis Panayiotopoulos, Jonathan Gratch, Ruth Aylett, Daniel Ballin, Patrick Olivier, and Thomas Rist, 329–343. Berlin: Springer.


  • Krämer, Nicole C. 2008. Theory of mind as a theoretical prerequisite to model communication with virtual humans. In Modeling communication with robots and virtual humans, ed. Ipke Wachsmuth and Günther Knoblich, 222–240. Berlin: Springer.


  • Leßmann, Nadine, Stefan Kopp, and Ipke Wachsmuth. 2006. Situated interaction with a virtual human – Perception, action, and cognition. In Situated communication, ed. Gert Rickheit and Ipke Wachsmuth, 287–323. Berlin: Mouton de Gruyter.


  • Mattar, Nikita, and Ipke Wachsmuth. 2014. Let’s get personal: Assessing the impact of personal information in human-agent conversations. In Human-computer interaction, ed. M. Kurosu, 450–461. Berlin: Springer.


  • Negrotti, Massimo. 2005. Humans and naturoids: From use to partnerships. In Yearbook of the artificial, Cultural dimensions of the user, vol. 3, ed. Massimo Negrotti, 9–15. Bern: Peter Lang European Academic Publishers.


  • Pfeiffer-Leßmann, Nadine, and Ipke Wachsmuth. 2009. Formalizing joint attention in cooperative interaction with a virtual human. In KI 2009: Advances in artificial intelligence, ed. B. Mertsching, M. Hund, and Z. Aziz, 540–547. Berlin: Springer.


  • Pfeiffer-Leßmann, Nadine, Thies Pfeiffer, and Ipke Wachsmuth. 2012. An operational model of joint attention – Timing of gaze patterns in interactions between humans and a virtual human. In Proceedings of the 34th annual conference of the Cognitive Science Society, ed. N. Miyake, D. Peebles, and R.P. Cooper, 851–856. Austin: Cognitive Science Society.


  • Poggi, Isabella, and Catherine Pelachaud. 2000. Performative facial expression in animated faces. In Embodied conversational agents, ed. J. Cassell, J. Sullivan, S. Prevost, and E. Churchill, 155–188. Cambridge, MA: MIT Press.


  • Premack, David, and Guy Woodruff. 1978. Does the chimpanzee have a theory of mind? Behavioral and Brain Sciences 1(4): 515–526.


  • Rao, A.S., and M.P. Georgeff. 1991. Modeling rational agents within a BDI-architecture. In Principles of knowledge representation and reasoning, ed. J. Allen, R. Fikes, and E. Sandewall, 473–484. San Mateo: Morgan Kaufmann.


  • Schank, Roger C. 1971. Finding the conceptual content and intention in an utterance in natural language conversation. In Proceedings of IJCAI 1971 (International joint conference on artificial intelligence), London 1–3 Sept 1971, 444–454.


  • Searle, John R., and Daniel Vanderveken. 1985. Foundations of illocutionary logic. Cambridge: Cambridge University Press.


  • Stephan, Achim, Manuela Lenzen, Josep Call, and Matthias Uhl. 2008. Communication and cooperation in living beings and artificial agents. In Embodied communication in humans and machines, ed. Ipke Wachsmuth, Manuela Lenzen, and Günther Knoblich, 179–200. Oxford: Oxford University Press.


  • Tomasello, Michael, Malinda Carpenter, Josep Call, Tanya Behne, and Henrike Moll. 2005. Understanding and sharing intentions: The origins of cultural cognition. Behavioral and Brain Sciences 28: 675–691.


  • Wachsmuth, Ipke. 2008. ‘I, Max’ – Communicating with an artificial agent. In Modeling communication with robots and virtual humans, ed. Ipke Wachsmuth and Günther Knoblich, 279–295. Berlin: Springer.


  • Wooldridge, Michael. 2002. An introduction to multiagent systems. Chichester: Wiley.



Acknowledgments

The research reported here draws on contributions by the members of Bielefeld University’s artificial intelligence group, which are hereby gratefully acknowledged. Thanks also to Catrin Misselhorn and an anonymous referee for helpful comments that improved the text. Over many years this research has been supported by the Deutsche Forschungsgemeinschaft (DFG) through the Collaborative Research Centers 360 (Situated Artificial Communicators) and 673 (Alignment in Communication), the Excellence Cluster 277 CITEC (Cognitive Interaction Technology), and by the Heinz Nixdorf MuseumsForum (HNF).


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Wachsmuth, I. (2015). Embodied Cooperative Systems: From Tool to Partnership. In: Misselhorn, C. (eds) Collective Agency and Cooperation in Natural and Artificial Systems. Philosophical Studies Series, vol 122. Springer, Cham. https://doi.org/10.1007/978-3-319-15515-9_4
