DOI: 10.1145/2559636.2559663
Research article

Adaptive emotional expression in robot-child interaction

Published: 03 March 2014

ABSTRACT

Expressive behaviour is a vital aspect of human interaction. A model for adaptive emotion expression was developed for the Nao robot. The robot maintains internal arousal and valence values, which are influenced by the emotional state of its interaction partner and by emotional occurrences such as winning a game. It expresses these emotions through its voice, posture, whole-body poses, eye colour and gestures. An experiment with 18 children (mean age 9) and two Nao robots was conducted to study the influence of adaptive emotion expression on children's interaction behaviour and opinions. In a within-subjects design, the children played a quiz with both an affective robot using the model for adaptive emotion expression and a non-affective robot without this model. The affective robot reacted to the emotions of the child through the model's implementation; the child's emotions were interpreted by a Wizard of Oz. The dependent variables, the children's behaviour and opinions, were measured through video analysis and questionnaires. The results show that children react more expressively and more positively to a robot that adaptively expresses itself than to a robot that does not. The children's feedback in the questionnaires further suggests that showing emotion through movement is considered a very positive trait for a robot. From their positive reactions we conclude that children enjoy interacting with a robot that adaptively expresses itself through emotion and gesture more than with a robot that does not.
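
The abstract's description of the model suggests a simple update rule: internal arousal and valence drift toward the partner's perceived emotion, jump on emotional occurrences, and are then mapped onto expression channels. The sketch below illustrates that idea in Python. It is a minimal sketch under assumptions: the class name EmotionState, the adaptation rate, and the channel mappings (eye colour, posture openness, voice pitch, gesture speed) are illustrative choices, not the authors' implementation or the NAOqi API.

# Minimal sketch of an arousal-valence emotion model of the kind the abstract
# describes. All names and constants are illustrative assumptions; the mapping
# to Nao behaviours is only indicated, not wired to the robot.

from dataclasses import dataclass


@dataclass
class EmotionState:
    valence: float = 0.0   # -1 (negative) .. +1 (positive)
    arousal: float = 0.0   # -1 (calm) .. +1 (excited)

    @staticmethod
    def _clamp(x: float) -> float:
        return max(-1.0, min(1.0, x))

    def adapt_to_partner(self, partner_valence: float, partner_arousal: float,
                         rate: float = 0.3) -> None:
        """Drift the robot's state toward the perceived emotion of the child
        (in the study this perception came from a Wizard of Oz)."""
        self.valence = self._clamp(self.valence + rate * (partner_valence - self.valence))
        self.arousal = self._clamp(self.arousal + rate * (partner_arousal - self.arousal))

    def apply_event(self, delta_valence: float, delta_arousal: float) -> None:
        """React to an emotional occurrence, e.g. winning a quiz round."""
        self.valence = self._clamp(self.valence + delta_valence)
        self.arousal = self._clamp(self.arousal + delta_arousal)

    def expression_parameters(self) -> dict:
        """Map the internal state to hypothetical expression channels."""
        return {
            "eye_colour": "green" if self.valence >= 0 else "blue",
            "posture_openness": 0.5 + 0.5 * self.valence,   # 0 = closed, 1 = open
            "voice_pitch_shift": 0.2 * self.arousal,         # relative pitch change
            "gesture_speed": 1.0 + 0.5 * self.arousal,       # relative speed factor
        }


if __name__ == "__main__":
    robot = EmotionState()
    robot.adapt_to_partner(partner_valence=0.8, partner_arousal=0.6)  # child looks happy
    robot.apply_event(delta_valence=0.3, delta_arousal=0.2)           # robot wins a round
    print(robot, robot.expression_parameters())

In this sketch the partner adaptation is a smoothing step while events are additive impulses, so the state both tracks the child and reacts to the game; how the paper actually combines the two is described in the full text.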


Published in

HRI '14: Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction
March 2014, 538 pages
ISBN: 9781450326582
DOI: 10.1145/2559636

      Copyright © 2014 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 3 March 2014


      Acceptance Rates

HRI '14 Paper Acceptance Rate: 32 of 132 submissions, 24%. Overall Acceptance Rate: 242 of 1,000 submissions, 24%.
