ABSTRACT
Expressive behaviour is a vital aspect of human interaction. A model for adaptive emotion expression was developed for the Nao robot. The robot maintains internal arousal and valence values, which are influenced by the emotional state of its interaction partner and by emotional occurrences such as winning a game. It expresses these emotions through its voice, posture, whole-body poses, eye colour and gestures. An experiment with 18 children (mean age 9) and two Nao robots was conducted to study the influence of adaptive emotion expression on the interaction behaviour and opinions of children. In a within-subjects design, the children played a quiz with both an affective robot using the model for adaptive emotion expression and a non-affective robot without this model. The affective robot reacted to the emotions of the child through the implementation of the model; the child's emotions were interpreted by a Wizard-of-Oz operator. The dependent variables, namely the behaviour and opinions of the children, were measured through video analysis and questionnaires. The results show that children react more expressively and more positively to a robot which adaptively expresses itself than to one which does not. The feedback of the children in the questionnaires further suggests that showing emotion through movement is considered a very positive trait for a robot. From their positive reactions we can conclude that children enjoy interacting with a robot which adaptively expresses itself through emotion and gesture more than with a robot which does not.
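The model described above can be pictured as a small update loop over an arousal-valence state (the circumplex model of affect), blended from the partner's perceived emotion and from emotional events, and then mapped to an expression. The following is a minimal sketch under assumed names and constants (`AffectModel`, `decay`, `social_weight`), not the authors' implementation:

```python
# Illustrative sketch of an arousal-valence emotion model as described in the
# abstract. All class/parameter names, constants, and the update rule are
# assumptions for exposition, not the paper's actual implementation.

class AffectModel:
    def __init__(self, decay=0.9, social_weight=0.3):
        self.valence = 0.0   # -1 (negative) .. +1 (positive)
        self.arousal = 0.0   # -1 (calm)     .. +1 (excited)
        self.decay = decay                  # pull toward neutral each step
        self.social_weight = social_weight  # weight of partner's emotion

    def step(self, partner_valence=0.0, partner_arousal=0.0,
             event_valence=0.0, event_arousal=0.0):
        """Blend the partner's perceived emotion and any emotional event
        (e.g. winning a game) into the robot's internal state."""
        self.valence = (self.decay * self.valence
                        + self.social_weight * partner_valence + event_valence)
        self.arousal = (self.decay * self.arousal
                        + self.social_weight * partner_arousal + event_arousal)
        # Clamp both values to the [-1, 1] range of the circumplex axes.
        self.valence = max(-1.0, min(1.0, self.valence))
        self.arousal = max(-1.0, min(1.0, self.arousal))

    def expression(self):
        """Map the circumplex quadrant to a coarse expression label, which a
        robot could realise via voice, posture, eye colour and gesture."""
        if self.valence >= 0:
            return "excited" if self.arousal >= 0 else "content"
        return "angry" if self.arousal >= 0 else "sad"
```

For example, a "won the game" event with positive valence and arousal (`model.step(event_valence=0.8, event_arousal=0.6)`) moves the state into the excited quadrant, while a sad partner expression perceived repeatedly drags it toward low-arousal negative affect.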