ABSTRACT
Human communication involves a number of nonverbal cues that are seemingly unintentional, unconscious, and automatic, both in their production and their perception, and that convey rich information about an individual's emotional state and intentions. One family of such cues is called "nonverbal leakage." In this paper, we explore whether people can read nonverbal leakage cues, particularly gaze cues, in humanlike robots and make inferences about the robots' intentions, and whether the physical design of the robot affects these inferences. We designed a gaze cue for Geminoid, a highly humanlike android, and for Robovie, a robot with stylized, abstract humanlike features, that allowed the robots to "leak" information about what they might have in mind. In a controlled laboratory experiment, we asked participants to play a guessing game with one of the two robots and evaluated how the gaze cue affected participants' task performance. We found that the gaze cue did, in fact, lead to better performance, from which we infer that the cue led to attributions of mental states and intentionality. Our results have implications for robot design, particularly for designing expressions of intentionality, and for our understanding of how people respond to human social cues when they are enacted by robots.
INDEX TERMS
- Nonverbal leakage in robots: communication of intentions through seemingly unintentional behavior