Published in: International Journal of Social Robotics 4/2022

23-11-2021

How does Modality Matter? Investigating the Synthesis and Effects of Multi-modal Robot Behavior on Social Intelligence

Authors: Karen Tatarian, Rebecca Stower, Damien Rudaz, Marine Chamoux, Arvid Kappas, Mohamed Chetouani


Abstract

Multi-modal behavior for social robots is crucial for the robot's perceived social intelligence, its ability to communicate nonverbally, and the extent to which it can be trusted. However, most research to date has examined only one modality at a time, so the effect of each modality when performed as part of a multi-modal interaction remains poorly understood. This study presents a multi-modal interaction focusing on the following modalities: proxemics for social navigation; gaze mechanisms for turn-taking (floor-holding and turn-yielding) and joint attention; kinesics for symbolic, deictic, and beat gestures; and social dialogue. The multi-modal behaviors were evaluated in an experiment in which 105 participants took part in a seven-minute interaction, and their effects on perceived social intelligence were analyzed through both objective and subjective measurements. The results provide several insights into how the modalities in a multi-modal interaction affect users' behavioral outcomes, including taking physical suggestions, the distances maintained during the interaction, wave gestures performed at greeting and closing, back-channeling, and how socially the robot is treated, while showing no effect on self-disclosure or subjective liking.
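
As a concrete illustration of how such modality channels can be bundled, the sketch below shows one minimal, hypothetical way to represent a multi-modal behavior as a set of time-stamped cues (proxemics, gaze, kinesics, dialogue) scheduled around a single utterance. All names in it are illustrative assumptions rather than the authors' implementation; the actual system is available in the GitHub repository cited in footnote 2.

# Minimal illustrative sketch (not the authors' implementation): bundling
# cues from the four modality channels named in the abstract around one
# utterance. All class and function names here are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ModalityCue:
    """One time-stamped cue on a single channel, e.g. a gaze shift or a beat gesture."""
    channel: str                  # "proxemics" | "gaze" | "kinesics" | "dialogue"
    start: float                  # onset relative to the start of the utterance, in seconds
    action: Callable[[], None]    # command dispatched to the robot at that onset

@dataclass
class MultiModalBehavior:
    """A bundle of cues meant to be executed together with one utterance."""
    utterance: str
    cues: List[ModalityCue] = field(default_factory=list)

    def schedule(self) -> List[ModalityCue]:
        # Sort cues by onset so navigation, gesture, and gaze commands are
        # dispatched in temporal order alongside the speech.
        return sorted(self.cues, key=lambda c: c.start)

# Hypothetical usage: a greeting combining a proxemic approach, dialogue,
# a deictic gesture, and turn-yielding gaze.
greeting = MultiModalBehavior(
    utterance="Hello! The exhibit is right over there.",
    cues=[
        ModalityCue("proxemics", 0.0, lambda: print("approach to a social distance")),
        ModalityCue("dialogue", 0.5, lambda: print("start speaking the utterance")),
        ModalityCue("kinesics", 1.2, lambda: print("deictic point toward the exhibit")),
        ModalityCue("gaze", 2.0, lambda: print("turn-yielding gaze at the listener")),
    ],
)

for cue in greeting.schedule():
    cue.action()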


Footnotes

1. Further subjective measurements, referring to the comparison between self-reported attitudes and behaviors towards social robots, will be examined elsewhere.

2. Multi-modal Social Cues System Implementation GitHub repository: https://github.com/KarenTatarian/multimodal_socialcues

Metadata
Title
How does Modality Matter? Investigating the Synthesis and Effects of Multi-modal Robot Behavior on Social Intelligence
Authors
Karen Tatarian
Rebecca Stower
Damien Rudaz
Marine Chamoux
Arvid Kappas
Mohamed Chetouani
Publication date
23-11-2021
Publisher
Springer Netherlands
Published in
International Journal of Social Robotics / Issue 4/2022
Print ISSN: 1875-4791
Electronic ISSN: 1875-4805
DOI
https://doi.org/10.1007/s12369-021-00839-w
