
2021 | OriginalPaper | Chapter

Engagement-Based Adaptive Behaviors for Laboratory Guide in Human-Robot Dialogue

Authors : Koji Inoue, Divesh Lala, Kenta Yamamoto, Katsuya Takanashi, Tatsuya Kawahara

Published in: Increasing Naturalness and Flexibility in Spoken Dialogue Interaction

Publisher: Springer Singapore


Abstract

We address an application of engagement recognition in human-robot dialogue. Engagement is defined as the degree to which a user is interested in the current dialogue, and keeping users engaged is important for spoken dialogue systems. In this study, we apply a real-time engagement recognition model to a laboratory guide scenario in which the autonomous android ERICA plays the role of the guide. Based on the user's engagement score, ERICA generates adaptive behaviors consisting of feedback utterances and additional explanations. A subject experiment showed that these adaptive behaviors increased both the engagement score and related subjective scores such as interest and empathy.
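The adaptive policy described in the abstract, mapping a real-time engagement score to either extra guide behaviors or the default explanation, can be sketched as follows. This is a minimal illustration only: the threshold value, function name, and behavior labels are assumptions for exposition, not the values or interface used in the paper's system.

```python
def choose_adaptive_behavior(engagement_score: float, threshold: float = 0.5) -> list[str]:
    """Select the guide robot's next behaviors from the user's engagement score.

    The 0.5 threshold and the behavior labels are illustrative assumptions,
    not the actual parameters of the ERICA laboratory-guide system.
    """
    if engagement_score >= threshold:
        # Engaged user: add a feedback utterance and elaborate with an
        # additional explanation, as the abstract describes.
        return ["feedback_utterance", "additional_explanation"]
    # Otherwise, continue with the default explanation only.
    return ["default_explanation"]
```

In a real system, the engagement score would come from a recognition model running continuously on the user's multimodal behaviors, and the behavior labels would dispatch to the robot's utterance generation.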


Footnotes
1
A demo video (in Japanese) is available at https://youtu.be/53I3lhJ6aUw.
Metadata
Title
Engagement-Based Adaptive Behaviors for Laboratory Guide in Human-Robot Dialogue
Authors
Koji Inoue
Divesh Lala
Kenta Yamamoto
Katsuya Takanashi
Tatsuya Kawahara
Copyright Year
2021
Publisher
Springer Singapore
DOI
https://doi.org/10.1007/978-981-15-9323-9_11