
2018 | OriginalPaper | Chapter

6. Analysis of Head Motions and Speech, and Head Motion Control in an Android Robot

Authors : Carlos Toshinori Ishi, Hiroshi Ishiguro, Norihiro Hagita

Published in: Geminoid Studies

Publisher: Springer Singapore


Abstract

With the aim of automatically generating head motions during speech utterances, analyses are conducted to verify the relations between head motions and the linguistic and paralinguistic information carried by speech. Motion-capture data are recorded during natural dialogues, and head rotation angles are estimated from the head marker data. The results show that nods occur frequently during speech utterances, not only to express specific dialogue acts such as agreement and affirmation, but also to mark syntactic or semantic units, appearing at the last syllable of a phrase at strong phrase boundaries. The dependence of other head motions, including shakes and tilts, on linguistic, prosodic, and voice-quality information is also analyzed, and the potential for exploiting these relations to automatically generate head motions is discussed. Intra-speaker and inter-speaker variability in the relations between head motions and dialogue acts are also analyzed. Finally, a method for controlling the head actuators of an android robot based on the estimated rotation angles is proposed, and the mapping from human head motions is evaluated.
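The abstract mentions estimating head rotation angles from motion-captured marker data. As a rough illustration only (this is not the authors' actual procedure; the marker placement, axis conventions, and function names below are assumptions), rotation angles such as yaw, pitch, and roll can be recovered by building an orthonormal head coordinate frame from three markers and comparing it against a reference pose:

```python
import numpy as np

def head_frame(forehead, left_ear, right_ear):
    """Build an orthonormal head coordinate frame (3x3 matrix) from
    three marker positions. Marker names are illustrative."""
    x = right_ear - left_ear                  # lateral axis (left -> right)
    x /= np.linalg.norm(x)
    f = forehead - (left_ear + right_ear) / 2.0  # rough frontal direction
    z = np.cross(x, f)                        # vertical axis (up)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                        # orthogonalized frontal axis
    return np.column_stack([x, y, z])

def head_rotation_angles(ref_markers, cur_markers):
    """Estimate yaw/pitch/roll (degrees) of the current head pose
    relative to a reference (neutral) pose."""
    R_ref = head_frame(*ref_markers)
    R_cur = head_frame(*cur_markers)
    R = R_cur @ R_ref.T                       # relative rotation matrix
    # Z-Y-X (yaw-pitch-roll) Euler-angle extraction
    yaw   = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-np.clip(R[2, 0], -1.0, 1.0)))
    roll  = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return yaw, pitch, roll
```

Angles computed this way per frame could then serve both for motion analysis (e.g., detecting the pitch excursions of a nod) and as targets for an android's head actuators, as the chapter proposes.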


Metadata
Title
Analysis of Head Motions and Speech, and Head Motion Control in an Android Robot
Authors
Carlos Toshinori Ishi
Hiroshi Ishiguro
Norihiro Hagita
Copyright Year
2018
Publisher
Springer Singapore
DOI
https://doi.org/10.1007/978-981-10-8702-8_6