Published in: Artificial Life and Robotics 4/2021

16-09-2021 | Original Article

Empathetic robot evaluation through emotion estimation analysis and facial expression synchronization from biological information

Authors: Peeraya Sripian, Muhammad Nur Adilin Mohd Anuardi, Yushun Kajihara, Midori Sugaya

Abstract

Empathy is an important factor in human communication. For a robot to express a matching emotion in human–robot communication, it must be able to understand human feelings. In this study, we therefore aimed to improve the human impression of a robot that displays human-like expressions by synchronizing with human biological information and changing its expressions in real time. We first measured and estimated human emotion using estimation methods based on biological information (brain waves and heartbeats). Three emotion estimation methods were proposed and evaluated in a preliminary experiment; the method that yielded the highest impression rating, based on the emotional value in each cycle, was chosen for the second experiment. We then developed a robot that shows expressions in two patterns: (1) synchronized emotion (the same emotion as the subject conveyed) and (2) emotion inverse to the human's. Subjects evaluated the robot's expressions in both patterns using the semantic differential (SD) method while their biological information was measured with the selected emotion estimation method from the preliminary experiment. The SD evaluation and biological information results showed that when a human experienced happiness and the robot synchronized and expressed the same emotion, intimacy between the human and the robot increased. Thus, the impression created by the robot's expressions can be improved using biological information.
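The pipeline the abstract describes — estimating an emotion from biological signals and then choosing a synchronized or inversed robot expression — can be sketched roughly as follows. This is a minimal illustration, assuming emotion is classified by quadrant on a valence–arousal plane (Russell's circumplex model) and that "inversed" means mirroring both axes; the function names, labels, and thresholds are illustrative assumptions, not the authors' exact method.

```python
def estimate_emotion(valence, arousal):
    """Classify a (valence, arousal) point into a circumplex quadrant."""
    if valence >= 0 and arousal >= 0:
        return "happy"      # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry"      # negative valence, high arousal
    if valence < 0:
        return "sad"        # negative valence, low arousal
    return "relaxed"        # positive valence, low arousal

# Inversed emotion: mirror both circumplex axes (diagonally opposite quadrant).
INVERSE = {"happy": "sad", "sad": "happy", "angry": "relaxed", "relaxed": "angry"}

def robot_expression(valence, arousal, synchronized=True):
    """Pick the facial expression the robot should display."""
    emotion = estimate_emotion(valence, arousal)
    return emotion if synchronized else INVERSE[emotion]
```

Under this sketch, a subject estimated as happy would see a happy face in the synchronized condition and a sad face in the inversed condition, matching the two expression patterns compared in the experiment.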


Metadata
Title
Empathetic robot evaluation through emotion estimation analysis and facial expression synchronization from biological information
Authors
Peeraya Sripian
Muhammad Nur Adilin Mohd Anuardi
Yushun Kajihara
Midori Sugaya
Publication date
16-09-2021
Publisher
Springer Japan
Published in
Artificial Life and Robotics / Issue 4/2021
Print ISSN: 1433-5298
Electronic ISSN: 1614-7456
DOI
https://doi.org/10.1007/s10015-021-00696-w
