2018 | OriginalPaper | Chapter

10. Emotion Detection and Regulation from Personal Assistant Robot in Smart Environment

Authors: José Carlos Castillo, Álvaro Castro-González, Fernándo Alonso-Martín, Antonio Fernández-Caballero, Miguel Ángel Salichs

Published in: Personal Assistants: Emerging Computational Technologies

Publisher: Springer International Publishing

Abstract

This paper introduces a proposal for integrating personal assistant robots with social capabilities into smart environments. The personal robot will be a fundamental element for the detection and healthy regulation of the affect of the environment's inhabitants. The main features of the proposed personal assistant robot are fully described, together with the multi-modal emotion detection and emotion regulation modules. Machine learning techniques are employed for emotion recognition from voice and images, and the outputs of both modalities are merged to obtain the final detected emotion.
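
The abstract states that voice-based and image-based emotion estimates are merged into a single detected emotion, but the fusion rule itself is not given here. The sketch below is only a hypothetical illustration of decision-level fusion, assuming each modality produces a probability distribution over a small set of emotion labels; the label set, weights, and function names are illustrative assumptions, not the chapter's actual implementation.

```python
# Hypothetical sketch of decision-level fusion of voice and face emotion
# classifiers. The emotion labels, weights, and classifier outputs are
# illustrative assumptions, not the method described in the chapter.

EMOTIONS = ["happiness", "sadness", "anger", "neutral"]


def fuse_emotions(voice_probs: dict, face_probs: dict,
                  voice_weight: float = 0.4, face_weight: float = 0.6) -> str:
    """Combine per-modality probability distributions into one label.

    Each argument maps an emotion label to a probability in [0, 1].
    A simple weighted average is used here; the chapter's actual
    fusion strategy may differ.
    """
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = (voice_weight * voice_probs.get(emotion, 0.0)
                          + face_weight * face_probs.get(emotion, 0.0))
    # Return the label with the highest fused score.
    return max(fused, key=fused.get)


if __name__ == "__main__":
    voice = {"happiness": 0.2, "sadness": 0.1, "anger": 0.6, "neutral": 0.1}
    face = {"happiness": 0.5, "sadness": 0.1, "anger": 0.2, "neutral": 0.2}
    print(fuse_emotions(voice, face))  # prints the fused emotion label
```

A weighted average over per-modality scores is one of the simplest fusion strategies; majority voting or feature-level fusion would fit the same interface.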

Metadata
Title
Emotion Detection and Regulation from Personal Assistant Robot in Smart Environment
Authors
José Carlos Castillo
Álvaro Castro-González
Fernándo Alonso-Martín
Antonio Fernández-Caballero
Miguel Ángel Salichs
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-62530-0_10
