Published in: International Journal of Social Robotics 5/2017

14-09-2017 | Survey

Automatic Affect Perception Based on Body Gait and Posture: A Survey

Authors: Benjamin Stephens-Fripp, Fazel Naghdy, David Stirling, Golshah Naghdy

Abstract

There has been a growing interest in machine-based recognition of emotions from body gait, both alone and in combination with other modalities. To highlight the major trends and the state of the art in this area, the literature on machine-based human emotion perception through gait and posture is explored. Initially, the effectiveness of human intellect and intuition in perceiving emotions across a range of cultures is examined. Subsequently, major studies in machine-based affect recognition are reviewed and their performance is compared. The survey concludes by critically analysing some of the issues raised in affect recognition using gait and posture, and by identifying gaps in the current understanding of this area.
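To make the kind of system the survey covers concrete, here is a minimal sketch of gait-based affect classification: hand-picked gait features are compared against per-emotion centroids. The feature set and all numeric values are invented for illustration and are not taken from any of the studies reviewed.

```python
import math

# Hypothetical per-sample gait features: (walking speed in m/s,
# stride length in m, head inclination in degrees, positive = raised).
# Labels and values are illustrative only.
TRAIN = {
    "happy": [(1.4, 0.80, 5.0), (1.5, 0.85, 8.0)],
    "sad":   [(0.9, 0.55, -15.0), (0.8, 0.50, -20.0)],
    "angry": [(1.6, 0.75, -5.0), (1.7, 0.78, -2.0)],
}

def centroid(samples):
    """Mean feature vector of a list of feature tuples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n
                 for i in range(len(samples[0])))

def classify(sample, train=TRAIN):
    """Return the affect label whose centroid is nearest (Euclidean)."""
    cents = {label: centroid(ss) for label, ss in train.items()}
    return min(cents, key=lambda lab: math.dist(sample, cents[lab]))

# A slow, slumped gait is classified as "sad" under these toy centroids.
print(classify((0.85, 0.52, -18.0)))  # → sad
```

Real systems in the literature replace the hand-set exemplars with features extracted from motion capture, video, or depth sensors, and the nearest-centroid rule with trained classifiers, but the pipeline shape (features → model → affect label) is the same.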


84.
go back to reference Calvo RA, Mello SD (2010) Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans Affect Comput 1:18–37CrossRef Calvo RA, Mello SD (2010) Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans Affect Comput 1:18–37CrossRef
85.
go back to reference Russell JA (2003) Core affect and the psychological construction of emotion. Psychol Rev 110:145–72CrossRef Russell JA (2003) Core affect and the psychological construction of emotion. Psychol Rev 110:145–72CrossRef
86.
go back to reference Lewis M, Cañamero L (2013) Are discrete emotions useful in human-robot interaction? Feedback from motion capture analysis. In: Proceedings—2013 Humaine association conference on affective computing and intelligent interaction, ACII 2013, pp 97–102 Lewis M, Cañamero L (2013) Are discrete emotions useful in human-robot interaction? Feedback from motion capture analysis. In: Proceedings—2013 Humaine association conference on affective computing and intelligent interaction, ACII 2013, pp 97–102
87.
go back to reference Matthias Rehm AK, Segato N (2015) Perception of affective body movements in HRI across age groups: comparison between results from Denmark and Japan, pp 25–32 Matthias Rehm AK, Segato N (2015) Perception of affective body movements in HRI across age groups: comparison between results from Denmark and Japan, pp 25–32
88.
go back to reference Lisin DA, Mattar MA, Blaschko MB, Learned-Miller EG, Benfield MC (2005) Combining local and global image features for object class recognition. In: Computer vision and pattern recognition-workshops, 2005. CVPR workshops. IEEE Computer society conference on 2005, pp 47–47 Lisin DA, Mattar MA, Blaschko MB, Learned-Miller EG, Benfield MC (2005) Combining local and global image features for object class recognition. In: Computer vision and pattern recognition-workshops, 2005. CVPR workshops. IEEE Computer society conference on 2005, pp 47–47
89.
go back to reference Wang L, Zhou H, Low SC, Leckie C (2009) Action recognition via multi-feature fusion and Gaussian process classification. In: 2009 Workshop on applications of computer vision, WACV 2009. Snowbird, UT Wang L, Zhou H, Low SC, Leckie C (2009) Action recognition via multi-feature fusion and Gaussian process classification. In: 2009 Workshop on applications of computer vision, WACV 2009. Snowbird, UT
90.
go back to reference Yu H, Liu H (2015) Combining appearance and geometric features for facial expression recognition. In: 6th International conference on graphic and image processing, ICGIP 2014 Yu H, Liu H (2015) Combining appearance and geometric features for facial expression recognition. In: 6th International conference on graphic and image processing, ICGIP 2014
Metadata
Title: Automatic Affect Perception Based on Body Gait and Posture: A Survey
Authors: Benjamin Stephens-Fripp, Fazel Naghdy, David Stirling, Golshah Naghdy
Publication date: 14-09-2017
Publisher: Springer Netherlands
Published in: International Journal of Social Robotics, Issue 5/2017
Print ISSN: 1875-4791
Electronic ISSN: 1875-4805
DOI: https://doi.org/10.1007/s12369-017-0427-6
