Published in: Journal on Multimodal User Interfaces 2/2020

14.03.2020 | Original Paper

Interactive sonification strategies for the motion and emotion of dance performances

By: Steven Landry, Myounghoon Jeon

Abstract

Sonification has the potential to communicate a variety of data types to listeners, including not only cognitive information but also emotions and aesthetics. The goal of our dancer sonification project is to "sonify emotions as well as motions" of a dance performance via musical sonification. To this end, we developed and evaluated sonification strategies that add a layer of emotional mappings to data sonification. Experiment 1 developed and evaluated four musical sonification strategies (i.e., sin-ification, MIDI-fication, melody module, and melody and arrangement module) to assess their emotional effects. Videos were recorded of a professional dancer interacting with each of the four strategies. Forty-eight participants rated musicality, emotional expressivity, and sound-motion/emotion compatibility via an online survey. Results suggest that adding more musical mappings led to higher ratings on each dimension for dance-type gestures. Experiment 2 used the musical sonification framework to develop four sonification scenarios, each aiming to communicate a target emotion (happy, sad, angry, or tender). Thirty participants compared four interactive sonification scenarios with four pre-composed dance choreographies featuring the same musical and gestural palettes. Both forced-choice and multi-dimensional emotional evaluations were collected, as well as motion/emotion compatibility ratings. Results show that presenting music and dance together led to higher accuracy scores for most target emotions than music-only or dance-only conditions. These findings can contribute to movement sonification, algorithmic music composition, and affective computing in general by describing strategies for conveying emotion through sound.
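The MIDI-fication idea mentioned in the abstract, mapping continuous motion features onto discrete musical events, can be sketched roughly as follows. This is a minimal illustration under assumed names and parameter ranges (the `motion_to_midi` function, a C-major scale, and a 0–2 m/s hand-speed range are all hypothetical choices), not the authors' actual implementation:

```python
# Hypothetical parameter-mapping sketch: a dancer's hand speed is quantized
# onto notes of a C-major scale, with faster movement producing higher pitch
# and louder MIDI velocity. Scale, ranges, and names are assumptions.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees, in semitones above the root

def motion_to_midi(speed, min_speed=0.0, max_speed=2.0, low_note=48, octaves=3):
    """Map a motion speed (m/s) to a (MIDI note, MIDI velocity) pair."""
    # Normalize speed into [0, 1], clamping out-of-range sensor values.
    t = max(0.0, min(1.0, (speed - min_speed) / (max_speed - min_speed)))
    degrees = len(C_MAJOR) * octaves
    idx = min(int(t * degrees), degrees - 1)     # which scale step to play
    octave, degree = divmod(idx, len(C_MAJOR))
    note = low_note + 12 * octave + C_MAJOR[degree]
    velocity = int(40 + t * 87)                  # MIDI velocity in 40..127
    return note, velocity

# A slow gesture lands near the bottom of the pitch range, a fast one near the top.
print(motion_to_midi(0.1))  # → (50, 44)
print(motion_to_midi(1.9))  # → (81, 122)
```

Constraining pitches to a diatonic scale, rather than mapping speed linearly to frequency as in sin-ification, is one simple way such a mapping can sound more "musical" while still tracking the underlying motion data.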

Metadata
Title
Interactive sonification strategies for the motion and emotion of dance performances
Authors
Steven Landry
Myounghoon Jeon
Publication date
14.03.2020
Publisher
Springer International Publishing
Published in
Journal on Multimodal User Interfaces / Issue 2/2020
Print ISSN: 1783-7677
Electronic ISSN: 1783-8738
DOI
https://doi.org/10.1007/s12193-020-00321-3
