
2018 | Original Paper | Book Chapter

38. Methods for Studying Music-Related Body Motion

Author: Alexander Refsum Jensenius

Published in: Springer Handbook of Systematic Musicology

Publisher: Springer Berlin Heidelberg


Abstract

This chapter presents an overview of some methodological approaches and technologies that can be used in the study of music-related body motion. The aim is not to cover all possible approaches, but rather to highlight some of the ones that are most relevant from a musicological point of view. This includes methods for video-based and sensor-based motion analysis, both qualitative and quantitative. It also includes discussions of the strengths and weaknesses of the different methods, and reflections on how they can be combined with other types of data, such as physiological or neurological data, symbolic notation, sound recordings, and contextual data.
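One of the simplest quantitative video-based techniques alluded to in the abstract is frame differencing: subtracting consecutive video frames to find the pixels that changed, which yields a binary motion image and, summed over the frame, a crude measure of overall quantity of motion. The sketch below is illustrative only and is not code from the chapter; the function names and threshold value are my own assumptions.

```python
import numpy as np

def motion_image(prev_frame: np.ndarray, curr_frame: np.ndarray,
                 threshold: int = 10) -> np.ndarray:
    # Mark pixels whose intensity changed by more than `threshold`
    # between two grayscale frames as "in motion" (1); others are 0.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

def quantity_of_motion(motion_img: np.ndarray) -> float:
    # Fraction of pixels in motion: a simple global activity measure.
    return float(motion_img.mean())

# Synthetic example: a 10x10 bright square shifts two pixels to the right.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = np.zeros((64, 64), dtype=np.uint8)
prev[20:30, 20:30] = 255
curr[20:30, 22:32] = 255

m = motion_image(prev, curr)
print(m.sum())                # 40 changed pixels (two vacated + two new columns)
print(quantity_of_motion(m))  # 40/4096 ~= 0.0098
```

Accumulating such motion images over time, with older frames fading out, gives the motion history images surveyed in the chapter's reference list.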


Metadata
Title
Methods for Studying Music-Related Body Motion
Author
Alexander Refsum Jensenius
Copyright year
2018
Publisher
Springer Berlin Heidelberg
DOI
https://doi.org/10.1007/978-3-662-55004-5_38
