Published in: Journal on Multimodal User Interfaces 3/2020

07-07-2020 | Original Paper

Multisensory instrumental dynamics as an emergent paradigm for digital musical creation

A retrospective and prospective of haptic-audio creation with physical models

Authors: James Leonard, Jérôme Villeneuve, Alexandros Kontogeorgakopoulos

Abstract

The nature of human/instrument interaction is a long-standing area of study, drawing interest from fields as diverse as philosophy, cognitive sciences, anthropology, human–computer interaction, and artistic creation. In particular, the interaction between performer and musical instrument provides an enticing framework for studying the instrumental dynamics that allow for embodiment, skill acquisition and virtuosity with (electro-)acoustical instruments, and for questioning how such notions may be transferred into the realm of digital music technologies and virtual instruments. This paper offers a study of concepts and technologies allowing for instrumental dynamics with Digital Musical Instruments, through an analysis of haptic-audio creation centred on (a) theoretical and conceptual frameworks, (b) technological components—namely physical modelling techniques for the design of virtual mechanical systems, and force-feedback technologies allowing mechanical coupling with them—and (c) a corpus of artistic works based on this approach. Through this retrospective, we argue that artistic works created in this field over the last 20 years—and those yet to come—may be of significant importance to the haptics community as new objects that question physicality, tangibility, and creativity from a fresh and rather singular angle. We then discuss the convergence of efforts in this field, the challenges still ahead, and the possible emergence of a new transdisciplinary community focused on multisensory digital art forms.
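The physical modelling techniques mentioned in point (b) are typically mass-interaction models: networks of point masses coupled by damped springs, simulated at audio rate, whose motion both produces sound and exchanges forces with a haptic device (as in CORDIS-ANIMA [14] or mi-gen~ [48]). A minimal illustrative sketch of the idea, not taken from the paper — parameter values, the three-mass topology, and the unit-mass leapfrog scheme are all assumptions chosen for brevity:

```python
# Minimal mass-interaction physical model: three unit masses in a chain
# with fixed ends, coupled by damped springs, "plucked" at the middle
# mass and integrated with a leapfrog (Verlet) scheme at dt = 1.
# k (stiffness) and z (damping) are illustrative values only.

def simulate(n_steps=1000, k=0.1, z=0.001):
    """Return the middle mass's displacement over time (the 'audio' output)."""
    n = 3
    x = [0.0, 0.5, 0.0]        # current displacements: pluck in the middle
    x_prev = list(x)           # previous displacements: zero initial velocity
    out = []
    for _ in range(n_steps):
        # pad the chain with the two fixed boundary points
        chain = [0.0] + x + [0.0]
        chain_prev = [0.0] + x_prev + [0.0]
        forces = [0.0] * n
        for i in range(n):
            # mass i sits at padded index i+1; neighbours at i and i+2
            for j in (i, i + 2):
                dx = chain[j] - x[i]                              # spring stretch
                dv = (chain[j] - chain_prev[j]) - (x[i] - x_prev[i])  # rel. velocity
                forces[i] += k * dx + z * dv                      # damped spring
        # leapfrog update, unit mass, dt = 1: x_new = 2x - x_prev + F
        x_new = [2 * x[i] - x_prev[i] + forces[i] for i in range(n)]
        x_prev, x = x, x_new
        out.append(x[1])       # listen to the middle mass
    return out

sig = simulate()
```

In a haptic-audio setting, one boundary point would be replaced by the measured position of a force-feedback device, and the spring force acting on it would be sent back to the device's motors — the performer is then mechanically coupled to the vibrating virtual object rather than merely controlling it.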


Footnotes
1
This relates to André Leroi-Gourhan’s definition of the “instrument” as a mechanical object used by humans to perform a physical, morphological and functional adaptation between themselves and the environment [49].
 
2
As exemplified by the New Instruments for Musical Expression (NIME) community: https://www.nime.org/.
 
4
From NovInt Technologies: https://hapticshouse.com/.
 
6
Such an analysis in the scope of non-haptic use of physical models for musical composition was undertaken in 2004 by Chris Chafe—see [21].
 
9
One could wonder whether simple vibrotactile audio feedback in the keys of an electric piano interface (see [29]) would yield a greater sense of presence and realism than a full haptic piano mechanism simulation.
 
Literature
1. Allen AS (2014) Ruratae: a physics-based audio engine. Ph.D. thesis, UC San Diego
2. Bailenson JN, Yee N, Brave S, Merget D, Koslow D (2007) Virtual interpersonal touch: expressing and recognizing emotions through haptic devices. Human–Computer Interaction 22(3):325–353
3. Berdahl E (2014) Audio-rate modulation of physical model parameters. In: International Computer Music Conference, ICMC 2014
4. Berdahl E, Kontogeorgakopoulos A (2012) Engraving–Hammering–Casting: exploring the sonic-ergotic medium for live musical performance. In: Proceedings of the International Computer Music Conference, Ljubljana, Slovenia, pp 387–390
5. Berdahl E, Kontogeorgakopoulos A (2013) The FireFader: simple, open-source, and reconfigurable haptic force feedback for musicians. Comput Music J 37(1):23–34
6. Berdahl E, Kontogeorgakopoulos A, Overholt D (2010) HSP v2: haptic signal processing with extensions for physical modeling. In: 5th International Workshop on Haptic and Audio Interaction Design (HAID), Copenhagen, pp 61–62
7. Berdahl E, Pfalz A, Beck SD (2016) Very slack strings: a physical model and its use in the composition Quartet for Strings. In: Proceedings of the conference on New Interfaces for Musical Expression (NIME), pp 9–10
8. Berdahl E, Pfalz A, Blandino M (2016) Hybrid virtual modeling for multisensory interaction design. In: Proceedings of Audio Mostly 2016, pp 215–221
9. Berdahl E, Pfalz A, Blandino M, Beck SD (2018) Force-feedback instruments for the Laptop Orchestra of Louisiana. In: Musical Haptics, pp 171–191. Springer, Cham
10. Berdahl E, Smith III J (2012) An introduction to the Synth-A-Modeler compiler: modular and open-source sound synthesis using physical models. In: Proceedings of the Linux Audio Conference
11. Bilbao S, Ducceschi M, Webb C (2019) Large-scale real-time modular physical modeling sound synthesis. In: Proceedings of the international conference on Digital Audio Effects (DAFx 2019), Birmingham, UK
12. Bilbao SD (2009) Numerical sound synthesis. Wiley, New York
13. Cadoz C (1994) Le geste canal de communication homme/machine: la communication “instrumentale”. Technique et Science Informatiques 13(1):31–61
14. Cadoz C, Luciani A, Florens JL (1993) CORDIS-ANIMA: a modeling and simulation system for sound and image synthesis: the general formalism. Comput Music J 17(1):19–29
15. Cadoz C, Luciani A, Florens JL, Castagné N (2003) ACROE-ICA: artistic creation and computer interactive multisensory simulation force feedback gesture transducers. In: Proceedings of the 2003 conference on New Interfaces for Musical Expression, pp 235–246
16. Cadoz C, Luciani A, Florens JL, Roads C, Chadabe F (1984) Responsive input devices and sound synthesis by simulation of instrumental mechanisms: the CORDIS system. Comput Music J 8(3):60–73
17. Cadoz C, Wanderley MM et al (2001) Gesture: music. In: Wanderley MM, Battier M (eds) Trends in gestural control of music. IRCAM/Centre Pompidou, Paris
18. Castagné N, Cadoz C (2003) 10 criteria for evaluating physical modelling schemes for music creation. In: Proceedings of the 9th international conference on Digital Audio Effects
19. Castagné N, Cadoz C, Florens JL, Luciani A (2004) Haptics in computer music: a paradigm shift. In: Proceedings of EuroHaptics
20. Cavusoglu MC, Tendick F (2000) Multirate simulation for high fidelity haptic interaction with deformable objects in virtual environments. In: Proceedings of ICRA ’00, IEEE International Conference on Robotics and Automation, vol 3, pp 2458–2465. IEEE
21. Chafe C (2004) Case studies of physical models in music composition. In: Proceedings of the 18th International Congress on Acoustics
22. Christensen PJ, Serafin S (2019) Graph based physical models for sound synthesis. In: International conference on Sound and Music Computing
23. Ding S, Gallacher C (2018) The Haply development platform: a modular and open-sourced entry level haptic toolset. In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, paper D309. ACM
24. Emmerson S (2009) Combining the acoustic and the digital: music for instruments and computers or prerecorded sound. In: The Oxford Handbook of Computer Music
25. Fels S, Gadd A, Mulder A (2002) Mapping transparency through metaphor: towards more expressive musical instruments. Organised Sound 7(2):109–126
26. Florens JL (1978) Coupleur gestuel rétroactif pour la commande et le contrôle de sons synthétisés en temps réel. Ph.D. thesis, Institut National Polytechnique de Grenoble
27. Florens JL, Henry C (2002) Real-time bowed string synthesis with force feedback gesture interaction. In: Proceedings of the Forum Acusticum
28. Florens JL, Luciani A, Cadoz C, Castagné N (2004) ERGOS: multi-degrees of freedom and versatile force-feedback panoply. In: EuroHaptics 2004
29. Flückiger M, Grosshauser T, Tröster G (2018) Influence of piano key vibration level on players’ perception and performance in piano playing. Appl Sci 8(12):2697
30. Gillespie B (1994) The virtual piano action: design and implementation. In: International Computer Music Conference, ICMC 1994
31. Gillespie RB, O’Modhrain S (2011) Embodied cognition as a motivating perspective for haptic interaction design: a position paper. In: 2011 IEEE World Haptics Conference (WHC), pp 481–486
32. Giordano M, Wanderley MM (2013) Perceptual and technological issues in the design of vibrotactile-augmented interfaces for music technology and media. In: International Workshop on Haptic and Audio Interaction Design, pp 89–98. Springer
33. Grindlay G (2008) Haptic guidance benefits musical motor learning. In: 2008 Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp 397–404. IEEE
34. Hayes L (2012) Performing articulation and expression through a haptic interface. In: International Computer Music Conference, ICMC 2012, pp 400–403
35. Hayward V, Astley OR, Cruz-Hernandez M, Grant D, Robles-De-La-Torre G (2004) Haptic interfaces and devices. Sens Rev 24(1):16–29
36. Henry C (2004) Physical modeling for Pure Data (PMPD) and real time interaction with an audio synthesis. In: Proceedings of the Sound and Music Computing Conference, SMC
37. Herrera D, Florens JL, Voda A (2013) Identification approach for the analysis of human–haptic interface coupling. In: 11th IFAC International Workshop on Adaptation and Learning in Control and Signal Processing (ALCOSP 2013), pp 187–192
38. Hiller L, Ruiz P (1971) Synthesizing musical sounds by solving the wave equation for vibrating objects: part 1. J Audio Eng Soc 19(6):462–470
39. Howard DM, Rimell S (2004) Real-time gesture-controlled physical modelling music synthesis with tactile feedback. EURASIP J Adv Signal Process 2004(7):830184
40. Iovino F (1998) Modalys: a synthesizer for the composer-luthier-performer. IRCAM internal article
41. Kontogeorgakopoulos A, Cadoz C (2007) CORDIS-ANIMA physical modeling and simulation system analysis. In: 4th Sound and Music Computing Conference 2007, pp 275–282
42. Kontogeorgakopoulos A, Siorros G, Klissouras O (2019) Mechanical entanglement: a collaborative haptic-music performance. In: 16th Sound and Music Computing Conference 2019, pp 20–25
43. Lederman SJ, Klatzky RL (2009) Haptic perception: a tutorial. Attention Perception Psychophys 71(7):1439–1459
44. Leonard J, Cadoz C (2015) Physical modelling concepts for a collection of multisensory virtual musical instruments. In: Proceedings of the international conference on New Interfaces for Musical Expression 2015, pp 150–155
45. Leonard J, Castagné N, Cadoz C, Luciani A (2018) The MSCI platform: a framework for the design and simulation of multisensory virtual musical instruments. In: Musical Haptics, pp 151–169. Springer
46. Leonard J, Florens JL, Cadoz C, Castagné N (2014) Exploring the role of dynamic audio-haptic coupling in musical gestures on simulated instruments. In: International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, pp 469–477. Springer
47. Leonard J, Villeneuve J (2019) Fast audio-haptic prototyping with mass-interaction physics. In: International Workshop on Haptic and Audio Interaction Design (HAID 2019)
48. Leonard J, Villeneuve J (2019) mi-gen~: an efficient and accessible mass-interaction sound synthesis toolbox. In: International conference on Sound and Music Computing
49. Leroi-Gourhan A (2013) Le geste et la parole: technique et langage, vol 1. Albin Michel
50. Luciani A, Florens JL, Couroussé D, Castet J (2009) Ergotic sounds: a new way to improve playability, believability and presence of virtual musical instruments. J New Music Res 38(3):309–323
51. Magnusson T (2010) Designing constraints: composing and performing with digital musical systems. Comput Music J 34(4):62–73
52. Magnusson T (2017) Musical organics: a heterarchical approach to digital organology. J New Music Res 46(3):286–303
53. Malloch J, Birnbaum D, Sinyor E, Wanderley MM (2006) Towards a new conceptual framework for digital musical instruments. In: Proceedings of the 9th international conference on Digital Audio Effects, pp 49–52
54. Marlière S, Marchi F, Florens JL, Luciani A, Chevrier J (2008) An augmented reality nanomanipulator for learning nanophysics: the “NanoLearner” platform. In: International Conference on Cyberworlds 2008, pp 94–101. IEEE
55. Morgan D, Qiao S (2009) Analysis of damped mass-spring systems for sound synthesis. EURASIP J Audio Speech Music Process 2009(1):947823
56. Nichols C (2002) The vBow: a virtual violin bow controller for mapping gesture to synthesis with haptic feedback. Organised Sound 7(2):215–220
57. O’Modhrain S (2011) A framework for the evaluation of digital musical instruments. Comput Music J 35(1):28–42
58. Orlarey Y, Fober D, Letz S (2009) FAUST: an efficient functional approach to DSP programming. In: New Computational Paradigms for Computer Music, pp 65–96
59. O’Modhrain S, Gillespie RB (2018) Once more, with feeling: revisiting the role of touch in performer-instrument interaction. In: Musical Haptics, pp 11–27. Springer, Cham
60. Paine G (2010) Towards a taxonomy of realtime interfaces for electronic music performance. In: Proceedings of the conference on New Interfaces for Musical Expression (NIME), pp 436–439
61. Papetti S, Saitis C (2018) Musical haptics: introduction. In: Musical Haptics, pp 1–7. Springer
62. Pearson M (1996) TAO: a physical modelling system and related issues. Organised Sound 1(1):43–50
63. Rimell S, Howard DM, Tyrrell AM, Kirk R, Hunt A (2002) Cymatic: restoring the physical manifestation of digital sound using haptic interfaces to control a new computer based musical instrument. In: International Computer Music Conference, ICMC 2002
64. Salminen K, Surakka V, Lylykangas J, Raisamo J, Saarinen R, Raisamo R, Rantala J, Evreinov G (2008) Emotional and behavioral responses to haptic stimulation. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp 1555–1562
65. Sheffield E, Berdahl E, Pfalz A (2016) The haptic capstans: rotational force feedback for music using a FireFader derivative device. In: Proceedings of the international conference on New Interfaces for Musical Expression 16, pp 1–2
66. Sinclair S, Florens JL, Wanderley M (2010) A haptic simulator for gestural interaction with the bowed string. In: 10ème Congrès Français d’Acoustique
67. Sinclair S, Wanderley MM (2008) A run-time programmable simulator to enable multi-modal interaction with rigid-body systems. Interact Comput 21(1–2):54–63
68. Smith J, MacLean K (2007) Communicating emotion through a haptic link: design space and methodology. Int J Human–Computer Stud 65(4):376–387
69. Smith JO (1992) Physical modeling using digital waveguides. Comput Music J 16(4):74–91
70. Turchet L, Barthet M (2017) Envisioning smart musical haptic wearables to enhance performers’ creative communication. In: Proceedings of the International Symposium on Computer Music Multidisciplinary Research, pp 538–549
71. Verplank W (2005) Haptic music exercises. In: Proceedings of the 2005 conference on New Interfaces for Musical Expression, pp 256–257. National University of Singapore
72. Verplank W, Gurevich M, Mathews M (2002) The Plank: designing a simple haptic controller. In: Proceedings of the 2002 conference on New Interfaces for Musical Expression, pp 1–4. National University of Singapore
73. Villeneuve J, Leonard J (2019) Mass-interaction physical models for sound and multi-sensory creation: starting anew. In: International conference on Sound and Music Computing
74. Wanderley MM, Depalle P (2004) Gestural control of sound synthesis. Proc IEEE 92(4):632–644
75. Zappi V, McPherson A (2014) Dimensionality and appropriation in digital musical instrument design. In: Proceedings of the conference on New Interfaces for Musical Expression (NIME) 14, pp 455–460
Metadata
Title
Multisensory instrumental dynamics as an emergent paradigm for digital musical creation
A retrospective and prospective of haptic-audio creation with physical models
Authors
James Leonard
Jérôme Villeneuve
Alexandros Kontogeorgakopoulos
Publication date
07-07-2020
Publisher
Springer International Publishing
Published in
Journal on Multimodal User Interfaces / Issue 3/2020
Print ISSN: 1783-7677
Electronic ISSN: 1783-8738
DOI
https://doi.org/10.1007/s12193-020-00334-y
