Piano&Dancer: Interaction Between a Dancer and an Acoustic Instrument

ABSTRACT
Piano&Dancer is an interactive piece for a dancer and an electromechanical acoustic piano. The piece presents the dancer and the piano as two performers on stage whose bodily movements are mutually interdependent, revealing a close relationship between physical and musical gestures. Accordingly, the realisation of the piece has been based on creative processes that merge choreographic and compositional methods. To relate the expressive movement qualities of a dancer to the creation of musical material, the piece employs a variety of techniques, including movement tracking and feature analysis, generative algorithms for creating musical structures, and non-conventional scales and chord transformations that shape the modal characteristics of the music. The publication contextualises Piano&Dancer by relating its creation to concepts of embodiment, interactivity and musical structure, and by discussing opportunities for creative cross-fertilisation between dance choreography and musical composition. It also details the challenges and potentials of integrating a mechanical musical instrument into an interactive setting for a dance performance. Finally, the paper highlights some of the technical and aesthetic principles used to connect expressive qualities of body movements to the creation of musical structures.
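The abstract mentions mapping expressive movement features to musical material via non-conventional scales. As a purely illustrative sketch of this kind of mapping (not the piece's actual method), the following Python fragment derives chords from Messiaen's second mode (the octatonic scale), spreading the chord wider as an assumed normalised movement-energy feature increases; all function and parameter names are hypothetical.

```python
# Hypothetical sketch: mapping a movement feature to chord material drawn
# from a non-conventional scale (Messiaen's second mode, the octatonic scale).
# Names are illustrative and not taken from the actual piece.

OCTATONIC = [0, 1, 3, 4, 6, 7, 9, 10]  # mode 2, semitones above the root

def chord_from_motion(energy, root=60, size=3):
    """Build a chord of `size` notes from the octatonic scale.

    `energy` is an assumed movement feature normalised to 0..1; higher
    energy selects wider steps between chord tones, so calm movement
    yields close clusters and energetic movement yields spread chords.
    """
    step = 1 + int(energy * 3)  # 1..4 scale degrees between chord tones
    degrees = [(i * step) % len(OCTATONIC) for i in range(size)]
    octaves = [(i * step) // len(OCTATONIC) for i in range(size)]
    return [root + OCTATONIC[d] + 12 * o for d, o in zip(degrees, octaves)]

print(chord_from_motion(0.0))  # calm movement: close cluster
print(chord_from_motion(1.0))  # energetic movement: wide-spread chord
```

The returned values are MIDI note numbers (middle C = 60), which could in principle drive an electromechanical piano over a MIDI connection.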