Abstract
The notion of “movement qualities” is central in contemporary dance; it refers to the manner in which a movement is executed. Movement qualities convey information about movement expressiveness, and their use holds strong potential for movement-based interaction, with applications in the arts, entertainment, education, and rehabilitation. The purpose of our research is to design and evaluate interactive reflexive visuals for movement qualities. The theoretical basis for this research is drawn from a collaboration with the members of the international dance company Emio Greco|PC to study their formalization of movement qualities. We designed a pedagogical interactive installation called Double Skin/Double Mind (DS/DM) for the analysis and visualization of movement qualities through physical-model-based interactive renderings.
In this article, we first evaluate dancers’ perception of the visuals as metaphors for movement qualities. This evaluation shows that, depending on the physical model’s parameterization, the visuals can generate dynamic behaviors that the dancers associate with DS/DM movement qualities. We then evaluate dance students’ and professionals’ experience of the interactive visuals in the context of a pedagogical dance workshop and a professional dance training session. The results of these evaluations show that the dancers consider the interactive visuals a reflexive system that encourages them to perform, improves their experience, and contributes to a better understanding of movement qualities. Our findings support research on interactive systems for real-time analysis and visualization of movement qualities, opening new perspectives in movement-based interaction design.