The Dancer in the Eye: Towards a Multi-Layered Computational Framework of Qualities in Movement

ABSTRACT
This paper presents a conceptual framework for the analysis of expressive qualities of movement. Our perspective is to model an observer of a dance performance. The conceptual framework consists of four layers, ranging from the physical signals that sensors capture to the qualities that movement communicates (e.g., in terms of emotions). The framework aims to provide a conceptual background upon which the development of computational systems can build, with particular reference to systems that analyze a vocabulary of expressive movement qualities and translate them to other sensory channels, such as the auditory modality. Such systems enable their users to "listen to a choreography" or to "feel a ballet", in a new kind of cross-modal mediated experience.
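The paper describes the four layers only conceptually, not in code. As a rough illustration of the layered idea, the following sketch, in which all function names, the quality labels, and the thresholds are hypothetical choices rather than anything specified by the framework, maps raw position samples (layer 1) through a low-level physical feature (layer 2) to a movement-quality label (layer 3) and finally to an auditory parameter (layer 4):

```python
import math

# Hypothetical sketch of a four-layer pipeline in the spirit of the framework:
# Layer 1: raw sensor samples (here, 2D positions sampled at a fixed rate)
# Layer 2: low-level physical features (instantaneous speed)
# Layer 3: mid-level movement quality (a coarse "fluid" vs. "impulsive" label)
# Layer 4: cross-modal translation (quality label -> an auditory parameter)

def layer2_speeds(positions, dt=0.1):
    """Low-level feature: speed between consecutive position samples."""
    return [math.dist(p, q) / dt for p, q in zip(positions, positions[1:])]

def layer3_quality(speeds, jerk_threshold=5.0):
    """Mid-level quality: label the movement by how abruptly speed changes."""
    accels = [abs(b - a) for a, b in zip(speeds, speeds[1:])]
    return "impulsive" if max(accels, default=0.0) > jerk_threshold else "fluid"

def layer4_sonify(quality):
    """Cross-modal layer: map a quality label to a sound parameter (pitch in Hz)."""
    return {"fluid": 220.0, "impulsive": 880.0}[quality]

# A trajectory moving at constant speed is labeled "fluid";
# one with a sudden jump is labeled "impulsive".
smooth = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0), (0.3, 0.0)]
print(layer3_quality(layer2_speeds(smooth)))  # fluid
```

In a real system each layer would be far richer (multiple features, learned models, continuous sonification), but the sketch shows the key structural point: each layer consumes only the output of the layer below it.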