DOI: 10.1145/2948910.2948927
research-article

The Dancer in the Eye: Towards a Multi-Layered Computational Framework of Qualities in Movement

Published: 5 July 2016

ABSTRACT

This paper presents a conceptual framework for the analysis of expressive qualities of movement. Our perspective is to model an observer of a dance performance. The conceptual framework is made of four layers, ranging from the physical signals that sensors capture to the qualities that movement communicates (e.g., in terms of emotions). The framework aims to provide a conceptual background that the development of computational systems can build upon, with particular reference to systems analyzing a vocabulary of expressive movement qualities and translating them to other sensory channels, such as the auditory modality. Such systems enable their users to "listen to a choreography" or to "feel a ballet", in a new kind of cross-modal mediated experience.
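The four-layer architecture described in the abstract can be illustrated as a processing pipeline. The sketch below is a hypothetical illustration only: the layer contents (finite-difference features, a jerk-based fluidity measure, a linear pitch mapping) are assumptions chosen for concreteness, not the paper's actual algorithms.

```python
# Illustrative sketch of a four-layer movement-analysis pipeline:
# sensor signal -> low-level features -> movement quality -> sound.
# All heuristics here are hypothetical stand-ins for the framework's layers.

def layer1_capture(positions):
    """Layer 1: the raw physical signal, e.g. 1-D joint positions over time."""
    return list(positions)

def layer2_features(signal, dt=0.01):
    """Layer 2: low-level features (velocity, acceleration) via finite differences."""
    vel = [(b - a) / dt for a, b in zip(signal, signal[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    return vel, acc

def layer3_quality(acc, dt=0.01):
    """Layer 3: a mid-level quality; here 'fluidity' as inverse mean absolute jerk."""
    jerk = [abs(b - a) / dt for a, b in zip(acc, acc[1:])]
    mean_jerk = sum(jerk) / len(jerk)
    return 1.0 / (1.0 + mean_jerk)  # 1.0 for perfectly smooth motion

def layer4_sonify(fluidity):
    """Layer 4: cross-modal translation; map fluidity in (0, 1] to a pitch in Hz."""
    return 220.0 + 440.0 * fluidity

def pipeline(positions):
    signal = layer1_capture(positions)
    _, acc = layer2_features(signal)
    return layer4_sonify(layer3_quality(acc))

# A smooth trajectory yields zero jerk, hence maximal fluidity and a higher pitch
# than a jagged, oscillating one.
smooth = [t * 0.01 for t in range(100)]        # constant-velocity motion
jagged = [(t % 2) * 0.01 for t in range(100)]  # oscillating positions
```

The point of the sketch is the layering itself: each stage consumes only the previous layer's output, so a concrete system could swap in different feature extractors or quality models (e.g., impulsivity instead of fluidity) without touching the capture or sonification ends.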


Published in

MOCO '16: Proceedings of the 3rd International Symposium on Movement and Computing
July 2016, 300 pages
ISBN: 9781450343077
DOI: 10.1145/2948910

      Copyright © 2016 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States

      Qualifiers

      • research-article
      • Research
      • Refereed limited

Acceptance Rates

Overall acceptance rate: 50 of 110 submissions, 45%
