Short paper · DOI: 10.1145/2948910.2948962

A serious games platform for validating sonification of human full-body movement qualities

Published: 05 July 2016

ABSTRACT

In this paper we describe a serious games platform for validating the sonification of human full-body movement qualities. The platform supports the design and development of serious games aimed at validating (i) our techniques for measuring expressive movement qualities and (ii) the mapping strategies that translate such qualities into the auditory domain by means of interactive sonification and active music experience. The platform is part of a more general framework developed in the context of the EU ICT H2020 DANCE "Dancing in the dark" Project (no. 645553), which aims at enabling visually impaired people to perceive nonverbal artistic whole-body experiences.
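
The paper itself presents no code here, but as a rough sketch of the two things the abstract says the games are meant to validate, measuring an expressive movement quality and mapping it onto sound within a game round, the following Python fragment may help. All names in it (MovementFrame, movement_energy, energy_to_sound, score_round) are invented for illustration, and the toy "energy" feature and linear mapping are assumptions; this is not the DANCE platform's actual implementation.

```python
import math
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: every name below is invented for this example
# and is not part of the platform described in the paper.

@dataclass
class MovementFrame:
    """One frame of full-body motion-capture data: per-joint speeds in m/s."""
    joint_speeds: List[float]

def movement_energy(frames: List[MovementFrame]) -> float:
    """A toy 'energy' quality: mean squared joint speed, clipped to [0, 1]."""
    count = sum(len(f.joint_speeds) for f in frames)
    if count == 0:
        return 0.0
    total = sum(v * v for f in frames for v in f.joint_speeds)
    return min(1.0, total / count)

def energy_to_sound(energy: float) -> dict:
    """One possible mapping strategy: energy drives loudness and pitch."""
    return {
        "amplitude": 0.2 + 0.8 * energy,            # still -> quiet, energetic -> loud
        "pitch_hz": 220.0 * math.pow(2.0, energy),  # one octave of range, 220-440 Hz
    }

def score_round(measured: float, player_rating: float) -> float:
    """Serious-game round: the player hears the sonification and rates the
    perceived energy in [0, 1]; the score rewards ratings close to the
    measured value, one way to check that the mapping is legible."""
    return max(0.0, 1.0 - abs(measured - player_rating))

if __name__ == "__main__":
    frames = [MovementFrame(joint_speeds=[0.4, 0.6, 0.5]) for _ in range(30)]
    e = movement_energy(frames)
    print("measured energy:", round(e, 2))
    print("sound parameters:", energy_to_sound(e))
    print("score for a player rating of 0.3:", round(score_round(e, 0.3), 2))
```

A real deployment would replace movement_energy with the project's expressive-quality extractors and energy_to_sound with its interactive sonification engine; the sketch only shows the shape of the measure-map-rate loop that such a validation game closes.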

Published in

MOCO '16: Proceedings of the 3rd International Symposium on Movement and Computing
July 2016, 300 pages
ISBN: 9781450343077
DOI: 10.1145/2948910
Copyright © 2016 ACM

Publisher: Association for Computing Machinery, New York, NY, United States
