
On-body interaction: armed and dangerous

Published: 19 February 2012

ABSTRACT

Recent technological advances in input sensing, as well as ultra-small projectors, have opened up new opportunities for interaction: the use of the body itself as both an input and output platform. Such on-body interfaces offer new interactive possibilities and the promise of access to computation, communication, and information literally in the palm of our hands. The unique context of on-body interaction allows us to take advantage of the extra dimensions of input our bodies naturally afford us. In this paper, we consider how the arms and hands can be used to enhance on-body interactions, which are typically finger-input centric. To explore this opportunity, we developed Armura, a novel interactive on-body system supporting both input and graphical output. Using this platform as a vehicle for exploration, we prototyped many applications and interactions. This helped confirm chief use modalities, identify fruitful interaction approaches, and, in general, better understand how interfaces operate on the body. We highlight the most compelling techniques we uncovered. Further, this paper is the first to consider and prototype how conventional interaction issues, such as cursor control and clutching, apply to the on-body domain. Finally, we bring to light several new and unique interaction techniques.
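The abstract does not detail how Armura realizes clutching, so the following is only a generic illustration of the concept the paper transfers to the on-body domain: with a clutch, the cursor moves by the hand's *relative* motion only while an engage gesture is held, letting the user lift out, reposition, and continue — exactly as a finger on a touchpad. All names and the `(x, y, engaged)` sample format here are hypothetical.

```python
# Illustrative sketch (not Armura's actual implementation): relative
# cursor control with clutching. Each sample is a hypothetical
# (hand_x, hand_y, clutch_engaged) tuple from some hand tracker.

def track_cursor(samples, start=(0.0, 0.0)):
    """Return the cursor path for a stream of hand samples.

    The cursor accumulates hand deltas only while the clutch is
    engaged; disengaging lets the hand move freely ("repositioning")
    without disturbing the cursor.
    """
    cx, cy = start
    prev = None            # last hand position while clutched, else None
    path = [(cx, cy)]
    for hx, hy, engaged in samples:
        if engaged and prev is not None:
            cx += hx - prev[0]
            cy += hy - prev[1]
        prev = (hx, hy) if engaged else None
        path.append((cx, cy))
    return path
```

Note how the hand jumps from (1, 0) to (5, 5) while disengaged without moving the cursor; re-engaging resumes relative control from the new hand position.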


Supplemental Material: p69-harrison.mov (MOV, 14 MB)


Published in: TEI '12: Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction
February 2012, 413 pages
ISBN: 9781450311748
DOI: 10.1145/2148131
Copyright © 2012 ACM
Publisher: Association for Computing Machinery, New York, NY, United States


Overall Acceptance Rate: 393 of 1,367 submissions, 29%
