ABSTRACT
Recent technological advances in input sensing, as well as ultra-small projectors, have opened up new opportunities for interaction: the use of the body itself as both an input and output platform. Such on-body interfaces offer new interactive possibilities and the promise of access to computation, communication, and information literally in the palms of our hands. The unique context of on-body interaction allows us to take advantage of the extra dimensions of input our bodies naturally afford. In this paper, we consider how the arms and hands can be used to enhance on-body interaction, which is typically finger-centric. To explore this opportunity, we developed Armura, a novel interactive on-body system supporting both input and graphical output. Using this platform as a vehicle for exploration, we prototyped many applications and interactions. This helped us confirm chief use modalities, identify fruitful interaction approaches, and, in general, better understand how interfaces operate on the body. We highlight the most compelling techniques we uncovered. Further, this paper is the first to consider and prototype how conventional interaction issues, such as cursor control and clutching, apply to the on-body domain. Finally, we bring to light several new and unique interaction techniques.
Index Terms
- On-body interaction: armed and dangerous