ABSTRACT
Multi-display environments (MDEs) have advanced rapidly in recent years, incorporating multi-touch tabletops, tablets, wall displays, and even position-tracking systems. Designers have proposed a variety of interesting gestures for use in MDEs, some of which involve a user moving their hands, arms, body, or even a device itself. These gestures are often used as part of interactions that move data between the various components of an MDE, a longstanding research problem. But designers, not users, have created most of these gestures, and concerns over implementation issues such as recognition may have influenced their design. We performed a user study to elicit these gestures directly from users, but found a low level of convergence among the gestures produced. This lack of agreement is important, and we discuss its possible causes and the implications it has for designers. To assist designers, we present the most prevalent gestures and some of the underlying conceptual themes behind them. We also analyze how factors such as distance and device type affect the choice of gestures, and discuss how to apply these findings to real-world systems.
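The "low level of convergence" reported above is conventionally quantified in gesture-elicitation studies with Wobbrock et al.'s agreement score: for each referent (target action), identical gesture proposals are grouped, and the squared fractions of each group are summed. A minimal sketch (gesture labels and participant data below are illustrative, not taken from the study):

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent (Wobbrock et al. style):
    group identical gesture proposals, then sum (group_size / total)^2.
    Ranges from 1/n (all n participants disagree) to 1.0 (all agree)."""
    counts = Counter(proposals)
    total = len(proposals)
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical example: 10 participants propose gestures for one
# "move data to wall display" referent.
props = ["flick", "flick", "flick", "drag", "drag",
         "pinch", "tap", "tap", "throw", "swipe"]
print(agreement_score(props))  # → 0.2
```

A score near 0.2, as in this made-up sample, would indicate substantial disagreement; elicitation studies often contrast such per-referent scores to identify which actions users conceptualize most consistently.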