ABSTRACT
We investigate how to seamlessly bridge the gap between users and distant displays for basic interaction tasks such as object selection and manipulation. To this end, we exploit very fast and implicit, yet imprecise, gaze- and head-directed input in combination with ubiquitous smartphones for additional manual touch control. We carefully designed two novel and consistent sets of gaze-supported interaction techniques based on touch-enhanced gaze pointers and local magnification lenses. These conflict-free sets allow users to fluently select and position distant targets. Both sets were evaluated in a user study with 16 participants. Overall, after some training, users were fastest with the touch-enhanced gaze pointer for selecting and positioning an object. The positive feedback for both sets suggests that our proposed gaze- and head-directed interaction techniques support convenient and fluent selection and manipulation of distant targets, although further improvements are necessary for more precise cursor control.
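The core idea of a touch-enhanced gaze pointer can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the coarse cursor position follows gaze (warping on each saccade, in the spirit of MAGIC pointing), while relative touch drags on a smartphone add a small manual correction to compensate for eye-tracker imprecision. The class name, the 40 px saccade threshold, and the 0.5 refinement gain are assumptions chosen for the sketch.

```python
class TouchEnhancedGazePointer:
    """Hypothetical sketch: gaze gives coarse position, touch refines it."""

    def __init__(self, refinement_gain=0.5, saccade_threshold=40.0):
        # Gain < 1 maps large thumb movements to small cursor corrections.
        self.refinement_gain = refinement_gain
        self.saccade_threshold = saccade_threshold  # pixels
        self.gaze = (0.0, 0.0)    # last reported gaze position (pixels)
        self.offset = (0.0, 0.0)  # accumulated manual correction

    def on_gaze_sample(self, x, y):
        # A large gaze jump (saccade) discards the manual correction so the
        # cursor warps to where the user is now looking.
        if self._distance((x, y), self.gaze) > self.saccade_threshold:
            self.offset = (0.0, 0.0)
        self.gaze = (float(x), float(y))

    def on_touch_drag(self, dx, dy):
        # Relative touch input nudges the cursor around the gaze point.
        ox, oy = self.offset
        self.offset = (ox + dx * self.refinement_gain,
                       oy + dy * self.refinement_gain)

    def cursor(self):
        # Final cursor = coarse gaze position + fine manual offset.
        return (self.gaze[0] + self.offset[0],
                self.gaze[1] + self.offset[1])

    @staticmethod
    def _distance(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```

A local magnification lens could be layered on top of the same state by enlarging the region around `cursor()` before hit-testing, trading the pointer's speed for finer selection of small, closely spaced targets.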
Still looking: investigating seamless gaze-supported selection, positioning, and manipulation of distant targets