DOI: 10.1145/2470654.2470695
research-article

Still looking: investigating seamless gaze-supported selection, positioning, and manipulation of distant targets

Published: 27 April 2013

ABSTRACT

We investigate how to seamlessly bridge the gap between users and distant displays for basic interaction tasks, such as object selection and manipulation. For this, we take advantage of fast and implicit, yet imprecise, gaze- and head-directed input in combination with ubiquitous smartphones for additional manual touch control. We have carefully elaborated two novel and consistent sets of gaze-supported interaction techniques based on touch-enhanced gaze pointers and local magnification lenses. These conflict-free sets allow users to fluently select and position distant targets. Both sets were evaluated in a user study with 16 participants. Overall, after some training, users were fastest with a touch-enhanced gaze pointer for selecting and positioning an object. While the positive user feedback for both sets suggests that our proposed gaze- and head-directed interaction techniques are suitable for convenient and fluent selection and manipulation of distant targets, further improvements are necessary for more precise cursor control.
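The abstract describes a touch-enhanced gaze pointer: coarse, fast gaze input places the cursor near the target, and manual touch input on a smartphone refines it. The paper itself gives no code; the following is a minimal illustrative sketch of that general two-phase idea (in the spirit of MAGIC pointing by Zhai et al.), where the class name, warp threshold, and gain value are all assumptions, not the authors' implementation.

```python
# Illustrative sketch: the cursor warps to the current gaze estimate,
# then relative touch drags refine it. Names and constants are made up.

WARP_THRESHOLD = 120.0  # px: re-warp only when gaze moves this far away

class GazeTouchCursor:
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def on_gaze(self, gx, gy):
        # Coarse phase: jump to the gaze point when it is far enough
        # from the cursor, ignoring small fixation jitter.
        if ((gx - self.x) ** 2 + (gy - self.y) ** 2) ** 0.5 > WARP_THRESHOLD:
            self.x, self.y = gx, gy

    def on_touch_drag(self, dx, dy, gain=0.3):
        # Fine phase: small relative touch movements nudge the cursor;
        # a low gain trades speed for precision.
        self.x += dx * gain
        self.y += dy * gain

cursor = GazeTouchCursor()
cursor.on_gaze(800.0, 400.0)      # warp to a distant gaze point
cursor.on_touch_drag(10.0, -4.0)  # refine with a touch drag
print(round(cursor.x, 1), round(cursor.y, 1))  # 803.0 398.8
```

The warp threshold is what keeps the two phases conflict-free: once the user is refining with touch, small gaze jitter around the target no longer moves the cursor.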


Supplemental Material: chi0463-file3.mp4 (mp4, 21.2 MB)


Published in:
CHI '13: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2013, 3550 pages
ISBN: 9781450318990
DOI: 10.1145/2470654
Copyright © 2013 ACM

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates: CHI '13 paper acceptance rate: 392 of 1,963 submissions (20%). Overall acceptance rate: 6,199 of 26,314 submissions (24%).
