DOI: 10.1145/1753326.1753646

Gazemarks: gaze-based visual placeholders to ease attention switching

Published: 10 April 2010

ABSTRACT

Many tasks require attention switching: for example, looking up information on one sheet of paper and then entering it on another. With paper, people use fingers or objects as placeholders, and these simple aids make switching attention between displays simpler and faster. With large or multiple visual displays, many tasks place both attention areas on the screen, where a finger is not a suitable placeholder. One way users cope is to highlight their current focus with the mouse; however, this also has its limitations, in particular in environments where there is no pointing device. Our approach is to utilize the user's gaze position to provide a visual placeholder. The last area where a user fixated on the screen (before moving their attention away) is highlighted; we call this visual reminder a Gazemark. Gazemarks ease orientation and the resumption of the interrupted task when coming back to the display. In this paper we report on a study investigating the effectiveness of Gazemarks, in particular how they ease attention switching. Our results show faster completion times for a resumed simple visual search task when using this technique. The paper analyzes relevant parameters for the implementation of Gazemarks and discusses further application areas for this approach.
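The core mechanism the abstract describes — remember the last on-screen gaze position when attention leaves the display, and highlight it when the gaze returns — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the class name, thresholds, and screen-bounds check are assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class GazemarkTracker:
    """Illustrative sketch of the Gazemark idea: store the last on-screen
    gaze position when the gaze leaves the display, and report it for
    highlighting when the gaze returns. Names and logic are hypothetical,
    not taken from the paper."""
    width: int
    height: int
    mark: Optional[Tuple[float, float]] = None       # stored placeholder
    _last_pos: Optional[Tuple[float, float]] = None  # latest on-screen sample
    _was_on_screen: bool = False

    def _on_screen(self, x: float, y: float) -> bool:
        return 0 <= x < self.width and 0 <= y < self.height

    def update(self, x: float, y: float) -> Optional[Tuple[float, float]]:
        """Feed one gaze sample. Returns a Gazemark position to render
        when the gaze has just returned to this display, else None."""
        on = self._on_screen(x, y)
        show = None
        if on and not self._was_on_screen and self.mark is not None:
            show = self.mark            # gaze came back: highlight placeholder
        if not on and self._was_on_screen and self._last_pos is not None:
            self.mark = self._last_pos  # gaze left: freeze last position
        if on:
            self._last_pos = (x, y)
        self._was_on_screen = on
        return show
```

In a real system the stored position would come from a fixation-detection step over the eye tracker's sample stream rather than from the raw last sample, but the on-screen/off-screen transition logic is the same.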


Published in

CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2010, 2690 pages
ISBN: 9781605589299
DOI: 10.1145/1753326

Copyright © 2010 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher: Association for Computing Machinery, New York, NY, United States

Overall acceptance rate: 6,199 of 26,314 submissions, 24%
