ABSTRACT
Many tasks require switching attention, for example looking up information on one sheet of paper and then entering it on another. With paper, people use fingers or objects as placeholders; these simple aids make switching attention between displays easier and faster. With large or multiple visual displays, however, many tasks place both attention areas on the screen, where a finger is not a suitable placeholder. One way users deal with this is to highlight their current focus with the mouse, but this too has limitations, particularly in environments without a pointing device. Our approach uses the user's gaze position to provide a visual placeholder: the area of the screen the user last fixated before shifting attention away is highlighted. We call this visual reminder a Gazemark. Gazemarks ease orientation and the resumption of the interrupted task when the user returns to the display. In this paper we report on a study investigating the effectiveness of Gazemarks; in particular, we show how they can ease attention switching. Our results show faster completion times for a resumed simple visual search task when this technique is used. The paper analyzes relevant parameters for implementing Gazemarks and discusses further application areas for the approach.
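The core mechanism the abstract describes can be sketched in a few lines: record the most recent fixation while the user attends to a display, freeze it as a mark when attention leaves, and draw it when attention returns. The sketch below is illustrative only; the class and method names (`GazemarkManager`, `on_fixation`, `on_attention_lost`, `on_attention_returned`) and the mark radius are assumptions, not details from the paper.

```python
# Minimal sketch of the Gazemark placeholder logic, under assumed names.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Gazemark:
    x: float              # screen coordinates of the last fixation
    y: float
    radius: float = 40.0  # size of the highlighted area (a tunable parameter)


class GazemarkManager:
    def __init__(self) -> None:
        self._last_fixation: Optional[Tuple[float, float]] = None
        self._mark: Optional[Gazemark] = None

    def on_fixation(self, x: float, y: float) -> None:
        """Called by the eye tracker for each fixation on this display."""
        self._last_fixation = (x, y)
        self._mark = None  # user is attending; no placeholder needed

    def on_attention_lost(self) -> None:
        """Gaze left the display: freeze the last fixation as a Gazemark."""
        if self._last_fixation is not None:
            self._mark = Gazemark(*self._last_fixation)

    def on_attention_returned(self) -> Optional[Gazemark]:
        """On return, the UI draws this mark to ease task resumption."""
        return self._mark


mgr = GazemarkManager()
mgr.on_fixation(310.0, 420.0)
mgr.on_fixation(512.5, 388.0)  # last fixation before the switch
mgr.on_attention_lost()
mark = mgr.on_attention_returned()
print(mark.x, mark.y)  # 512.5 388.0
```

In a real system the two attention events would be driven by the eye tracker reporting gaze entering or leaving the display area, and the mark would be rendered as a highlight and cleared once the user fixates again.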
Index Terms
- Gazemarks: gaze-based visual placeholders to ease attention switching