ABSTRACT
In many applications it is necessary to guide a person's visual attention towards certain points in the environment, for example to highlight attractions in a tourist application for smart glasses, to signal important events to the driver of a car, or to draw a desktop user's attention to an important message in the user interface. The question we address here is: how can we guide visual attention if we cannot do it visually? In the presented approach we use gaze-contingent auditory feedback (sonification) to guide visual attention, and we show that people can use this guidance to speed up visual search tasks significantly.
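The abstract does not specify the sonification mapping, but a common design for gaze-contingent guidance is to map the distance between the current gaze point and the target to a tone's pitch, so the sound "sharpens" as the eyes approach the target. The sketch below is a hypothetical illustration of that idea (the function name, frequency range, and linear mapping are assumptions, not the authors' actual parameters):

```python
import math

def guidance_pitch(gaze, target, f_min=220.0, f_max=880.0, d_max=1000.0):
    """Map the pixel distance between the gaze point and the target
    to a tone frequency in Hz: smaller distance -> higher pitch.

    gaze, target : (x, y) screen coordinates in pixels
    f_min, f_max : frequency range of the feedback tone (assumed values)
    d_max        : distance at which the tone bottoms out at f_min
    """
    dx = target[0] - gaze[0]
    dy = target[1] - gaze[1]
    d = min(math.hypot(dx, dy), d_max)  # clamp so far-away gaze stays audible
    # Linear interpolation: d = 0 -> f_max, d = d_max -> f_min.
    return f_max - (f_max - f_min) * (d / d_max)
```

In a closed-loop setup this function would be called on every eye-tracker sample and the result fed to a real-time oscillator; gaze landing on the target yields the highest pitch (here 880 Hz), and gaze `d_max` pixels away or farther yields the lowest (220 Hz).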
Index Terms
- Guiding visual search tasks using gaze-contingent auditory feedback