ABSTRACT
Our knowledge of the way that the visual system operates in everyday behavior has, until recently, been very limited. This information is critical not only for understanding visual function, but also for understanding the consequences of various kinds of visual impairment, and for the development of interfaces between human and artificial systems. The development of eye trackers that can be mounted on the head now allows monitoring of gaze without restricting the observer's movements. Observations of natural behavior have demonstrated the highly task-specific and directed nature of fixation patterns, and reveal considerable regularity across observers. Eye, head, and hand coordination also reveals much greater flexibility and task-specificity than previously supposed. Experimental examination of the issues raised by observations of natural behavior requires the development of complex virtual environments that can be manipulated by the experimenter at critical points during task performance. Experiments in which we monitored gaze in a simulated driving environment demonstrate that visibility of task-relevant information depends critically on active search initiated by the observer according to an internally generated schedule, and this schedule depends on learnt regularities in the environment. In another virtual environment, where observers copied toy models, we showed that regularities in the spatial structure are used by observers to control eye movement targeting. Other experiments in a virtual environment with haptic feedback show that even simple visual properties like size are not continuously available or processed automatically by the visual system, but are dynamically acquired and discarded according to the momentary task demands.
Vision in natural and virtual environments