
  • Letter
  • Published:

Optimal eye movement strategies in visual search

Abstract

To perform visual search, humans, like many mammals, encode a large field of view with retinas having variable spatial resolution, and then use high-speed eye movements to direct the highest-resolution region, the fovea, towards potential target locations [1,2]. Good search performance is essential for survival, and hence mammals may have evolved efficient strategies for selecting fixation locations. Here we address two questions: what are the optimal eye movement strategies for a foveated visual system faced with the problem of finding a target in a cluttered environment, and do humans employ optimal eye movement strategies during search? We derive the ideal Bayesian observer [3–6] for search tasks in which a target is embedded at an unknown location within a random background that has the spectral characteristics of natural scenes [7]. Our ideal searcher uses precise knowledge about the statistics of the scenes in which the target is embedded, and about its own visual system, to make eye movements that gain the most information about target location. We find that humans achieve nearly optimal search performance, even though humans integrate information poorly across fixations [8–10]. Analysis of the ideal searcher reveals that there is little benefit from perfect integration across fixations; much more important is efficient processing of information on each fixation. Apparently, evolution has exploited this fact to achieve efficient eye movement strategies with minimal neural resources devoted to memory.
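The core computation described in the abstract, a Bayesian posterior over candidate target locations that is updated after each fixation using visibility that falls off with retinal eccentricity, can be illustrated with a minimal sketch. Everything here (the one-dimensional grid, the exponential d′ falloff, the unit-variance Gaussian noise model, the fixation-selection rule) is an illustrative assumption and not the paper's implementation; in particular, the paper's ideal searcher chooses each fixation to maximize the expected gain in information about target location, whereas this sketch uses a simpler maximum-a-posteriori fixation rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): 25 candidate target
# locations along a line, with the target hidden at one of them.
locations = np.arange(25)
true_target = 17

def d_prime(fixation, locs):
    """Detectability (d') of a target at each location while fixating
    `fixation`; falls off with eccentricity (illustrative shape only)."""
    return 3.0 * np.exp(-np.abs(fixation - locs) / 5.0)

# Start with a uniform prior over target locations.
posterior = np.full(len(locations), 1.0 / len(locations))

for step in range(10):
    # Fixate the currently most probable location (a simple MAP rule;
    # the paper's ideal searcher instead maximizes expected information).
    fixation = int(np.argmax(posterior))
    dp = d_prime(fixation, locations)
    # One noisy response per location: mean d' where the target is,
    # mean 0 elsewhere, unit-variance Gaussian noise everywhere.
    obs = dp * (locations == true_target) + rng.standard_normal(len(locations))
    # Bayesian update: log-likelihood of the responses under each
    # hypothesis "target is at location j".
    log_post = np.log(posterior)
    for j in locations:
        mean = np.where(locations == j, dp, 0.0)
        log_post[j] += -0.5 * np.sum((obs - mean) ** 2)
    log_post -= log_post.max()          # renormalize in log space
    posterior = np.exp(log_post)
    posterior /= posterior.sum()

print(int(np.argmax(posterior)))  # estimated target location after 10 fixations
```

Because the noise is regenerated on every fixation, this corresponds to the dynamic (temporally uncorrelated) noise case treated in the Supplementary Equations; the static-noise case requires correlated likelihoods across fixations.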


Figure 1: Measurement of the visibility maps.
Figure 2: Representation of the visibility maps.
Figure 3: Ideal searcher.
Figure 4: Human versus ideal performance.


References

  1. Carpenter, R. H. S. (ed.) Eye Movements (Macmillan, London, 1991)

  2. Liversedge, S. P. & Findlay, J. M. Saccadic eye movements and cognition. Trends Cogn. Sci. 4, 6–14 (2000)

  3. Green, D. M. & Swets, J. A. Signal Detection Theory and Psychophysics (Wiley, New York, 1966)

  4. Burgess, A. E. & Ghandeharian, H. Visual signal detection. II. Effect of signal-location identification. J. Opt. Soc. Am. A 1, 906–910 (1984)

  5. Geisler, W. S. & Diehl, R. L. A Bayesian approach to the evolution of perceptual and cognitive systems. Cogn. Sci. 27, 379–402 (2003)

  6. Kersten, D., Mamassian, P. & Yuille, A. L. Object perception as Bayesian inference. Annu. Rev. Psychol. 55, 271–304 (2004)

  7. Field, D. J. Relations between the statistics of natural images and the response properties of cortical cells. J. Opt. Soc. Am. A 4, 2379–2394 (1987)

  8. Irwin, D. E. Information integration across saccadic eye movements. Cognit. Psychol. 23, 420–458 (1991)

  9. Hayhoe, M. M., Bensinger, D. G. & Ballard, D. H. Task constraints in visual working memory. Vision Res. 38, 125–137 (1998)

  10. Rensink, R. A. Change detection. Annu. Rev. Psychol. 53, 245–277 (2002)

  11. Palmer, J., Verghese, P. & Pavel, M. The psychophysics of visual search. Vision Res. 40, 1227–1268 (2000)

  12. Wolfe, J. M. in Attention (ed. Pashler, H.) 13–74 (Psychology Press, Hove, East Sussex, 1998)

  13. Schall, J. D. in The Visual Neurosciences (eds Chalupa, L. M. & Werner, J. S.) 1369–1390 (MIT Press, Cambridge, Massachusetts, 2004)

  14. Blake, A. & Yuille, A. L. (eds) Active Vision (MIT Press, Cambridge, Massachusetts, 1992)

  15. Burgess, A. E., Wagner, R. F., Jennings, R. J. & Barlow, H. B. Efficiency of human visual signal discrimination. Science 214, 93–94 (1981)

  16. Pelli, D. G. & Farell, B. Why use noise? J. Opt. Soc. Am. A 16, 647–653 (1999)

  17. Lu, Z.-L. & Dosher, B. A. Characterizing human perceptual inefficiencies with equivalent internal noise. J. Opt. Soc. Am. A 16, 764–778 (1999)

  18. Geman, D. & Jedynak, B. An active testing model for tracking roads in satellite images. IEEE Trans. Pattern Anal. Mach. Intell. 18, 1–14 (1996)

  19. Rajashekar, U., Cormack, L. K. & Bovik, A. C. in Eye Tracking Research & Applications (ed. Duchowski, A. T.) 119–123 (ACM SIGGRAPH, New Orleans, 2002)

  20. Findlay, J. M. Global processing for saccadic eye movements. Vision Res. 22, 1033–1045 (1982)

  21. Zelinsky, G. J., Rao, R. P., Hayhoe, M. M. & Ballard, D. H. Eye movements reveal the spatio-temporal dynamics of visual search. Psychol. Sci. 8, 448–453 (1997)

  22. Legge, G. E., Hooven, T. A., Klitz, T. S., Mansfield, J. G. & Tjan, B. S. Mr. Chips 2002: New insights from an ideal observer model of reading. Vision Res. 42, 2219–2234 (2002)

  23. Eckstein, M. P., Beutter, B. R. & Stone, L. S. Quantifying the performance limits of human saccadic targeting during visual search. Perception 30, 1389–1401 (2001)

  24. Eckstein, M. P., Thomas, J. P., Palmer, J. & Shimozaki, S. S. A signal detection model predicts the effects of set size on visual search accuracy for feature, conjunction, triple conjunction, and disjunction displays. Percept. Psychophys. 62, 425–451 (2000)

  25. Itti, L. & Koch, C. A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Res. 40, 11–46 (2000)

  26. Rao, R. P. N., Zelinsky, G. J., Hayhoe, M. M. & Ballard, D. H. Eye movements in iconic visual search. Vision Res. 42, 1447–1463 (2002)


Acknowledgements

We thank R.F. Murray for helpful discussions, and J. Perry, L. Stern and C. Creeger for technical assistance. This work was supported by the National Eye Institute, NIH.

Author information

Corresponding author

Correspondence to Wilson S. Geisler.

Ethics declarations

Competing interests

The authors declare that they have no competing financial interests.

Supplementary information

Supplementary Equations

This document contains a derivation of the ideal searcher for the case of dynamic (temporally uncorrelated) external and internal noise, and describes the ideal searcher for the case of static (temporally correlated) external noise and dynamic (temporally uncorrelated) internal noise. (PDF 181 kb)


About this article

Cite this article

Najemnik, J., Geisler, W. Optimal eye movement strategies in visual search. Nature 434, 387–391 (2005). https://doi.org/10.1038/nature03390

