ABSTRACT
We present a multi-camera, vision-based eye tracking method that robustly locates and tracks users' eyes as they interact with an application. We propose enhancements to existing vision-based eye-tracking approaches, including (a) the use of multiple cameras to estimate head pose and to extend sensor coverage, and (b) the use of probabilistic measures incorporating Fisher's linear discriminant to track the eyes robustly, in real time, under varying lighting conditions. We present experiments and quantitative results that demonstrate the robustness of our eye tracker in two application prototypes.
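The abstract's use of Fisher's linear discriminant can be illustrated with a minimal sketch: given labeled feature vectors for eye and non-eye image patches, Fisher's criterion yields the projection direction that maximizes between-class separation relative to within-class scatter, and candidate patches can then be scored along that direction. This is a generic illustration of the technique, not the paper's implementation; the function names, the regularization constant, and the simple distance-based likelihood score are all assumptions for the sake of the example.

```python
import numpy as np

def fisher_discriminant(pos, neg):
    """Fisher's linear discriminant direction separating two classes
    (e.g. eye vs. non-eye patches), each given as (n_samples, n_features)."""
    mu_p, mu_n = pos.mean(axis=0), neg.mean(axis=0)
    # Within-class scatter: sum of centered outer products over both classes.
    sw = (pos - mu_p).T @ (pos - mu_p) + (neg - mu_n).T @ (neg - mu_n)
    # Small ridge term keeps the scatter matrix invertible when samples are few.
    sw += 1e-6 * np.eye(sw.shape[0])
    # Optimal direction is proportional to S_w^{-1} (mu_pos - mu_neg).
    w = np.linalg.solve(sw, mu_p - mu_n)
    return w / np.linalg.norm(w)

def eye_score(patch, w, mu_p, mu_n):
    """Signed score along w: positive means the patch projects closer to
    the eye-class mean than to the non-eye-class mean."""
    z = patch @ w
    return abs(z - mu_n @ w) - abs(z - mu_p @ w)
```

In a tracking loop, such a score could feed the probabilistic measure the abstract alludes to, e.g. by weighting candidate eye locations before temporal filtering.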