It's in your eyes: towards context-awareness and mobile HCI using wearable EOG goggles

ABSTRACT
In this work we describe the design, implementation and evaluation of a novel eye tracker for context-awareness and mobile HCI applications. In contrast to common systems that use video cameras, this compact device relies on Electrooculography (EOG). It consists of goggles with dry electrodes integrated into the frame and a small pocket-worn component with a DSP for real-time EOG signal processing. The device is intended for wearable and standalone use: it can store data locally for long-term recordings or stream processed EOG signals to a remote device over Bluetooth. We describe how eye gestures can be efficiently recognised from EOG signals for HCI purposes. In an experiment with 11 subjects playing a computer game, we show that 8 eye gestures of varying complexity can be continuously recognised with performance equal to that of a state-of-the-art video-based system. Physical activity leads to artefacts in the EOG signal. We describe how these artefacts can be removed using an adaptive filtering scheme and characterise this approach on a 5-subject dataset. Beyond explicit eye movements for HCI, we discuss how the analysis of unconscious eye movements may eventually allow us to infer information about user activity and context that is not available from current sensing modalities.
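The continuous eye-gesture recognition described above can be pictured, in highly simplified form, as two steps: classify large steps in the horizontal and vertical EOG channels as saccade direction symbols (L/R/U/D), then match the resulting symbol string against gesture templates. The following sketch illustrates only this idea; the sampling rate, thresholds, function names and gesture templates are illustrative assumptions, not the algorithm or parameters of the actual device.

```python
import numpy as np

def saccade_symbols(h, v, fs=128, thresh=80.0):
    """Classify large signal steps in two-channel EOG as L/R/U/D saccades.

    h, v   -- horizontal and vertical EOG channels (same length, in uV)
    fs     -- sampling rate in Hz (assumed value)
    thresh -- slope threshold separating saccades from drift (assumed value)
    """
    dh = np.diff(h) * fs  # approximate slope of each channel in uV/s
    dv = np.diff(v) * fs
    symbols = []
    i = 0
    while i < len(dh):
        if abs(dh[i]) > thresh or abs(dv[i]) > thresh:
            # dominant axis decides whether this is a horizontal or
            # vertical saccade; the sign decides its direction
            if abs(dh[i]) >= abs(dv[i]):
                symbols.append('R' if dh[i] > 0 else 'L')
            else:
                symbols.append('U' if dv[i] > 0 else 'D')
            i += int(0.1 * fs)  # skip past the remainder of this saccade
        else:
            i += 1
    return ''.join(symbols)

# Hypothetical gesture templates mapped to commands
GESTURES = {'RLRL': 'wake', 'RDLU': 'square', 'UD': 'confirm'}

def recognise(h, v):
    """Return the command for the detected gesture, or None."""
    return GESTURES.get(saccade_symbols(h, v))
```

In a continuous setting, `saccade_symbols` would run over a sliding window of the streamed EOG signal rather than a complete recording, but the symbol-matching principle is the same.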