ABSTRACT
Fixations are widely analysed in human vision, gaze-based interaction, and experimental psychology research. However, robust fixation detection in mobile settings is profoundly challenging given the prevalence of user and gaze target motion. These movements mimic a shift in gaze estimates in the frame of reference defined by the eye tracker's scene camera. To address this challenge, we present a novel fixation detection method for head-mounted eye trackers. Our method exploits the fact that, independent of user or gaze target motion, the appearance of the gaze target remains largely unchanged during a fixation. It extracts image information from small regions around the current gaze position and analyses the appearance similarity of these gaze patches across video frames to detect fixations. We evaluate our method using fine-grained fixation annotations on a five-participant indoor dataset (MPIIEgoFixation) with more than 2,300 fixations in total. Our method outperforms commonly used velocity- and dispersion-based algorithms, highlighting the significant potential of analysing scene image information for eye movement detection.
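The core idea of the abstract, comparing the appearance of small image patches around the gaze estimate across successive scene-video frames, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it uses plain normalised cross-correlation as the appearance-similarity measure, and the patch size, similarity threshold, and minimum fixation length are illustrative values.

```python
import numpy as np

def extract_patch(frame, gaze_xy, size=32):
    """Crop a square patch centred on the gaze estimate, clamped to the frame."""
    h, w = frame.shape[:2]
    half = size // 2
    x = int(np.clip(gaze_xy[0], half, w - half))
    y = int(np.clip(gaze_xy[1], half, h - half))
    return frame[y - half:y + half, x - half:x + half]

def patch_similarity(p1, p2):
    """Normalised cross-correlation between two equally sized patches."""
    a = p1.astype(float).ravel()
    a -= a.mean()
    b = p2.astype(float).ravel()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def detect_fixations(frames, gaze, sim_threshold=0.8, min_len=3):
    """Label runs of frames whose gaze patches look alike as fixations.

    frames: list of HxW greyscale arrays; gaze: list of (x, y) estimates.
    Returns a list of (start_frame, end_frame) index pairs, inclusive.
    """
    # Similarity between the gaze patches of each pair of consecutive frames.
    sims = [patch_similarity(extract_patch(frames[i], gaze[i]),
                             extract_patch(frames[i + 1], gaze[i + 1]))
            for i in range(len(frames) - 1)]
    fixations, start = [], None
    for i, s in enumerate(sims):
        if s >= sim_threshold:
            start = i if start is None else start
        else:
            # sims[i] links frames i and i+1, so the run ends at frame i.
            if start is not None and i - start + 1 >= min_len:
                fixations.append((start, i))
            start = None
    if start is not None and len(frames) - start >= min_len:
        fixations.append((start, len(frames) - 1))
    return fixations
```

Because the comparison is between image patches rather than raw gaze coordinates, a fixation on a moving target (or during head motion) still registers as a run of high-similarity patches, which is exactly the property the abstract contrasts with velocity- and dispersion-based detectors.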
Index Terms
- Fixation detection for head-mounted eye tracking based on visual similarity of gaze targets