
Fixation detection for head-mounted eye tracking based on visual similarity of gaze targets

Published: 14 June 2018
DOI: 10.1145/3204493.3204538

ABSTRACT

Fixations are widely analysed in human vision, gaze-based interaction, and experimental psychology research. However, robust fixation detection in mobile settings is profoundly challenging given the prevalence of user and gaze target motion. These movements feign a shift in gaze estimates in the frame of reference defined by the eye tracker's scene camera. To address this challenge, we present a novel fixation detection method for head-mounted eye trackers. Our method exploits that, independent of user or gaze target motion, target appearance remains about the same during a fixation. It extracts image information from small regions around the current gaze position and analyses the appearance similarity of these gaze patches across video frames to detect fixations. We evaluate our method using fine-grained fixation annotations on a five-participant indoor dataset (MPIIEgoFixation) with more than 2,300 fixations in total. Our method outperforms commonly used velocity- and dispersion-based algorithms, which highlights its significant potential to analyse scene image information for eye movement detection.
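To make the core idea concrete, the following Python sketch illustrates appearance-based fixation detection in the spirit the abstract describes: it crops a small "gaze patch" around each per-frame gaze estimate in the scene video, compares consecutive patches with normalized cross-correlation, and labels runs of similar-looking patches as fixations. This is a minimal illustration, not the authors' implementation; the patch radius, the choice of similarity measure, and all thresholds are hypothetical parameters introduced here for demonstration.

```python
# Minimal sketch of appearance-based fixation detection, assuming scene-camera
# frames as numpy arrays and per-frame 2D gaze positions in pixel coordinates.
# Patch size, the similarity measure (normalized cross-correlation), and all
# thresholds are illustrative assumptions, not the parameters from the paper.
import numpy as np

PATCH_RADIUS = 32     # half-width of the square gaze patch (assumed)
SIM_THRESHOLD = 0.8   # similarity above this counts as "same target" (assumed)


def gaze_patch(frame, gaze_xy, r=PATCH_RADIUS):
    """Crop a (2r x 2r) grayscale patch centred on the gaze estimate."""
    h, w = frame.shape[:2]
    x = int(np.clip(gaze_xy[0], r, w - r))
    y = int(np.clip(gaze_xy[1], r, h - r))
    patch = frame[y - r:y + r, x - r:x + r]
    if patch.ndim == 3:               # collapse colour channels if present
        patch = patch.mean(axis=2)
    return patch.astype(np.float64)


def patch_similarity(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a, b = a - a.mean(), b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0


def detect_fixations(frames, gaze, min_len=3):
    """Mark a frame as part of a fixation when consecutive gaze patches look
    alike; runs shorter than min_len frames are discarded afterwards."""
    labels = [False] * len(frames)
    prev = gaze_patch(frames[0], gaze[0])
    for i in range(1, len(frames)):
        cur = gaze_patch(frames[i], gaze[i])
        if patch_similarity(prev, cur) >= SIM_THRESHOLD:
            labels[i - 1] = labels[i] = True
        prev = cur
    # Enforce a minimum fixation duration (in frames).
    i = 0
    while i < len(labels):
        if labels[i]:
            j = i
            while j < len(labels) and labels[j]:
                j += 1
            if j - i < min_len:
                labels[i:j] = [False] * (j - i)
            i = j
        else:
            i += 1
    return labels
```

Unlike velocity- or dispersion-based thresholds, which operate purely on gaze coordinates in the scene-camera frame, this appearance check is unaffected when head or target motion drags those coordinates across the image during a stable fixation.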


Published in

ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
June 2018, 595 pages
ISBN: 9781450357067
DOI: 10.1145/3204493
Copyright © 2018 ACM
Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%
