ABSTRACT
Pervasive mobile eye tracking provides a rich data source for investigating natural human behavior, offering a high degree of ecological validity in natural environments. However, the challenges and limitations intrinsic to unconstrained mobile eye tracking make its development and usage, to some extent, an art. Nonetheless, researchers are pushing the boundaries of this technology to help assess museum visitors' attention not only between the exhibited works, but also within particular pieces, providing significantly more detailed insights than traditional timing-and-tracking or external-observer approaches. In this paper, we describe in detail the eye-tracking system developed for a large-scale, fully unconstrained study in the Austrian Gallery Belvedere, providing useful information for eye-tracking system designers. Furthermore, we describe the study and report on usability and real-time performance metrics. Our results suggest that, although the system is comfortable enough, further eye-tracker improvements are necessary to make it less conspicuous. Additionally, real-time accuracy already suffices for simple applications such as audio guides for the majority of users, even in the absence of eye-tracker slippage compensation.
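The audio-guide application mentioned above can be illustrated with a minimal sketch: gaze points (in scene-camera coordinates) are tested against rectangular areas of interest (AOIs) around exhibits, and a guide entry fires once gaze has dwelled inside one AOI long enough. This is a hypothetical illustration, not the authors' implementation; the class names, the dwell threshold, and the example AOI are all assumptions.

```python
# Hypothetical sketch of dwell-based audio-guide triggering from gaze data.
# Not the paper's implementation; names and parameters are illustrative.
from dataclasses import dataclass

@dataclass
class AOI:
    """Rectangular area of interest in scene-camera pixel coordinates."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

class AudioGuideTrigger:
    """Fires an AOI's name once gaze has dwelled inside it continuously."""
    def __init__(self, aois, dwell_ms: float = 500.0):
        self.aois = aois
        self.dwell_ms = dwell_ms
        self.current = None      # AOI currently gazed at (or None)
        self.entered_at = None   # timestamp (ms) when gaze entered it

    def update(self, gx: float, gy: float, t_ms: float):
        """Feed one gaze sample; returns an AOI name when a trigger fires."""
        hit = next((a for a in self.aois if a.contains(gx, gy)), None)
        if hit is not self.current:
            # Gaze moved to a different AOI (or off all AOIs): restart dwell.
            self.current, self.entered_at = hit, t_ms
            return None
        if hit is not None and t_ms - self.entered_at >= self.dwell_ms:
            self.entered_at = float("inf")  # fire only once per AOI entry
            return hit.name
        return None
```

A tolerant dwell threshold (here 500 ms) is what makes such an application forgiving of moderate gaze-estimation error: the gaze estimate only needs to stay within the (large) artwork region, not hit a precise point.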
The art of pervasive eye tracking: unconstrained eye tracking in the Austrian Gallery Belvedere