DOI: 10.1145/3208031.3208032 · UbiComp Conference Proceedings · Research article

The art of pervasive eye tracking: unconstrained eye tracking in the Austrian Gallery Belvedere

Published: 15 June 2018

ABSTRACT

Pervasive mobile eye tracking provides a rich data source for investigating natural human behavior, offering a high degree of ecological validity in natural environments. However, challenges and limitations intrinsic to unconstrained mobile eye tracking make its development and usage, to some extent, an art. Nonetheless, researchers are pushing the boundaries of this technology to help assess museum visitors' attention, not only between the exhibited works but also within particular pieces, providing significantly more detailed insights than traditional timing-and-tracking or external-observer approaches. In this paper, we present in detail the eye-tracking system developed for a large-scale, fully unconstrained study in the Austrian Gallery Belvedere, providing useful information for eye-tracking system designers. Furthermore, we describe the study and report on usability and real-time performance metrics. Our results suggest that, although the system is comfortable enough, further eye-tracker improvements are necessary to make it less conspicuous. Additionally, real-time accuracy already suffices for simple applications, such as audio guides, for the majority of users, even in the absence of eye-tracker slippage compensation.

