Abstract
Analysis of everyday human gaze behaviour has significant potential for ubiquitous computing, as evidenced by a large body of work on gaze-based human-computer interaction, attentive user interfaces, and eye-based user modelling. However, current mobile eye trackers are still obtrusive, which not only makes them uncomfortable to wear and socially unacceptable in daily life, but also prevents them from being widely adopted in the social and behavioural sciences. To address these challenges we present InvisibleEye, a novel approach to mobile eye tracking that uses millimetre-size RGB cameras that can be fully embedded into normal glasses frames. To compensate for the cameras' low image resolution of only a few pixels, our approach uses multiple cameras to capture different views of the eye, together with learning-based gaze estimation that directly regresses from eye images to gaze directions. We implement a prototype of our system and characterise its performance on three large-scale, increasingly realistic, and thus challenging datasets: 1) eye images synthesised using a recent computer graphics eye region model, 2) real eye images recorded from 17 participants under controlled lighting, and 3) eye images recorded from four participants over the course of four recording sessions in a mobile setting. We show that InvisibleEye achieves a top person-specific gaze estimation accuracy of 1.79° using four cameras with a resolution of only 5 × 5 pixels. Our evaluations not only demonstrate the feasibility of this novel approach but, more importantly, underline its significant potential for finally realising the vision of invisible mobile eye tracking and pervasive attentive user interfaces.
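The core idea above — concatenating several very-low-resolution views of the eye and regressing directly to a gaze direction — can be sketched as follows. This is an illustrative example only, not the paper's exact architecture: the layer sizes, the random weights standing in for a trained model, and the function names are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CAMERAS, H, W = 4, 5, 5    # four views at 5 x 5 pixels, as in the paper
IN_DIM = N_CAMERAS * H * W   # 100-dimensional concatenated input
HIDDEN = 32                  # hypothetical hidden-layer size

# Randomly initialised weights stand in for a trained regressor.
W1 = rng.normal(0.0, 0.1, (IN_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, 2))
b2 = np.zeros(2)

def predict_gaze(views):
    """Regress a 2D gaze direction (yaw, pitch) from multiple eye views.

    views: array of shape (N_CAMERAS, H, W) with grayscale eye images.
    """
    x = views.reshape(-1)             # concatenate the four low-res views
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2                # 2D gaze-angle output

views = rng.random((N_CAMERAS, H, W))  # dummy camera frames
gaze = predict_gaze(views)
print(gaze.shape)  # (2,)
```

The design point this illustrates is that, even though each individual 5 × 5 view is too coarse for classical pupil detection, a learned regressor can pool the complementary information across views into a single gaze estimate.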
InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation