ABSTRACT
Head-mounted eye tracking has significant potential for gaze-based applications such as life logging, mental health monitoring, or the quantified self. A neglected challenge for the long-term recordings required by these applications is that drift in the initial person-specific eye tracker calibration, for example caused by physical activity, can severely impact gaze estimation accuracy and thus system performance and user experience. We first analyse calibration drift on a new dataset of natural gaze data recorded from 20 users performing everyday activities in a mobile setting, using synchronised video-based and Electrooculography-based eye trackers. Based on this analysis we present a method to automatically self-calibrate head-mounted eye trackers using a computational model of bottom-up visual saliency. Through evaluations on the dataset we show that our method 1) is effective in reducing calibration drift in calibrated eye trackers and 2) given sufficient data, can achieve gaze estimation accuracy competitive with that of a calibrated eye tracker, without any manual calibration.
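The core idea of saliency-based self-calibration can be sketched as follows: compute a bottom-up saliency map for each scene-camera frame, treat salient locations as pseudo ground-truth gaze targets, and fit a mapping from pupil positions to scene coordinates. The sketch below is an illustration only, not the paper's implementation: it uses the spectral residual saliency model (Hou & Zhang, 2007) as one common choice of bottom-up saliency, and a simple affine least-squares mapping; the function names are hypothetical.

```python
import numpy as np

def spectral_residual_saliency(img):
    """Bottom-up saliency via the spectral residual of the image spectrum
    (one common saliency model; an assumption, not the paper's choice)."""
    f = np.fft.fft2(img.astype(float))
    log_amp = np.log(np.abs(f) + 1e-8)
    phase = np.angle(f)
    # 3x3 box blur of the log-amplitude spectrum (its smooth component)
    p = np.pad(log_amp, 1, mode='edge')
    h, w = log_amp.shape
    avg = sum(p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    residual = log_amp - avg  # "anomalous" part of the spectrum
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / sal.max()  # normalised to [0, 1]

def fit_gaze_mapping(pupil_xy, target_xy):
    """Least-squares fit of an affine mapping from pupil positions
    (eye camera) to gaze targets (scene camera), with saliency maxima
    serving as pseudo ground truth during self-calibration."""
    P = np.column_stack([pupil_xy, np.ones(len(pupil_xy))])
    M, *_ = np.linalg.lstsq(P, np.asarray(target_xy), rcond=None)
    return M  # (3, 2); predict with np.column_stack([pupil, 1]) @ M
```

In practice the saliency maxima are noisy labels, so a robust estimator (e.g. RANSAC, as in the Fischler & Bolles reference) would typically replace the plain least-squares fit, and re-fitting over a sliding window of recent frames would compensate for calibration drift over time.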
Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency