Prediction of gaze estimation error for error-aware gaze-based interfaces

ABSTRACT
Gaze estimation error is inherent in head-mounted eye trackers and severely impacts the performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front of, and look at different parts of, a display. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision, we introduce an error model that can predict the gaze estimation error. Our method covers the major building blocks of mobile gaze estimation, specifically the mapping of pupil positions to scene camera coordinates, marker-based display detection, and the mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.
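The final building block mentioned above, mapping gaze from scene-camera to on-screen coordinates once the display has been detected via fiducial markers, is commonly realized as a planar homography between the marker corners in the scene image and their known positions on the screen. The following is a minimal sketch of that idea, not the authors' implementation; the function names are illustrative and only NumPy is assumed:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src via the DLT algorithm.

    src, dst: (N, 2) arrays of corresponding points (N >= 4), e.g. detected
    marker corners in scene-camera pixels and their known screen pixels.
    """
    rows = []
    for (x, y), (u, v) in zip(np.asarray(src, float), np.asarray(dst, float)):
        # Each correspondence contributes two linear constraints on the
        # nine entries of H (flattened row-major).
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    # The homography is the (right) null vector of A, i.e. the singular
    # vector belonging to the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

def apply_homography(H, pts):
    """Map (N, 2) points through H, dividing out the projective scale."""
    pts = np.asarray(pts, float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

Given ground-truth screen targets, the on-screen gaze estimation error at each position can then be quantified as the Euclidean distance (in pixels) between `apply_homography(H, gaze_scene)` and the target, which is the kind of per-position measurement an error model like the one described above is built from.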