DOI: 10.1145/2857491.2857493
Short paper

Prediction of gaze estimation error for error-aware gaze-based interfaces

Published: 14 March 2016

ABSTRACT

Gaze estimation error is inherent in head-mounted eye trackers and seriously impacts the performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front of a display and look at different parts of it. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision, we introduce an error model that is able to predict the gaze estimation error. Our method covers the major building blocks of mobile gaze estimation, specifically the mapping of pupil positions to scene camera coordinates, marker-based display detection, and the mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.
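
The abstract describes a three-stage pipeline: pupil positions are mapped to scene camera coordinates, the display is located via markers, and gaze is then mapped to on-screen coordinates. The page carries no code, so the Python sketch below is only an illustration of how such a pipeline and a per-stage error budget could be composed. It assumes a second-order polynomial pupil-to-scene regression, a marker-derived homography for the scene-to-display step, and independent per-stage errors combined in quadrature; all function names and numeric values are hypothetical and not taken from the paper.

    import numpy as np

    def map_pupil_to_scene(pupil_xy, coeffs):
        # Second-order polynomial regression from pupil image coordinates
        # to scene camera coordinates; coeffs is a 2x6 matrix fitted during
        # calibration (a common choice, not necessarily the paper's exact one).
        x, y = pupil_xy
        features = np.array([1.0, x, y, x * y, x ** 2, y ** 2])
        return coeffs @ features

    def map_scene_to_display(scene_xy, H):
        # Project a scene camera point onto the display plane using a
        # homography H recovered from fiducial markers around the screen.
        p = H @ np.array([scene_xy[0], scene_xy[1], 1.0])
        return p[:2] / p[2]

    def predict_total_error(stage_errors):
        # Combine per-stage error estimates under an independence
        # assumption: total error is the root sum of squares.
        return float(np.sqrt(np.sum(np.square(stage_errors))))

    # Hypothetical usage with placeholder calibration values.
    coeffs = np.zeros((2, 6))
    coeffs[0, 1] = coeffs[1, 2] = 1.0   # identity-like pupil-to-scene mapping
    H = np.eye(3)                       # display assumed to fill the scene image
    scene = map_pupil_to_scene((0.4, 0.6), coeffs)
    on_screen = map_scene_to_display(scene, H)
    total = predict_total_error([0.5, 0.2, 0.3])  # made-up per-stage errors (deg)
    print(on_screen, total)

An error-aware interface in the paper's sense would feed such a predicted error back into the interface at runtime, for example to size gaze targets according to the expected inaccuracy at the user's current position.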


Supplemental Material

p275-barz.mp4 (MP4, 38.2 MB)


Published in

ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
March 2016, 378 pages
ISBN: 9781450341257
DOI: 10.1145/2857491

Copyright © 2016 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%
