Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency

Published: 5 November 2015, in UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (ACM, New York, NY). ISBN 978-1-4503-3779-3. DOI: 10.1145/2807442.2807445

ABSTRACT

Head-mounted eye tracking has significant potential for gaze-based applications such as life logging, mental health monitoring, or the quantified self. A neglected challenge for the long-term recordings required by these applications is that drift in the initial person-specific eye tracker calibration, caused for example by physical activity, can severely impact gaze estimation accuracy and thus system performance and user experience. We first analyse calibration drift on a new dataset of natural gaze data recorded from 20 users performing everyday activities in a mobile setting, using synchronised video-based and electrooculography-based eye trackers. Based on this analysis, we present a method to automatically self-calibrate head-mounted eye trackers using a computational model of bottom-up visual saliency. Evaluations on the dataset show that our method 1) effectively reduces calibration drift in calibrated eye trackers and 2) given sufficient data, achieves gaze estimation accuracy competitive with that of a calibrated eye tracker, without any manual calibration.
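
For illustration, the core idea described in the abstract, using bottom-up visual saliency as a proxy for where the wearer is looking so that the pupil-to-scene mapping can be fitted without calibration targets, can be sketched in a few lines of Python. This is a minimal sketch under simplifying assumptions (a spectral-residual saliency model, per-frame maximum-saliency pseudo gaze labels, and an affine pupil-to-scene mapping), not the authors' implementation; all function names here are hypothetical.

import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def saliency_map(gray):
    # Bottom-up saliency via the spectral-residual method (Hou & Zhang,
    # CVPR 2007): a simple stand-in for the saliency model in the paper.
    f = np.fft.fft2(gray.astype(float))
    log_amp = np.log(np.abs(f) + 1e-8)
    residual = log_amp - uniform_filter(log_amp, size=3)
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * np.angle(f)))) ** 2
    return gaussian_filter(sal, sigma=8)  # smooth to fixation scale

def self_calibrate(pupil_xy, scene_frames):
    # Treat the most salient point of each scene-camera frame as a pseudo
    # gaze label, then fit an affine mapping from pupil positions to
    # scene-camera coordinates by least squares. No calibration targets
    # are shown to the user.
    targets = []
    for frame in scene_frames:
        sal = saliency_map(frame)
        row, col = np.unravel_index(np.argmax(sal), sal.shape)
        targets.append((col, row))  # (x, y) in scene-camera pixels
    T = np.asarray(targets, dtype=float)
    P = np.hstack([np.asarray(pupil_xy, float), np.ones((len(pupil_xy), 1))])
    A, *_ = np.linalg.lstsq(P, T, rcond=None)  # solve T ~ P @ A
    return A

# Gaze estimate for a new pupil sample (px, py):
# gaze_xy = np.array([px, py, 1.0]) @ A

A single frame's saliency maximum is a noisy label, which is why the abstract stresses that the method needs sufficient data. A practical system would pool evidence over many frames and could use a robust estimator such as RANSAC to down-weight frames where the saliency maximum does not coincide with the true gaze point.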

Supplemental Material

p363.mp4 (MP4 video, 74.7 MB)

