Published in: Machine Vision and Applications 3/2019

12 February 2019 | Original Paper

CamType: assistive text entry using gaze with an off-the-shelf webcam

Authors: Yi Liu, Bu-Sung Lee, Deepu Rajan, Andrzej Sluzek, Martin J. McKeown


Abstract

As modern assistive technology advances, eye-based text entry systems have been developed to help a subset of physically challenged people improve their ability to communicate. However, text entry speed in early eye-typing systems tends to be relatively slow because of the required dwell time. Recently, dwell-free methods have been proposed that outperform dwell-based systems in terms of speed and resilience, but a dedicated eye-tracking device remains indispensable. In this article, we propose a prototype eye-typing system that uses an off-the-shelf webcam instead of a dedicated eye tracker, in which an appearance-based method estimates the user's gaze coordinates on the screen from frontal face images captured by the webcam. We also investigate several critical issues of the appearance-based method, which help to improve estimation accuracy and reduce computational complexity in practice. The performance evaluation shows that webcam-based eye typing with the proposed method is comparable to eye typing with an eye tracker under a small degree of head movement.
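To make the appearance-based idea concrete, the sketch below shows one common way such a pipeline can be assembled: an eye patch is cropped from webcam frames (here with OpenCV's stock Haar cascades), normalized into a fixed-size appearance feature, and mapped to 2-D screen coordinates by a regression model fitted during a short calibration phase. This is not the authors' CamType implementation; the cascade-based eye localization, the patch size, the ridge regularization strength, and the calibration procedure are illustrative assumptions.

```python
"""Minimal sketch of webcam-based, appearance-based gaze estimation.

Illustrative only: maps an intensity-normalized eye patch extracted from a
webcam frame to (x, y) screen coordinates via ridge regression fitted on
calibration samples collected while the user fixates known on-screen points.
Requires opencv-python and scikit-learn; the Haar cascade files ship with
opencv-python.
"""
import cv2
import numpy as np
from sklearn.linear_model import Ridge

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

PATCH = (30, 18)  # width x height of the normalized eye patch (assumed size)


def eye_feature(frame_bgr):
    """Return a flattened, intensity-normalized eye patch, or None if not found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # keep the largest face
    roi = gray[y:y + h // 2, x:x + w]                     # eyes lie in the upper half
    eyes = eye_cascade.detectMultiScale(roi, 1.1, 5)
    if len(eyes) == 0:
        return None
    ex, ey, ew, eh = eyes[0]
    patch = cv2.resize(roi[ey:ey + eh, ex:ex + ew], PATCH)
    patch = cv2.equalizeHist(patch).astype(np.float32) / 255.0
    return patch.ravel()                                  # appearance feature vector


def fit_gaze_mapper(features, targets):
    """Ridge regression from eye appearance to (x, y) screen coordinates in pixels."""
    model = Ridge(alpha=1.0)
    model.fit(np.vstack(features), np.vstack(targets))
    return model

# Usage (calibration pairs assumed to be gathered beforehand):
#   model = fit_gaze_mapper(calib_features, calib_points)
#   x_pred, y_pred = model.predict(eye_feature(frame)[None, :])[0]
```

In such a setup the regression is typically refit per user from a handful of calibration fixations, which is how appearance-based methods usually compensate for individual differences in eye appearance and seating position.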


Metadata
Title
CamType: assistive text entry using gaze with an off-the-shelf webcam
Authors
Yi Liu
Bu-Sung Lee
Deepu Rajan
Andrzej Sluzek
Martin J. McKeown
Publication date
12 February 2019
Publisher
Springer Berlin Heidelberg
Published in
Machine Vision and Applications / Issue 3/2019
Print ISSN: 0932-8092
Electronic ISSN: 1432-1769
DOI
https://doi.org/10.1007/s00138-018-00997-4
