2014 | Original Paper | Book Chapter

3. Eye Tracking and Eye-Based Human–Computer Interaction

Authors: Päivi Majaranta, Andreas Bulling

Published in: Advances in Physiological Computing

Publisher: Springer London


Abstract

Eye tracking has a long history in medical and psychological research as a tool for recording and studying human visual behavior. Real-time gaze-based text entry can also be a powerful means of communication and control for people with physical disabilities. Following recent technological advances and the advent of affordable eye trackers, there is a growing interest in pervasive attention-aware systems and interfaces that have the potential to revolutionize mainstream human-technology interaction. In this chapter, we provide an introduction to the state of the art in eye tracking technology and gaze estimation. We discuss the challenges involved in using a perceptual organ, the eye, as an input modality. Examples of real-life applications are reviewed, together with design solutions derived from research results. We also discuss how to match user requirements with the key features of different eye tracking systems to find the best system for each task and application.
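The abstract points to a core challenge of using the eye, a perceptual organ, as an input modality: gaze is always active, so an interface must distinguish deliberate commands from ordinary looking (the so-called Midas touch problem). As a minimal illustration only, and not code taken from the chapter, the sketch below implements dwell-time selection, a common workaround in gaze-based text entry and control: a target is activated only after the gaze has rested on it for a set time. The class name, thresholds, and the gaze-sample interface are assumptions made for this example.

```python
# Minimal, illustrative dwell-time selector (an assumption for this example,
# not code from the chapter). A target counts as "selected" only after the
# gaze has stayed within a small radius of one spot for a set dwell time,
# which is one common way to avoid unintended activations ("Midas touch").

class DwellSelector:
    def __init__(self, dwell_time=0.8, radius=40.0):
        self.dwell_time = dwell_time   # seconds the gaze must rest in place
        self.radius = radius           # pixel tolerance around the fixation start
        self._anchor = None            # (x, y) where the current fixation began
        self._anchor_time = 0.0        # timestamp of that first sample

    def update(self, x, y, t):
        """Feed one gaze sample (screen coordinates, timestamp in seconds).

        Returns True once the gaze has rested near the anchor point for at
        least `dwell_time`; otherwise returns False.
        """
        if self._anchor is None:
            self._anchor, self._anchor_time = (x, y), t
            return False

        ax, ay = self._anchor
        if (x - ax) ** 2 + (y - ay) ** 2 > self.radius ** 2:
            # Gaze moved away: treat this sample as the start of a new fixation.
            self._anchor, self._anchor_time = (x, y), t
            return False

        if t - self._anchor_time >= self.dwell_time:
            self._anchor = None        # require a fresh dwell before the next selection
            return True
        return False
```

In use, a caller would feed timestamped gaze samples from whatever tracker is available, e.g. selector.update(x, y, t), and trigger the on-screen action whenever it returns True. Practical gaze interfaces typically also give visual or auditory feedback during the dwell and let users adjust the dwell time to their abilities.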

Metadata
Title
Eye Tracking and Eye-Based Human–Computer Interaction
Authors
Päivi Majaranta
Andreas Bulling
Copyright Year
2014
Publisher
Springer London
DOI
https://doi.org/10.1007/978-1-4471-6392-3_3
