Published in: Journal on Multimodal User Interfaces 3/2017

06.03.2017 | Original Paper

Multi-modal user interface combining eye tracking and hand gesture recognition

Authors: Hansol Kim, Kun Ha Suh, Eui Chul Lee


Abstract

Many studies on eye tracking have been conducted across diverse research areas. Nevertheless, eye tracking remains limited by low accuracy and severe cursor vibration caused by pupil tremor. Furthermore, because almost all selection interactions, such as click events, rely on dwell time or eye blinking, eye tracking suffers from both time consumption and involuntary blinking. In this paper, we therefore propose a multi-modal interaction method that combines eye tracking with hand gesture recognition using a commercial hand gesture controller. The method performs global, intuitive navigation by eye tracking and local, detailed navigation by the hand gesture controller, and it supports intuitive hand gestures for mouse-button clicking. Experimental results indicate that the time needed to target small points is significantly reduced with the proposed method; the advantage is most pronounced on large displays with high spatial resolution. In addition, the proposed clicking interaction achieved a high recognition rate, and the modality-switching concept showed a positive training effect.
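The modality-switching idea described in the abstract (gaze for coarse, global cursor jumps; hand motion for fine, local adjustment and clicking) can be illustrated with a minimal sketch. This is not the authors' implementation: the class, the activity threshold, and the event names here are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Cursor:
    x: float = 0.0
    y: float = 0.0

class MultiModalPointer:
    """Illustrative sketch of gaze/hand modality switching (hypothetical)."""

    # Hand displacement beyond this magnitude hands control to hand mode.
    # Units (pixels per frame) and value are assumptions, not from the paper.
    HAND_ACTIVITY_THRESHOLD = 2.0

    def __init__(self) -> None:
        self.cursor = Cursor()
        self.mode = "gaze"

    def on_gaze(self, gx: float, gy: float) -> None:
        # In gaze mode, the cursor jumps to the approximate target region.
        if self.mode == "gaze":
            self.cursor.x, self.cursor.y = gx, gy

    def on_hand_delta(self, dx: float, dy: float) -> None:
        # Noticeable hand motion takes over for precise local navigation,
        # avoiding the vibration of gaze estimates near small targets.
        if (dx * dx + dy * dy) ** 0.5 > self.HAND_ACTIVITY_THRESHOLD:
            self.mode = "hand"
        if self.mode == "hand":
            self.cursor.x += dx
            self.cursor.y += dy

    def on_hand_idle(self) -> None:
        # When the hand rests, control returns to gaze for the next jump.
        self.mode = "gaze"
```

A click would then be issued by a distinct hand gesture rather than dwell time or blinking, which is what removes the time-consumption and involuntary-blink problems the abstract mentions.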


Metadata
Title
Multi-modal user interface combining eye tracking and hand gesture recognition
Authors
Hansol Kim
Kun Ha Suh
Eui Chul Lee
Publication date
06.03.2017
Publisher
Springer International Publishing
Published in
Journal on Multimodal User Interfaces / Issue 3/2017
Print ISSN: 1783-7677
Electronic ISSN: 1783-8738
DOI
https://doi.org/10.1007/s12193-017-0242-2
