
2018 | OriginalPaper | Chapter

Non-invasive Gaze Direction Estimation from Head Orientation for Human-Machine Interaction

Authors: Zhi Zheng, Yuguang Wang, Jaclyn Barnes, Xingliang Li, Chung-Hyuk Park, Myounghoon Jeon

Published in: Human-Computer Interaction. Interaction Technologies

Publisher: Springer International Publishing


Abstract

Gaze direction is one of the most important interaction cues and is widely used in human-machine interaction. In scenarios where participants' head movement is involved and/or participants are sensitive to body-attached sensors, traditional gaze tracking methods, such as commercial eye trackers, are not appropriate: participants must hold a fixed head pose during tracking or wear invasive sensors that are distracting and uncomfortable. Head orientation has therefore been used to approximate gaze direction in these cases. However, the difference between head orientation and gaze direction has not been thoroughly and numerically evaluated, so how to derive gaze direction accurately from head orientation remains an open question. In this article, we make two contributions toward solving this problem. First, we evaluated the difference between people's frontal head orientation and their gaze direction when looking at objects in different directions. Second, we developed functions that map people's frontal head orientation to their gaze direction. The accuracy of the proposed gaze tracking method is around 7°, and the method can easily be layered on top of any existing remote head orientation estimation method to perform non-invasive gaze direction estimation.
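The core idea described in the abstract, learning a function that maps measured head orientation to gaze direction from calibration samples, can be sketched as a simple least-squares fit. The data values and function names below are illustrative assumptions, not the authors' actual calibration procedure or results:

```python
import numpy as np

# Hypothetical calibration data (degrees): head yaw measured while a
# participant fixates targets at known gaze directions. Head rotation
# typically undershoots gaze shifts, so gaze angles exceed head angles.
head_yaw = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])
gaze_yaw = np.array([-55.0, -28.0, 0.5, 27.0, 54.0])

# Fit a first-order mapping gaze ~= a * head + b by least squares.
a, b = np.polyfit(head_yaw, gaze_yaw, deg=1)

def estimate_gaze_yaw(head_yaw_deg: float) -> float:
    """Map a measured head yaw angle to an estimated gaze yaw angle."""
    return a * head_yaw_deg + b
```

In practice the same fit would be repeated for the pitch axis, and a higher polynomial degree (or per-region mapping) could capture any nonlinearity between head orientation and gaze direction.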


Metadata
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-91250-9_30