
31.01.2023 | Regular Paper

Strabismus free gaze detection system for driver’s using deep learning technique

Authors: Nageshwar Nath Pandey, Naresh Babu Muppalaneni

Published in: Progress in Artificial Intelligence | Issue 1/2023

Abstract

Negligent driving behavior is one of the principal causes of vehicle collisions and fatal road deaths. Reliable cues about the driver's gaze zone can play a significant role in assessing the driver's state of alertness. However, monitoring real-time driving performance involves several challenges, such as distraction after a long journey, lighting variations, and eye recognition under sunglasses. Beyond these challenges, road authorities sometimes issue driving licenses even when the driver already suffers from a vision problem or from eye occlusion caused by extreme pose variation. To address these issues, the proposed work scrutinizes frames that exhibit vision trouble, primarily in the form of strabismus, as well as occluded-eye frames, and filters them out of the dataset. Histogram equalization is then applied to each region-of-interest (ROI) image to diminish lighting effects. The removal of strabismus-containing and occluded-eye frames, together with normalization via histogram equalization, improves the overall performance of the model. Afterward, three parallel convolutional neural networks (CNNs) capture features from the refined ROI frames, and Euclidean distances are computed from these features. Fusion scores are then calculated over the estimated distances to classify the driver's gaze zones. The proposed model is evaluated on the open Columbia gaze dataset (CAVE-DB). The designed method achieves better accuracy than earlier gaze categorization methods.
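
The pipeline summarized above can be illustrated with a short sketch. The snippet below is not the authors' implementation: the branch architecture (SmallCNN), the embedding size, the per-zone reference embeddings, and the equal fusion weights are illustrative assumptions, used only to show how histogram-equalized ROI frames, three parallel CNN feature extractors, Euclidean distances, and score fusion could fit together.

# Minimal sketch of the described pipeline (assumptions noted above).
import cv2
import numpy as np
import torch
import torch.nn as nn


def equalize_roi(gray_roi: np.ndarray) -> np.ndarray:
    """Histogram equalization on an 8-bit grayscale ROI to reduce lighting variation."""
    return cv2.equalizeHist(gray_roi)


class SmallCNN(nn.Module):
    """One branch of the three parallel CNN feature extractors (illustrative architecture)."""

    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.features(x).flatten(1))


def fused_gaze_scores(rois, branches, references, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Euclidean distances from each branch's embedding to per-zone reference
    embeddings, fused (here: equal weights, an assumption) into one score per zone.

    rois:       list of 3 preprocessed ROI tensors, each of shape (1, 1, H, W)
    branches:   list of 3 SmallCNN models (the parallel CNNs)
    references: tensor of shape (num_zones, 3, embed_dim), assumed precomputed
    """
    scores = 0.0
    for b, (roi, net, w) in enumerate(zip(rois, branches, weights)):
        emb = net(roi)                                 # (1, embed_dim)
        dists = torch.cdist(emb, references[:, b, :])  # (1, num_zones)
        scores = scores + w * (-dists)                 # smaller distance -> higher score
    return scores.squeeze(0)                           # (num_zones,)


if __name__ == "__main__":
    torch.manual_seed(0)
    num_zones = 9                                      # assumed number of gaze zones
    branches = [SmallCNN() for _ in range(3)]
    rois = []
    for _ in range(3):
        roi_gray = (np.random.rand(64, 64) * 255).astype(np.uint8)  # stand-in eye-region crop
        roi_eq = equalize_roi(roi_gray)                              # lighting normalization
        rois.append(torch.from_numpy(roi_eq).float().div(255.0).view(1, 1, 64, 64))
    references = torch.rand(num_zones, 3, 128)         # random stand-in reference embeddings
    scores = fused_gaze_scores(rois, branches, references)
    print("predicted gaze zone:", int(scores.argmax()))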

Metadata
Title
Strabismus free gaze detection system for driver’s using deep learning technique
Authors
Nageshwar Nath Pandey
Naresh Babu Muppalaneni
Publication date
31.01.2023
Publisher
Springer Berlin Heidelberg
Published in
Progress in Artificial Intelligence / Issue 1/2023
Print ISSN: 2192-6352
Electronic ISSN: 2192-6360
DOI
https://doi.org/10.1007/s13748-023-00296-8