2020 | Original Paper | Book Chapter

Recognition and Localisation of Pointing Gestures Using a RGB-D Camera

Authors: Naina Dhingra, Eugenio Valli, Andreas Kunz

Published in: HCI International 2020 - Posters

Publisher: Springer International Publishing

Abstract

Non-verbal communication is part of our regular conversation, and multiple gestures are used to exchange information. Among those gestures, pointing is the most important one. If such gestures cannot be perceived by other team members, e.g., by blind and visually impaired people (BVIP), they lack important information and can hardly participate in a lively workflow. Thus, this paper describes a system for detecting such pointing gestures to provide input for suitable output modalities for BVIP. Our system employs an RGB-D camera to recognize the pointing gestures performed by the users. The system also locates the target of pointing, e.g., on a common workspace. We evaluated the system in a user study with 26 users. The results show success rates of 89.59% and 79.92% for a \(2 \times 3\) matrix using the left and right arm, respectively, and 73.57% and 68.99% for a \(3 \times 4\) matrix using the left and right arm, respectively.
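As a rough illustration of the localisation step described in the abstract, the Python sketch below shows one plausible way to map a tracked arm to a cell of the pointing-target matrix: cast a ray from the elbow through the wrist (as reported by an RGB-D skeleton tracker), intersect it with the table plane, and quantise the hit point into a rows x cols grid such as the 2x3 and 3x4 matrices used in the study. This is a hedged sketch under assumed conventions; the joint pair, the plane parameters, and the grid layout are illustrative assumptions, not the authors' implementation.

# Hedged sketch (not the authors' implementation): locate the pointed-at grid
# cell on a table from two 3D arm joints delivered by an RGB-D skeleton
# tracker. Joint choice, plane parameters, and grid layout are assumptions.
import numpy as np

def pointing_target(elbow, wrist, plane_point, plane_normal):
    """Intersect the elbow->wrist ray with the table plane.
    All arguments are 3D points/vectors in the camera's metric frame.
    Returns the intersection point, or None if the arm does not point
    towards the plane."""
    elbow = np.asarray(elbow, dtype=float)
    wrist = np.asarray(wrist, dtype=float)
    direction = wrist - elbow                      # forearm pointing direction
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-6:                          # ray parallel to the table
        return None
    t = np.dot(plane_normal, np.asarray(plane_point, dtype=float) - elbow) / denom
    if t <= 0:                                     # table lies behind the arm
        return None
    return elbow + t * direction

def to_grid_cell(point, origin, width, height, rows, cols):
    """Map a point on the table plane to a (row, col) cell of a rows x cols
    target matrix (e.g. 2x3 or 3x4), assuming the grid is axis-aligned with
    the table frame and anchored at `origin`."""
    x, y = point[0] - origin[0], point[1] - origin[1]
    if not (0 <= x < width and 0 <= y < height):
        return None                                # pointing outside the grid
    return int(y / (height / rows)), int(x / (width / cols))

if __name__ == "__main__":
    # Example: arm pointing down and forward onto a table plane at z = 0.
    hit = pointing_target(elbow=[0.0, 0.3, 1.0], wrist=[0.1, 0.25, 0.8],
                          plane_point=[0.0, 0.0, 0.0],
                          plane_normal=[0.0, 0.0, 1.0])
    if hit is not None:
        print(to_grid_cell(hit, origin=(0.0, 0.0), width=1.2, height=0.8,
                           rows=2, cols=3))        # -> (0, 1)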


Metadata
Title
Recognition and Localisation of Pointing Gestures Using a RGB-D Camera
Authors
Naina Dhingra
Eugenio Valli
Andreas Kunz
Copyright Year
2020
DOI
https://doi.org/10.1007/978-3-030-50726-8_27
