Published in: International Journal of Social Robotics 4/2017

12.05.2017

Foundations of Visual Linear Human–Robot Interaction via Pointing Gesture Navigation

Authors: Michal Tölgyessy, Martin Dekan, František Duchoň, Jozef Rodina, Peter Hubinský, L’uboš Chovanec


Abstract

This paper presents a human–robot interaction method for controlling an autonomous mobile robot with a referential pointing gesture. A human user points to a specific location; the robot detects the pointing gesture, computes its intersection with the surrounding planar surface, and moves to that destination. A depth camera mounted on the chassis is used, and the user does not need to wear any extra clothing or markers. The design includes the necessary mathematical concepts, such as the transformations between coordinate systems and the vector abstraction of the features needed for simple navigation, which other current research works omit. We provide an experimental evaluation with derived probability models. We term this approach “Linear HRI” and define three laws of Linear HRI.
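The core geometric step the abstract describes, intersecting the pointing vector with a planar surface, can be sketched as a ray–plane intersection. The sketch below is illustrative only and is not the authors' implementation: the joint positions (e.g., elbow and hand from a depth-camera skeleton tracker), the plane parameters, and the function name are all hypothetical, and all coordinates are assumed to already be expressed in a common frame.

```python
import numpy as np

def pointing_target(elbow, hand, plane_point, plane_normal):
    """Intersect the pointing ray (elbow -> hand) with a planar surface.

    All arguments are 3-D points/vectors in a common coordinate frame.
    Returns the intersection point, or None when the ray is parallel
    to the plane or the plane lies behind the hand.
    """
    elbow = np.asarray(elbow, dtype=float)
    hand = np.asarray(hand, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    direction = hand - elbow              # pointing vector
    denom = direction @ n
    if abs(denom) < 1e-9:                 # ray parallel to the plane
        return None
    t = ((np.asarray(plane_point, dtype=float) - hand) @ n) / denom
    if t < 0:                             # intersection behind the hand
        return None
    return hand + t * direction           # navigation goal on the plane
```

For a floor plane through the origin with normal (0, 0, 1), an arm pointing slightly downward from two meters' height yields a goal point on the floor in front of the user; a horizontal arm yields no intersection.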


Metadata
Title
Foundations of Visual Linear Human–Robot Interaction via Pointing Gesture Navigation
Authors
Michal Tölgyessy
Martin Dekan
František Duchoň
Jozef Rodina
Peter Hubinský
L’uboš Chovanec
Publication date
12.05.2017
Publisher
Springer Netherlands
Published in
International Journal of Social Robotics / Issue 4/2017
Print ISSN: 1875-4791
Electronic ISSN: 1875-4805
DOI
https://doi.org/10.1007/s12369-017-0408-9
