Published in: Intelligent Service Robotics 4/2016

01.10.2016 | Original Research Paper

Realization of sign language motion using a dual-arm/hand humanoid robot

Authors: Sheng-Yen Lo, Han-Pang Huang

Published in: Intelligent Service Robotics | Issue 4/2016


Abstract

The recent increase in technological maturity has empowered robots to assist humans and provide daily services. Voice commands are a popular human–machine interface for such communication. Unfortunately, deaf people cannot exchange information with robots through vocal modalities. To interact with deaf people effectively and intuitively, robots, especially humanoids, should have manual communication skills, such as performing sign language. Rather than programming each sign language motion ad hoc, we present an imitation system that teaches a humanoid robot to perform sign language by directly replicating observed demonstrations. The system symbolically encodes human hand–arm motion captured by low-cost depth sensors as a skeleton-motion time series, which serves to generate an initial robot movement through perception-to-action mapping. To tackle the body correspondence problem, a virtual impedance control approach is adopted to follow the initial movement smoothly while preventing potential risks arising from differences in physical properties between the human and the robot, such as joint limits and self-collision. In addition, an integrated leg-joint stabilizer improves the balance of the whole robot. Finally, our humanoid robot, NINO, successfully learned by imitation from human demonstration to introduce itself in Taiwanese Sign Language.
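The abstract describes a pipeline in which a demonstrated skeleton-motion time series, captured by a depth sensor, is mapped to an initial robot movement and then followed under virtual impedance control subject to the robot's own joint limits. The Python sketch below illustrates that general idea only; it is not the authors' implementation, and the gains, joint limits, and sample trajectory are hypothetical assumptions.

```python
# A minimal sketch (not the paper's implementation) of following a demonstrated
# joint-angle trajectory through a virtual mass-spring-damper (impedance) model,
# so the robot tracks the demonstration smoothly while staying inside its own
# joint limits. All gains, limits, and the example trajectory are illustrative.
import numpy as np

def follow_with_virtual_impedance(q_ref, q0, q_min, q_max,
                                  k=80.0, d=18.0, m=1.0, dt=0.01):
    """Track a reference joint trajectory q_ref (T x n) with a per-joint
    impedance model  m*q_ddot = k*(q_ref - q) - d*q_dot,  clamping every
    step to the joint limits [q_min, q_max]."""
    q = np.array(q0, dtype=float)
    dq = np.zeros_like(q)
    out = []
    for q_des in q_ref:
        ddq = (k * (q_des - q) - d * dq) / m   # virtual spring-damper acceleration
        dq += ddq * dt                         # integrate acceleration -> velocity
        q += dq * dt                           # integrate velocity -> position
        hit = (q < q_min) | (q > q_max)        # joints that reached a limit
        q = np.clip(q, q_min, q_max)           # enforce the robot's joint limits
        dq[hit] = 0.0                          # stop motion at the limit
        out.append(q.copy())
    return np.array(out)

if __name__ == "__main__":
    # Illustrative 2-joint demonstration: a shoulder ramp plus an elbow oscillation.
    t = np.linspace(0.0, 2.0, 200)
    q_ref = np.stack([0.8 * t, 0.5 * np.sin(2 * np.pi * t)], axis=1)
    q_traj = follow_with_virtual_impedance(
        q_ref, q0=[0.0, 0.0],
        q_min=np.array([-1.0, -0.6]), q_max=np.array([1.2, 0.6]))
    print(q_traj[-1])   # final joint angles after smoothly following the demo
```

In this sketch the spring term pulls the joints toward the demonstrated angles while the damping and limit clamping stand in for the safety considerations mentioned in the abstract; the actual system additionally handles self-collision avoidance and whole-body balance, which are omitted here.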


Metadata
Title
Realization of sign language motion using a dual-arm/hand humanoid robot
Authors
Sheng-Yen Lo
Han-Pang Huang
Publication date
01.10.2016
Publisher
Springer Berlin Heidelberg
Published in
Intelligent Service Robotics / Issue 4/2016
Print ISSN: 1861-2776
Electronic ISSN: 1861-2784
DOI
https://doi.org/10.1007/s11370-016-0203-8
