
24.01.2018

Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism

Authors: Ali Ghorbandaei Pour, Alireza Taheri, Minoo Alemi, Ali Meghdari

Published in: International Journal of Social Robotics | Issue 2/2018

Abstract

Reciprocal interaction and facial expression are among the most interesting topics in social and cognitive robotics. Children with autism show a particular interest in robots, and facial expression recognition can improve these children's social interaction abilities in real life. In this research, a robotic platform has been developed for reciprocal interaction consisting of two main phases, namely the Non-structured and Structured interaction modes. In the Non-structured interaction mode, a vision system recognizes the user's facial expressions through a fuzzy clustering method, and the interaction decision-making unit combines these results with a fuzzy finite state machine to improve the quality of the human–robot interaction. In the Structured interaction mode, a set of imitation scenarios with eight different posed facial behaviors was designed for the robot; the scenarios start with simple facial expressions and become more complicated as they continue. The same vision system and fuzzy clustering method of the Non-structured interaction mode are used to automatically evaluate a participant's gestures. As a pilot study, the effect and acceptability of the platform were investigated with autistic children between 3 and 7 years old, and a preliminary acceptance rate of ~78% was observed under our experimental conditions. Lastly, the automatic assessment of imitation quality was compared with manual video coding; Pearson's r between the two sets of grades was computed as r = 0.89, which indicates sufficient agreement between the automatic and manual scores.
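
The abstract does not spell out the fuzzy clustering step. As a minimal sketch of how geometric facial feature vectors could be grouped into expression clusters with standard fuzzy c-means (the cluster count, feature design, and data below are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=6, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means. X: (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random fuzzy membership matrix; each row sums to 1
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # Cluster centers as membership-weighted means of the feature vectors
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Euclidean distance of every sample to every center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)
        # Standard membership update: u_ij proportional to d_ij^(-2/(m-1))
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.linalg.norm(U_new - U) < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Hypothetical usage: each row is a feature vector computed from the facial
# landmarks of one video frame (the feature design here is assumed).
frames = np.random.rand(200, 12)
centers, memberships = fuzzy_c_means(frames)
# The expression of a frame is then taken as the cluster with highest membership.
predicted = memberships.argmax(axis=1)
```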
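
The fuzzy finite state machine in the decision-making unit is likewise only named here. A toy sketch of the general idea, in which fuzzy expression memberships blend the transitions between a few hypothetical robot behaviors (states, expressions, and weights are all invented for illustration, not the authors' rule base):

```python
import numpy as np

STATES = ["encourage", "calm_down", "play"]          # illustrative robot behaviors
EXPRESSIONS = ["happy", "sad", "neutral"]            # outputs of the clustering stage

# TRANSITIONS[e][i, j]: how strongly seeing expression e pushes state i toward j.
TRANSITIONS = {
    "happy":   np.array([[0.2, 0.0, 0.8],
                         [0.5, 0.2, 0.3],
                         [0.1, 0.0, 0.9]]),
    "sad":     np.array([[0.6, 0.4, 0.0],
                         [0.4, 0.6, 0.0],
                         [0.6, 0.4, 0.0]]),
    "neutral": np.array([[0.5, 0.1, 0.4],
                         [0.3, 0.5, 0.2],
                         [0.3, 0.1, 0.6]]),
}

def step(state_membership, expression_membership):
    """Blend per-expression transition matrices by the fuzzy expression memberships."""
    blended = sum(expression_membership[e] * TRANSITIONS[e] for e in EXPRESSIONS)
    new_state = state_membership @ blended
    return new_state / new_state.sum()   # keep the state a fuzzy membership vector

state = np.array([1.0, 0.0, 0.0])                         # start by encouraging the child
state = step(state, {"happy": 0.7, "sad": 0.1, "neutral": 0.2})
print(dict(zip(STATES, np.round(state, 2))))              # robot executes the argmax behavior
```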
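
For the reported agreement between the automatic assessment and manual video coding, the correlation can be computed with SciPy's pearsonr; the paired scores below are placeholders, and only the r = 0.89 figure comes from the paper:

```python
from scipy.stats import pearsonr

# Hypothetical paired grades for a few imitation trials: one score from the
# automatic (vision-based) assessment and one from manual video coding,
# both on the same scale. The values are placeholders.
auto_scores   = [0.8, 0.6, 0.9, 0.4, 0.7, 1.0, 0.5, 0.3]
manual_scores = [0.9, 0.5, 0.8, 0.5, 0.7, 0.9, 0.6, 0.2]

r, p_value = pearsonr(auto_scores, manual_scores)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")   # the paper reports r = 0.89 on its data
```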

Metadata
Title
Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism
Authors
Ali Ghorbandaei Pour
Alireza Taheri
Minoo Alemi
Ali Meghdari
Publication date
24.01.2018
Publisher
Springer Netherlands
Published in
International Journal of Social Robotics / Issue 2/2018
Print ISSN: 1875-4791
Electronic ISSN: 1875-4805
DOI
https://doi.org/10.1007/s12369-017-0461-4
