
2019 | Original Paper | Book Chapter

Attention Assessment: Evaluation of Facial Expressions of Children with Autism Spectrum Disorder

Authors: Bilikis Banire, Dena Al Thani, Mustapha Makki, Marwa Qaraqe, Kruthika Anand, Olcay Connor, Kamran Khowaja, Bilal Mansoor

Published in: Universal Access in Human-Computer Interaction. Multimodality and Assistive Environments

Publisher: Springer International Publishing


Abstract

Technological interventions for teaching children with autism spectrum disorder (ASD) are becoming popular because their rich multimedia and repetitive functionalities can help sustain children's attention. However, the degree of attentiveness to these interventions differs from one child to another due to variability across the spectrum. An objective approach to attention assessment, as opposed to subjective ratings, is therefore essential for automatically monitoring attention, both to design and develop adaptive learning tools and to support caregivers in evaluating such tools. The analysis of facial expressions has recently emerged as an objective method for measuring the attention and participation levels of typical learners, yet few studies have examined the facial expressions of children with ASD during an attention task. This study therefore evaluates the existing facial expression parameters provided by Affectiva, a commercial engagement-measurement tool. We conducted fifteen experimental sessions of 5 min each with 4 children with ASD and 4 typically developing (TD) children with an average age of 8.8 years. A desktop virtual reality continuous performance task (VR-CPT) served as the attention stimulus, and a webcam streamed facial expressions in real time. All participants scored above average on the VR-CPT, and the TD group performed better than the ASD group. While 3 out of 10 facial expressions were prominent in both groups, the ASD group showed an additional facial expression. Our findings indicate that facial expressions could serve as a biomarker for measuring attention and for differentiating the two groups.
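The chapter relies on the commercial iMotions/Affectiva pipeline to score facial-expression channels from the webcam stream; the sketch below is only a hypothetical illustration of the kind of post-hoc analysis the abstract describes, assuming a per-frame CSV export with a "group" column (ASD or TD) and one 0-100 score column per expression channel. The column names, file name, and prominence threshold are assumptions for illustration, not part of the authors' method.

# Hypothetical post-hoc analysis sketch (not the authors' pipeline): assumes the
# recorded session was exported to a per-frame CSV with a "group" column
# (ASD or TD) and one 0-100 score column per facial-expression channel.
import pandas as pd

EXPRESSION_COLS = ["smile", "brow_raise", "brow_furrow", "eye_widen",
                   "lip_press", "mouth_open"]   # assumed channel names
PROMINENCE_THRESHOLD = 10.0                     # assumed cut-off on the 0-100 scale

def prominent_expressions(csv_path: str) -> pd.DataFrame:
    """Average each expression channel per group and flag the prominent ones."""
    frames = pd.read_csv(csv_path)
    group_means = frames.groupby("group")[EXPRESSION_COLS].mean()
    # An expression counts as "prominent" for a group when its mean score
    # over the 5-minute VR-CPT session exceeds the threshold.
    return group_means.gt(PROMINENCE_THRESHOLD)

if __name__ == "__main__":
    # Example usage with a hypothetical export file name.
    print(prominent_expressions("vr_cpt_face_channels.csv"))

Comparing the resulting per-group flags is one simple way to surface the kind of group difference the abstract reports, such as an extra prominent expression in the ASD group.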


Metadata
Title
Attention Assessment: Evaluation of Facial Expressions of Children with Autism Spectrum Disorder
Authors
Bilikis Banire
Dena Al Thani
Mustapha Makki
Marwa Qaraqe
Kruthika Anand
Olcay Connor
Kamran Khowaja
Bilal Mansoor
Copyright Year
2019
DOI
https://doi.org/10.1007/978-3-030-23563-5_4
