DOI: 10.1145/2678025.2701397

Automatic Detection of Learning-Centered Affective States in the Wild

Published: 18 March 2015

ABSTRACT

Affect detection is a key component in developing intelligent educational interfaces that are capable of responding to the affective needs of students. In this paper, computer vision and machine learning techniques were used to detect students' affect as they used an educational game designed to teach fundamental principles of Newtonian physics. Data were collected in the real-world environment of a school computer lab, which provides unique challenges for detection of affect from facial expressions (primary channel) and gross body movements (secondary channel): up to thirty students at a time participated in the class, moving around, gesturing, and talking to each other. Results were cross-validated at the student level to ensure generalization to new students. Despite the noisy nature of the data, classification was successful at levels above chance for off-task behavior (area under the receiver operating characteristic curve, AUC = .816) and for each affective state, including boredom (AUC = .610), confusion (.649), delight (.867), engagement (.679), and frustration (.631), as well as for a five-way overall classification of affect (.655). Implications and prospects for affect-sensitive interfaces for educational software in classroom environments are discussed.
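The student-level cross-validation described above, where all of a given student's data are held out together so the detector is always evaluated on unseen students, can be sketched as follows. This is an illustrative assumption-laden sketch, not the authors' implementation: the helper names, tuple layout, and rank-sum AUC computation are all choices made here for clarity.

```python
# Sketch of student-level (leave-one-student-out) cross-validation and AUC
# scoring. Data layout is hypothetical: each instance is a
# (student_id, features, label) tuple.
from collections import defaultdict


def auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney) statistic: the probability that a
    randomly chosen positive instance is scored above a random negative one,
    counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    if not pos or not neg:
        return float("nan")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


def leave_one_student_out(instances):
    """Yield (train, test) splits in which each test fold is exactly one
    student's instances, so no student appears in both train and test."""
    by_student = defaultdict(list)
    for inst in instances:
        by_student[inst[0]].append(inst)
    for student in by_student:
        test = by_student[student]
        train = [i for i in instances if i[0] != student]
        yield train, test
```

With a real detector, one would train on each `train` fold, score the matching `test` fold, pool the held-out scores, and report the pooled AUC per affective state.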


Published in
      IUI '15: Proceedings of the 20th International Conference on Intelligent User Interfaces
      March 2015
      480 pages
      ISBN:9781450333061
      DOI:10.1145/2678025

      Copyright © 2015 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Qualifiers

      • research-article

      Acceptance Rates

IUI '15 paper acceptance rate: 47 of 205 submissions (23%). Overall acceptance rate: 746 of 2,811 submissions (27%).
