ABSTRACT
Affect detection is a key component in developing intelligent educational interfaces that are capable of responding to the affective needs of students. In this paper, computer vision and machine learning techniques were used to detect students' affect as they used an educational game designed to teach fundamental principles of Newtonian physics. Data were collected in the real-world environment of a school computer lab, which poses unique challenges for detecting affect from facial expressions (primary channel) and gross body movements (secondary channel): up to thirty students at a time participated in the class, moving around, gesturing, and talking to each other. Results were cross-validated at the student level to ensure generalization to new students. Despite the noisy nature of the data, classification was successful at above-chance levels for off-task behavior (area under the receiver operating characteristic curve, AUC = .816) and for each affective state, including boredom (AUC = .610), confusion (.649), delight (.867), engagement (.679), and frustration (.631), as well as for a five-way overall classification of affect (.655). Implications and prospects for affect-sensitive interfaces for educational software in classroom environments are discussed.
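The student-level cross-validation mentioned above means that all observations from a given student fall entirely in either the training or the test partition, so reported AUCs reflect performance on unseen students. A minimal sketch of this evaluation scheme, assuming scikit-learn and using synthetic stand-ins for the paper's facial and body-movement features:

```python
# Hypothetical sketch of student-level cross-validated AUC, assuming
# scikit-learn. X, y, and `students` are synthetic stand-ins, not the
# paper's actual features or labels.
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 8))                          # e.g. facial feature estimates
y = (X[:, 0] + rng.normal(size=n) > 0).astype(int)   # e.g. off-task vs. on-task
students = rng.integers(0, 30, size=n)               # up to 30 students in the lab

# GroupKFold keeps each student's data within a single fold, so every
# test fold contains only students the classifier has never seen.
aucs = []
for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups=students):
    clf = LogisticRegression().fit(X[train_idx], y[train_idx])
    scores = clf.predict_proba(X[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"mean AUC across folds: {np.mean(aucs):.3f}")
```

Chance performance corresponds to AUC = .5 regardless of class imbalance, which is why the paper reports AUC rather than raw accuracy for these skewed affect distributions.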
Index Terms: Automatic Detection of Learning-Centered Affective States in the Wild