DOI: 10.1145/2930238.2930257

research-article, Public Access

Gender Differences in Facial Expressions of Affect During Learning

Published: 13 July 2016

ABSTRACT

Affective support is crucial during learning, with recent evidence suggesting it is particularly important for female students. Facial expression is a rich channel for affect detection, but a key open question is how facial displays of affect differ by gender during learning. This paper presents an analysis suggesting that facial expressions for women and men differ systematically during learning. Using facial video automatically tagged with facial action units, we find that despite no differences between genders in incoming knowledge, self-efficacy, or personality profile, women displayed one lower facial action unit significantly more than men, while men displayed brow lowering and lip fidgeting more than women. However, numerous facial actions including brow raising and nose wrinkling were strongly correlated with learning in women, whereas only one facial action unit, eyelid raiser, was associated with learning for men. These results suggest that the entire affect adaptation pipeline, from detection to response, may benefit from gender-specific models in order to support students more effectively.
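The analysis the abstract describes — comparing how often each gender displays a given facial action unit (AU) and correlating AU activity with learning within each gender — can be sketched as follows. This is a hypothetical illustration, not the authors' actual pipeline: the per-student AU intensities and learning gains below are invented, and in practice the AU values would come from an automatic tagger such as CERT.

```python
from statistics import mean

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length sequences.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical per-student records: (gender, mean AU4 "brow lowerer"
# intensity over the session, normalized learning gain).
students = [
    ("F", 0.12, 0.60), ("F", 0.30, 0.35), ("F", 0.22, 0.50), ("F", 0.05, 0.70),
    ("M", 0.40, 0.45), ("M", 0.35, 0.50), ("M", 0.28, 0.55), ("M", 0.45, 0.40),
]

# Per gender: how much the AU is displayed, and how it relates to learning.
for g in ("F", "M"):
    au = [a for s, a, _ in students if s == g]
    gain = [l for s, _, l in students if s == g]
    print(g, "mean AU4:", round(mean(au), 3),
          "r(AU4, gain):", round(pearson(au, gain), 3))
```

A real analysis would repeat this per AU, test group differences for significance, and control for the incoming-knowledge, self-efficacy, and personality measures mentioned above.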

References

  1. I. Arroyo, W. Burleson, M. Tai, K. Muldner, and B. P. Woolf. Gender Differences in the Use and Benefit of Advanced Learning Technologies for Mathematics. Journal of Educational Psychology, 105(4):957--969, 2013.
  2. I. Arroyo, D. G. Cooper, W. Burleson, B. P. Woolf, K. Muldner, and R. M. Christopherson. Emotion Sensors Go To School. In Proceedings of the 14th International Conference on Artificial Intelligence in Education, pages 17--24, 2009.
  3. I. Arroyo, B. P. Woolf, D. G. Cooper, W. Burleson, and K. Muldner. The impact of animated pedagogical agents on girls' and boys' emotions, attitudes, behaviors and learning. In Proceedings of the 11th International Conference on Advanced Learning Technologies, pages 506--510, 2011.
  4. R. S. J. d. Baker, S. K. D'Mello, M. T. Rodrigo, and A. C. Graesser. Better to be frustrated than bored: The incidence, persistence, and impact of learners' cognitive-affective states during interactions with three different computer-based learning environments. International Journal of Human-Computer Studies, 68(4):223--241, 2010.
  5. L. R. Brody and J. A. Hall. Gender and emotion in context. Handbook of Emotions, 3:395--408, 2008.
  6. W. Burleson and R. W. Picard. Gender-specific approaches to developing emotionally intelligent learning companions. IEEE Intelligent Systems, 22(4):62--69, 2007.
  7. R. A. Calvo and S. K. D'Mello. New Perspectives on Affect and Learning Technologies. Springer Science & Business Media, 2011.
  8. G. Chen, S. M. Gully, and D. Eden. Validation of a new general self-efficacy scale. Organizational Research Methods, 4(1):62--83, 2001.
  9. M. Chi, K. VanLehn, D. J. Litman, and P. W. Jordan. Empirically evaluating the application of reinforcement learning to the induction of effective and adaptive pedagogical strategies. User Modeling and User-Adapted Interaction, 21(1-2):137--180, 2011.
  10. S. Corrigan, T. Barkley, and Z. Pardos. Dynamic Approaches to Modeling Student Affect and its Changing Role in Learning and Performance. In Proceedings of the 23rd International Conference on User Modeling, Adaptation, and Personalization, pages 92--103, 2015.
  11. S. Craig, S. D'Mello, A. Witherspoon, and A. Graesser. Emote-Aloud during Learning with AutoTutor: Applying the Facial Action Coding System to Cognitive-Affective States during Learning. Cognition and Emotion, 22(5):777--788, 2008.
  12. S. D. Craig, A. C. Graesser, J. Sullins, and B. Gholson. Affect and learning: An exploratory look into the role of affect in learning with AutoTutor. Journal of Educational Media, 29(3):241--250, 2004.
  13. F. De la Torre and J. F. Cohn. Facial expression analysis. Visual Analysis of Humans, pages 377--409, 2011.
  14. M. Dennis, J. Masthoff, and C. Mellish. Adapting Performance Feedback to a Learner's Conscientiousness. In Proceedings of the 20th International Conference on User Modeling, Adaptation, and Personalization, pages 297--302, 2012.
  15. M. C. Desmarais and R. S. J. d. Baker. A review of recent advances in learner and skill modeling in intelligent learning environments. User Modeling and User-Adapted Interaction, 22(1-2):9--38, 2012.
  16. U. Dimberg and L.-O. Lundquist. Gender differences in facial reactions to facial expressions. Biological Psychology, 30(2):151--159, 1990.
  17. S. D'Mello, B. Lehman, R. Pekrun, and A. Graesser. Confusion can be beneficial for learning. Learning and Instruction, 29:153--170, 2014.
  18. S. K. D'Mello. A Selective Meta-analysis on the Relative Incidence of Discrete Affective States during Learning with Technology. Journal of Educational Psychology, 105(4):1082--1099, 2013.
  19. S. K. D'Mello and A. C. Graesser. AutoTutor and affective AutoTutor: Learning by talking with cognitively and emotionally intelligent computers that talk back. ACM Transactions on Interactive Intelligent Systems, 2:1--39, 2012.
  20. P. Ekman, W. V. Friesen, and P. Ellsworth. Emotion in the Human Face: Guidelines for Research and an Integration of Findings. Pergamon Press, 1972.
  21. P. Ekman, W. V. Friesen, and J. C. Hager. Facial Action Coding System: Investigator's Guide. 2002.
  22. L. R. Goldberg. The structure of phenotypic personality traits. American Psychologist, 48(1):26--34, 1993.
  23. J. F. Grafsgaard, K. E. Boyer, and J. C. Lester. Predicting facial indicators of confusion with hidden Markov models. In Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction, pages 97--106, 2011.
  24. J. F. Grafsgaard, S. Y. Lee, B. W. Mott, K. E. Boyer, and J. C. Lester. Modeling Self-Efficacy Across Age Groups with Automatically Tracked Facial Expression. In Proceedings of the 17th International Conference on Artificial Intelligence in Education, pages 582--585, 2015.
  25. J. F. Grafsgaard, J. B. Wiggins, K. E. Boyer, E. N. Wiebe, and J. C. Lester. Automatically Recognizing Facial Expression: Predicting Engagement and Frustration. In Proceedings of the 6th International Conference on Educational Data Mining, pages 43--50, 2013.
  26. J. F. Grafsgaard, J. B. Wiggins, A. K. Vail, K. E. Boyer, and J. C. Lester. The Additive Value of Multimodal Features for Predicting Engagement, Frustration, and Learning during Tutoring. In Proceedings of the 16th ACM International Conference on Multimodal Interaction, pages 42--49, 2014.
  27. D. T. Green, T. J. Walsh, P. R. Cohen, C. R. Beal, and Y.-H. Chang. Gender Differences and the Value of Choice in Intelligent Tutoring Systems. In Proceedings of the 19th International Conference on User Modeling, Adaptation, and Personalization, pages 341--346, 2011.
  28. E. Y. Ha, J. F. Grafsgaard, C. M. Mitchell, K. E. Boyer, and J. C. Lester. Combining Verbal and Nonverbal Features to Overcome the 'Information Gap' in Task-Oriented Dialogue. In Proceedings of the 13th Annual SIGDIAL Meeting on Discourse and Dialogue, pages 247--256, 2012.
  29. J. A. Hall. Gender effects in decoding nonverbal cues. Psychological Bulletin, 85(4):845, 1978.
  30. J. A. Hall and D. Matsumoto. Gender differences in judgments of multiple emotions from facial expressions. Emotion, 4(2):201, 2004.
  31. O. P. John and S. Srivastava. The Big Five trait taxonomy: History, measurement, and theoretical perspectives. Handbook of Personality: Theory and Research, 2:102--138, 1999.
  32. D. Keltner. Signs of appeasement: Evidence for the distinct displays of embarrassment, amusement, and shame. Journal of Personality and Social Psychology, 68(3):441--454, 1995.
  33. M. LaFrance and M. A. Hecht. Option or obligation to smile: The effects of power and gender on facial expression. The Social Context of Nonverbal Behavior: Studies in Emotion and Social Interaction, page 431, 1999.
  34. M. R. Lepper and M. Woolverton. The wisdom of practice: Lessons learned from the study of highly effective tutors. Improving Academic Achievement: Impact of Psychological Factors on Education, pages 135--158, 2002.
  35. G. Littlewort, J. Whitehill, T. Wu, I. Fasel, M. Frank, J. Movellan, and M. Bartlett. The computer expression recognition toolbox (CERT). In Proceedings of the 11th International Conference on Automatic Face & Gesture Recognition and Workshops, pages 298--305, 2011.
  36. G. C. Littlewort, M. S. Bartlett, L. P. Salamanca, and J. Reilly. Automated measurement of children's facial expressions during problem solving tasks. In Proceedings of the International Conference on Automatic Face & Gesture Recognition and Workshops, pages 30--35, 2011.
  37. C. M. Mitchell, E. Y. Ha, K. E. Boyer, and J. C. Lester. Learner characteristics and dialogue: recognising effective and student-adaptive tutorial strategies. International Journal of Learning Technology, 8(4):382--403, 2013.
  38. A. Mitrovic. Fifteen years of constraint-based tutors: What we have achieved and where we are going. User Modeling and User-Adapted Interaction, 22(1-2):39--72, 2012.
  39. M. O. Z. San Pedro, R. S. J. d. Baker, S. M. Gowda, and N. T. Heffernan. Towards an Understanding of Affect and Knowledge from Student Interaction with an Intelligent Tutoring System. In Proceedings of the 16th International Conference on Artificial Intelligence in Education, pages 41--50, 2013.
  40. A. K. Vail, K. E. Boyer, E. N. Wiebe, and J. C. Lester. The Mars and Venus Effect: The Influence of User Gender on the Effectiveness of Adaptive Task Support. In Proceedings of the 23rd International Conference on User Modeling, Adaptation, and Personalization, pages 216--227, 2015.
  41. K. VanLehn. The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist, 46(4):197--221, 2011.
  42. K. VanLehn, A. C. Graesser, G. T. Jackson, P. W. Jordan, A. Olney, and C. P. Rose. When Are Tutorial Dialogues More Effective Than Reading? Cognitive Science, 31(1):3--62, 2007.
  43. M. Wixon and I. Arroyo. When the Question is Part of the Answer: Examining the Impact of Emotion Self-reports on Student Emotion. In Proceedings of the 22nd International Conference on User Modeling, Adaptation, and Personalization, pages 471--477, 2014.
  44. B. P. Woolf, W. Burleson, I. Arroyo, T. Dragon, D. G. Cooper, and R. W. Picard. Affect-Aware Tutors: Recognizing and Responding to Student Affect. International Journal of Learning Technology, 4(3--4):129--164, 2009.

Published in
            UMAP '16: Proceedings of the 2016 Conference on User Modeling Adaptation and Personalization
            July 2016
            366 pages
            ISBN:9781450343688
            DOI:10.1145/2930238

            Copyright © 2016 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


            Acceptance Rates

UMAP '16 paper acceptance rate: 21 of 123 submissions (17%). Overall acceptance rate: 162 of 633 submissions (26%).
