DOI: 10.1145/2702123.2702304
Research article

Mudslide: A Spatially Anchored Census of Student Confusion for Online Lecture Videos

Published: 18 April 2015

ABSTRACT

Educators have developed an effective technique to get feedback after in-person lectures, called "muddy cards." Students are given time to reflect and write the "muddiest" (least clear) point on an index card, to hand in as they leave class. This practice of assigning end-of-lecture reflection tasks to generate explicit student feedback is well suited for adaptation to the challenge of supporting feedback in online video lectures. We describe the design and evaluation of Mudslide, a prototype system that translates the practice of muddy cards into the realm of online lecture videos. Based on an in-lab study of students and teachers, we find that spatially contextualizing students' muddy point feedback with respect to particular lecture slides is advantageous to both students and teachers. We also reflect on further opportunities for enhancing this feedback method based on teachers' and students' experiences with our prototype.


Published in

CHI '15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
April 2015, 4290 pages
ISBN: 9781450331456
DOI: 10.1145/2702123

Copyright © 2015 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


      Acceptance Rates

CHI '15 Paper Acceptance Rate: 486 of 2,120 submissions, 23%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
