Research Article · CHI Conference Proceedings
DOI: 10.1145/2702123.2702186

RIMES: Embedding Interactive Multimedia Exercises in Lecture Videos

Published: 18 April 2015

ABSTRACT

Teachers in conventional classrooms often ask learners to express themselves and show their thought processes by speaking out loud, drawing on a whiteboard, or even using physical objects. Despite the pedagogical value of such activities, interactive exercises available in most online learning platforms are constrained to multiple-choice and short answer questions. We introduce RIMES, a system for easily authoring, recording, and reviewing interactive multimedia exercises embedded in lecture videos. With RIMES, teachers can prompt learners to record their responses to an activity using video, audio, and inking while watching lecture videos. Teachers can then review and interact with all the learners' responses in an aggregated gallery. We evaluated RIMES with 19 teachers and 25 students. Teachers created a diverse set of activities across multiple subjects that tested deep conceptual and procedural knowledge. Teachers found the exercises useful for capturing students' thought processes, identifying misconceptions, and engaging students with content.
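
The workflow described above (a prompt embedded at a chosen point in a lecture video, learner responses recorded as video, audio, or ink, and an aggregated gallery for teacher review) can be pictured as a small data model. The TypeScript sketch below is illustrative only: it is not the authors' implementation, and every type, field, and function name is an assumption made for this example.

// Hypothetical data model for a RIMES-style embedded exercise (illustrative sketch,
// not the authors' implementation; all names are assumptions).

// An exercise a teacher embeds at a chosen point in a lecture video.
interface ExercisePrompt {
  exerciseId: string;
  videoId: string;                                // lecture video the exercise is embedded in
  pauseAtSeconds: number;                         // playback position where the video pauses for the activity
  promptText: string;                             // teacher's question or instructions
  allowedMedia: Array<"video" | "audio" | "ink">; // response modalities the teacher enables
}

// One learner response, captured while the lecture video is paused.
interface LearnerResponse {
  exerciseId: string;
  learnerId: string;
  recordingUrl?: string;                          // webcam/microphone capture, if any
  inkStrokes?: { x: number; y: number; t: number }[][]; // pen strokes drawn over the prompt canvas
  submittedAt: Date;
}

// Group responses by exercise so a teacher can review them as an aggregated gallery.
function buildGallery(responses: LearnerResponse[]): Map<string, LearnerResponse[]> {
  const gallery = new Map<string, LearnerResponse[]>();
  for (const response of responses) {
    const bucket = gallery.get(response.exerciseId) ?? [];
    bucket.push(response);
    gallery.set(response.exerciseId, bucket);
  }
  return gallery;
}

A review interface along these lines would call buildGallery over a class's submissions and render one gallery per embedded exercise, which corresponds to the aggregated-review step the abstract describes.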


Supplemental Material

pn0382-file3.mp4 (MP4, 62 MB)



Published in

CHI '15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
April 2015, 4290 pages
ISBN: 9781450331456
DOI: 10.1145/2702123

      Copyright © 2015 ACM


      Publisher

Association for Computing Machinery, New York, NY, United States




      Acceptance Rates

CHI '15 paper acceptance rate: 486 of 2,120 submissions (23%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)
