DOI: 10.1145/2556325.2566246
Research article · Open Access

Monitoring MOOCs: which information sources do instructors value?

Published: 04 March 2014

ABSTRACT

For an instructor who is teaching a massive open online course (MOOC), what is the best way to understand the class? What is the best way to view how the students are interacting with the content while the course is running? To help prepare for the next iteration, how should the course's data best be analyzed after the fact? How do these instructional monitoring needs differ between online courses with tens of thousands of students and courses with only tens? This paper reports the results of a survey of 92 MOOC instructors who answered questions about which information sources they find useful in their courses, with the end goal of creating an information display for MOOC instructors.

The main findings are: (i) quantitative data sources such as grades, although useful, are not sufficient; a large majority of respondents rated understanding of discussion-forum activity and student surveys as useful for all use cases; (ii) chat logs were not seen as useful; (iii) for the most part, the same information sources were seen as useful as in earlier surveys of smaller online courses; (iv) mockups of existing and novel visualization techniques received positive responses, both for use while the course is running and for planning a revision of the course; and (v) respondents expressed a wide range of views about other details.


    • Published in

      L@S '14: Proceedings of the first ACM conference on Learning @ scale
      March 2014, 234 pages
      ISBN: 9781450326698
      DOI: 10.1145/2556325

      Copyright © 2014 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery, New York, NY, United States


      Acceptance Rates

      L@S '14 Paper Acceptance Rate: 14 of 38 submissions (37%). Overall Acceptance Rate: 117 of 440 submissions (27%).

