DOI: 10.1145/2675133.2675245
Research article

Massive Open Online Proctor: Protecting the Credibility of MOOCs certificates

Published: 28 February 2015

ABSTRACT

Massive Open Online Courses (MOOCs) enable everyone to receive high-quality education. However, current MOOC providers lack an effective, economical, and scalable method to detect cheating on tests, which would be required for any credible certification. In this paper, we propose a Massive Open Online Proctoring (MOOP) framework, which combines automatic and collaborative approaches to detect cheating behaviors in online tests. The MOOP framework consists of three major components: the Automatic Cheating Detector (ACD), the Peer Cheating Detector (PCD), and the Final Review Committee (FRC). The ACD uses webcam video or other sensors to monitor students and automatically flag suspected cheating behavior. Ambiguous cases are then sent to the PCD, where students peer-review the flagged webcam video to confirm suspected cheating. Finally, the list of confirmed suspicious behaviors is sent to the FRC, which makes the final disciplinary decision. Our experiments show that the ACD and PCD can detect the use of a cheat sheet with good accuracy and can reduce the overall human effort required to monitor MOOCs for cheating.
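The framework described above is essentially a three-stage triage pipeline: the ACD scores each test session automatically, ambiguous sessions are routed to peer review, and only confirmed cases reach the committee. The Python sketch below illustrates that routing under stated assumptions; the thresholds, vote counts, and all identifiers (`acd_score`, `peer_votes`, and so on) are illustrative inventions, not the authors' implementation.

```python
# Illustrative sketch of the MOOP triage pipeline (ACD -> PCD -> FRC).
# All names, thresholds, and data structures are assumptions for
# illustration; the paper does not specify an implementation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TestSession:
    student_id: str
    acd_score: float = 0.0                                 # assumed cheating-likelihood score in [0, 1]
    peer_votes: List[bool] = field(default_factory=list)   # True = peer reviewer saw cheating

CLEAR_CHEATING = 0.9   # assumed threshold: flag directly for committee review
AMBIGUOUS = 0.5        # assumed threshold: send to peer review

def acd(session: TestSession) -> str:
    """Automatic Cheating Detector: classify a session from its webcam-based score."""
    if session.acd_score >= CLEAR_CHEATING:
        return "flagged"
    if session.acd_score >= AMBIGUOUS:
        return "ambiguous"
    return "clean"

def pcd(session: TestSession, min_votes: int = 3) -> str:
    """Peer Cheating Detector: confirm ambiguous cases by a majority of peer reviewers."""
    if len(session.peer_votes) < min_votes:
        return "pending"
    cheating = sum(session.peer_votes)
    return "flagged" if cheating > len(session.peer_votes) / 2 else "clean"

def triage(sessions: List[TestSession]) -> List[TestSession]:
    """Return the sessions the Final Review Committee (FRC) must decide on."""
    for_frc = []
    for s in sessions:
        verdict = acd(s)
        if verdict == "ambiguous":
            verdict = pcd(s)
        if verdict == "flagged":
            for_frc.append(s)
    return for_frc
```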

Published in

CSCW '15: Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing
February 2015, 1956 pages
ISBN: 9781450329224
DOI: 10.1145/2675133

Copyright © 2015 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

CSCW '15 paper acceptance rate: 161 of 575 submissions (28%). Overall acceptance rate: 2,235 of 8,521 submissions (26%).
