
The Canterbury QuestionBank: building a repository of multiple-choice CS1 and CS2 questions

Research article. Published: 29 June 2013. DOI: 10.1145/2543882.2543885

ABSTRACT

In this paper, we report on an ITiCSE-13 Working Group that developed a set of 654 multiple-choice questions on CS1 and CS2 topics, the Canterbury QuestionBank. We describe the questions, the metadata we investigated, and some preliminary investigations of possible research uses of the QuestionBank. The QuestionBank is publicly available as a repository for computing education instructors and researchers.
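
The abstract notes that the questions carry classification metadata. As a purely illustrative sketch (this page does not specify the repository's actual format, and every field name below is an assumption), a tagged multiple-choice item might be represented like this in Python:

    from dataclasses import dataclass, field
    from typing import Optional

    # Hypothetical sketch only: the QuestionBank's real storage format and
    # metadata fields are described in the full report, not on this page.
    # The field names here are assumptions chosen for illustration.
    @dataclass
    class MCQuestion:
        stem: str                       # question text, possibly including a code fragment
        options: list[str]              # candidate answers shown to the student
        correct: int                    # index of the keyed (correct) answer in options
        course: str                     # e.g. "CS1" or "CS2"
        topic: str                      # e.g. "recursion", "linked lists"
        language: Optional[str] = None  # language of any embedded code, if present
        tags: list[str] = field(default_factory=list)  # further classification metadata

    # Example item (invented for illustration, not taken from the QuestionBank).
    q = MCQuestion(
        stem="What does the Java expression 7 / 2 evaluate to?",
        options=["3", "3.5", "4", "a compile-time error"],
        correct=0,
        course="CS1",
        topic="integer arithmetic",
        language="Java",
    )
    assert q.options[q.correct] == "3"

A real entry would follow whatever schema the working group adopted; the point is only the kind of item-plus-metadata record the abstract describes.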

Published in

ITiCSE-WGR '13: Proceedings of the ITiCSE Working Group Reports Conference on Innovation and Technology in Computer Science Education
June 2013, 80 pages
ISBN: 9781450326650
DOI: 10.1145/2543882
Copyright © 2013 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Acceptance Rates

ITiCSE-WGR '13 paper acceptance rate: 4 of 4 submissions, 100%. Overall acceptance rate: 552 of 1,613 submissions, 34%.
