Abstract
Systems that automatically assess student programming assignments have been designed and used for over forty years. Systems that objectively test and mark student programming work appeared almost as soon as programming itself entered the computer science curriculum. This article reviews a number of influential automatic assessment systems, from descriptions of the earliest tools through the most recent developments. The final sections explore directions that automated assessment systems may take, presenting current work alongside a number of important emerging e-learning specifications.