ABSTRACT
In college-level introductory computer science courses, students' programming ability is often evaluated through pseudocode responses to prompts. However, this does not reflect modern programming practice in industry and academia, where developers have access to compilers to test snippets of code on the fly. As a result, pseudocode prompts may not capture the full range of student capabilities, because they lack the support tools usually available when writing programs. An assessment environment in which students can write, compile, and run code could provide a more comfortable and familiar experience that more accurately captures their abilities. Prior work has found improved student performance when digital assessments replace paper-based assessments for pseudocode prompts, but little work has examined the difference between digital pseudocode prompts and compile-and-run prompts. To investigate the impact of the assessment approach on student experience and performance, we conducted a study across two introductory programming classes at a public university (N=226). We found that students both preferred and performed better on typical programming assessment questions when they used a compile-and-run environment rather than a pseudocode environment. Our work suggests that compile-and-run assessments yield a more nuanced evaluation of student ability by more closely reflecting the environments of programming practice, and it supports further work on how programming assessments are administered.
Pseudocode vs. Compile-and-Run Prompts: Comparing Measures of Student Programming Ability in CS1 and CS2