Developing a validated assessment of fundamental CS1 concepts

ABSTRACT
Previous studies of student programming ability have raised questions about students' ability to solve problems, read and analyze code, and understand introductory computing concepts. However, it is unclear whether these results reflect genuine failures of student comprehension or our inability to accurately measure student performance. We propose a method for creating a language-independent CS1 assessment instrument and present the results of the analysis used to define the common conceptual content that will serve as the framework for the exam. We conclude with a discussion of future work and our progress toward developing the assessment.