research-article · Public Access
DOI: 10.1145/2960310.2960316

Replication, Validation, and Use of a Language Independent CS1 Knowledge Assessment

Published: 25 August 2016

ABSTRACT

Computing education lags other discipline-based education research in the number and range of validated assessments available to the research community. Validated assessments help researchers reduce experimental error due to flawed instruments and allow comparisons between different experiments. Although the need is great, building assessments from scratch is difficult. Once an assessment is built, it is important to be able to replicate it in order to address problems within it or to extend it. We developed the Second CS1 Assessment (SCS1) as an isomorphic version of a previously validated language-independent assessment for introductory computer science, the FCS1. Replicating the FCS1 is important to enable free use by a broader research community. This paper documents our process for replicating an existing validated assessment and for validating the success of our replication. We present initial uses of SCS1 by other research groups as examples of where it might be applied in the future. SCS1 is useful for researchers, but care must be taken to avoid undermining its validity argument.


Published in

ICER '16: Proceedings of the 2016 ACM Conference on International Computing Education Research
August 2016, 310 pages
ISBN: 9781450344494
DOI: 10.1145/2960310

Copyright © 2016 ACM

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ICER '16 paper acceptance rate: 26 of 102 submissions (25%). Overall acceptance rate: 189 of 803 submissions (24%).
