DOI: 10.1145/2462476.2465594

Research article

Towards improving programming habits to create better computer science course outcomes

Published: 01 July 2013

ABSTRACT

We examine a large dataset collected by the Marmoset system in a CS2 course. The dataset gives us a richly detailed portrait of student behavior because it combines automatically collected program snapshots with unit tests that can evaluate the correctness of all snapshots. We find that students who start earlier tend to earn better scores, which is consistent with the findings of other researchers. We also detail the overall work habits exhibited by students. Finally, we evaluate how students use release tokens, a novel mechanism that provides feedback to students without giving away the code for the test cases used for grading, and gives students an incentive to start coding earlier. We find that students seem to use their tokens quite effectively to acquire feedback and improve their project score, though we do not find much evidence suggesting that students start coding particularly early.
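
To make the release-token mechanism concrete, here is a minimal sketch in Python of one way such a policy could work, assuming a small fixed token budget that regenerates on a timed cooldown and a limited-feedback report. The class and method names are hypothetical and are not taken from the Marmoset implementation; spending a token runs hidden release tests against a snapshot and reports only a pass count and the names of a few failing tests, never the test code itself.

    import time
    import unittest
    from io import StringIO


    class ReleaseTokenBucket:
        """Hypothetical sketch of a release-token policy (not the actual
        Marmoset code): a student holds a few tokens that regenerate on a
        timed cooldown; spending one runs the hidden release tests and
        returns only limited feedback."""

        def __init__(self, max_tokens=3, regen_seconds=24 * 3600, revealed_failures=2):
            self.max_tokens = max_tokens          # tokens a student can hold at once
            self.regen_seconds = regen_seconds    # cooldown before a spent token returns
            self.revealed_failures = revealed_failures  # failing test names to show
            self.spent_at = []                    # timestamps of recently spent tokens

        def tokens_available(self, now=None):
            now = time.time() if now is None else now
            # A spent token regenerates once its cooldown has elapsed.
            self.spent_at = [t for t in self.spent_at if now - t < self.regen_seconds]
            return self.max_tokens - len(self.spent_at)

        def release_test(self, snapshot_suite, now=None):
            """Spend one token to run the hidden release tests on a snapshot
            (a unittest.TestSuite built against the student's submission)."""
            now = time.time() if now is None else now
            if self.tokens_available(now) == 0:
                return "No release tokens available; tokens regenerate after the cooldown."
            self.spent_at.append(now)
            result = unittest.TextTestRunner(stream=StringIO(), verbosity=0).run(snapshot_suite)
            passed = result.testsRun - len(result.failures) - len(result.errors)
            failing = [test.id() for test, _ in result.failures + result.errors]
            shown = failing[: self.revealed_failures]
            # Report only coarse feedback: pass count plus a few failing test names.
            return (f"{passed}/{result.testsRun} release tests passed; "
                    f"failing tests shown ({len(shown)} of {len(failing)}): {shown}")

Under a policy like this, feedback is scarce enough that submitting early and often is rewarded, which is the incentive the abstract describes.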


Published in

ITiCSE '13: Proceedings of the 18th ACM conference on Innovation and technology in computer science education
July 2013, 384 pages
ISBN: 9781450320788
DOI: 10.1145/2462476
Copyright © 2013 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ITiCSE '13 paper acceptance rate: 51 of 161 submissions (32%). Overall acceptance rate: 552 of 1,613 submissions (34%).
