DOI: 10.1145/1880071.1880109
Research article

Design, implementation, and evaluation of an approach for determining when programmers are having difficulty

Published: 07 November 2010

ABSTRACT

Previous research has motivated the idea of automatically determining when programmers are having difficulty, provided an initial algorithm (unimplemented in an actual system), and performed a small student-based evaluation to justify the viability of this concept. We have taken the next step in this line of research by designing and developing two different systems that incorporate variations of the algorithm, implementing a tool that allows independent observers to code recorded sessions, and performing studies involving both student and industrial programmers. Our work shows that (a) it is possible to develop an efficient and reusable architecture for predicting programmer status, (b) the previous technique can be improved through aggregation of predicted status, (c) the improved technique correlates more with programmers' perception of whether they are stuck than with that of observers manually watching the programmers, (d) the observers are quicker than the developers to conclude that programmers are stuck, (e) with appropriate training, the tool can be used to predict even the observers' perceptions, and (f) a group training model offers more accuracy than an individual one when the training and test exercises are the same and carried out over a small time frame.
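The abstract does not include code, but the idea of "aggregation of predicted status" can be illustrated with a small sketch: rather than surfacing every raw per-interval prediction, a tool can smooth predictions over a sliding window and report difficulty only when a majority of recent intervals were classified as difficult. The Java sketch below is a hypothetical illustration under that assumption; the class name, window size, and majority rule are not taken from the paper.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal sketch of window-based aggregation of per-interval difficulty
// predictions. The window size and the majority rule are illustrative
// assumptions, not parameters from the paper.
public class StatusAggregator {
    private final int windowSize;
    private final Deque<Boolean> window = new ArrayDeque<>();
    private int difficultCount = 0;

    public StatusAggregator(int windowSize) {
        this.windowSize = windowSize;
    }

    /** Feed one raw prediction (true = "having difficulty"); return the aggregated status. */
    public boolean addPrediction(boolean predictedDifficult) {
        window.addLast(predictedDifficult);
        if (predictedDifficult) {
            difficultCount++;
        }
        if (window.size() > windowSize) {
            if (window.removeFirst()) {
                difficultCount--;
            }
        }
        // Report difficulty only when most of the recent predictions agree.
        return difficultCount * 2 > window.size();
    }

    public static void main(String[] args) {
        StatusAggregator aggregator = new StatusAggregator(5);
        boolean[] rawPredictions = {false, true, false, true, true, true, false};
        for (boolean p : rawPredictions) {
            System.out.println("raw=" + p + " aggregated=" + aggregator.addPrediction(p));
        }
    }
}
```

Any smoothing rule of this kind serves the same purpose: it suppresses isolated, spurious "stuck" predictions so that the reported status changes only when the evidence persists across several intervals.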

      • Published in

        GROUP '10: Proceedings of the 2010 ACM International Conference on Supporting Group Work
        November 2010
        378 pages
        ISBN: 9781450303873
        DOI: 10.1145/1880071

        Copyright © 2010 ACM


        Publisher

        Association for Computing Machinery

        New York, NY, United States



        Acceptance Rates

        Overall Acceptance Rate: 125 of 405 submissions, 31%
