Tracking real-time user experience (TRUE): a comprehensive instrumentation solution for complex systems

Published: 6 April 2008

ABSTRACT

Automatic recording of user behavior within a system (instrumentation) to develop and test theories has a rich history in psychology and system design. Often, researchers analyze instrumented behavior in isolation from other data. The problem with collecting instrumented behaviors without attitudinal, demographic, and contextual data is that researchers have no way to answer the 'why' behind the 'what'. We have combined the collection and analysis of behavioral instrumentation with other HCI methods to develop a system for Tracking Real-Time User Experience (TRUE). Using two case studies as examples, we demonstrate how we have evolved instrumentation methodology and analysis to extensively improve the design of video games. It is our hope that TRUE is adopted and adapted by the broader HCI community, becoming a useful tool for gaining deep insights into user behavior and improvement of design for other complex systems.
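The core idea described above — recording timestamped behavioral events alongside attitudinal survey responses, so the "why" can be analyzed next to the "what" — can be illustrated with a minimal sketch. This is a hypothetical logger written for illustration; the class name, event names, and record schema are assumptions, not the paper's actual implementation.

```python
import json
import time

class TrueStyleLogger:
    """Hypothetical sketch of TRUE-style instrumentation: interleaving
    behavioral events with in-session attitudinal survey responses."""

    def __init__(self, session_id):
        self.session_id = session_id
        self.records = []

    def log_event(self, name, **context):
        # Behavioral record: what the user did, with contextual data
        # (e.g. level name, cause of failure) attached to the event.
        self.records.append({
            "kind": "event",
            "name": name,
            "time": time.time(),
            "context": context,
        })

    def log_survey(self, question, answer):
        # Attitudinal record: an in-session survey response, captured
        # in the same stream so it can be aligned with nearby events.
        self.records.append({
            "kind": "survey",
            "question": question,
            "answer": answer,
            "time": time.time(),
        })

    def export(self):
        # Serialize the combined stream for later analysis.
        return json.dumps({"session": self.session_id,
                           "records": self.records})

# Usage: a player dies, then rates the encounter in an in-game prompt.
log = TrueStyleLogger("session-001")
log.log_event("player_death", level="mission_2", cause="fall_damage")
log.log_survey("How frustrating was that encounter? (1-5)", 4)
data = json.loads(log.export())
```

Because both record kinds share one timestamped stream, an analyst can join a frustration rating to the specific behavioral events that preceded it, rather than analyzing instrumented behavior in isolation.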


Published in

CHI '08: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2008, 1870 pages
ISBN: 9781605580111
DOI: 10.1145/1357054

Copyright © 2008 ACM


Publisher

Association for Computing Machinery, New York, NY, United States



      Acceptance Rates

CHI '08 paper acceptance rate: 157 of 714 submissions (22%). Overall acceptance rate: 6,199 of 26,314 submissions (24%).
