ABSTRACT
Automatic recording of user behavior within a system (instrumentation) to develop and test theories has a rich history in psychology and system design. Often, however, researchers analyze instrumented behavior in isolation from other data. The problem with collecting instrumented behaviors without attitudinal, demographic, and contextual data is that researchers have no way to answer the 'why' behind the 'what'. We have combined the collection and analysis of behavioral instrumentation with other HCI methods to develop a system for Tracking Real-Time User Experience (TRUE). Using two case studies as examples, we demonstrate how we have evolved instrumentation methodology and analysis to substantially improve the design of video games. It is our hope that TRUE is adopted and adapted by the broader HCI community, becoming a useful tool for gaining deep insights into user behavior and improving the design of other complex systems.
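The core idea of combining the behavioral 'what' with the attitudinal 'why' can be sketched as a single instrumentation stream in which timestamped gameplay events and in-session survey responses share one session timeline, so the two can be joined at analysis time. This is a minimal illustrative sketch, not the TRUE implementation; the class and field names (`TrueLogger`, `log_event`, `log_attitude`) are hypothetical.

```python
import json
import time
from dataclasses import dataclass, field

@dataclass
class TrueLogger:
    """Hypothetical TRUE-style logger: behavioral events and attitudinal
    responses share a session id and timeline, so the 'what' (events)
    can be joined to the 'why' (attitudes) during analysis."""
    session_id: str
    records: list = field(default_factory=list)

    def log_event(self, name, **context):
        # Behavioral record: what the user did, plus contextual data.
        self.records.append({
            "kind": "event", "t": time.time(),
            "name": name, "context": context,
        })

    def log_attitude(self, question, rating):
        # Attitudinal record: an in-session survey response.
        self.records.append({
            "kind": "attitude", "t": time.time(),
            "question": question, "rating": rating,
        })

    def export(self):
        # Serialize the unified stream for downstream analysis tools.
        return json.dumps({"session": self.session_id,
                           "records": self.records})

log = TrueLogger(session_id="playtest-01")
log.log_event("player_death", level="mission2", cause="fall")
log.log_attitude("How frustrating was that section? (1-5)", 4)
print(log.export())
```

Keeping both record kinds in one time-ordered stream is what lets an analyst ask, for example, whether repeated deaths in a level coincide with spikes in reported frustration.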