Abstract
Usability inspection techniques are widely used, but few focus on users' thinking, and many are appropriate only for particular devices and use contexts. We present a new technique (MOT) that guides inspection by metaphors of human thinking. The metaphors concern habit, the stream of thought, awareness and associations, the relation between utterances and thought, and knowing. The main novelty of MOT is its psychological basis combined with its use of metaphors to stimulate inspection. The first of three experiments shows that usability problems uncovered with MOT are more serious and more complex to repair than problems found with heuristic evaluation. Problems found with MOT are also judged more likely to persist for expert users. The second experiment shows that MOT finds more problems than cognitive walkthrough and covers a wider range of a reference collection of usability problems. Participants prefer using MOT over cognitive walkthrough, an important reason being MOT's wider scope. The third experiment compares MOT, cognitive walkthrough, and think-aloud testing in the context of nontraditional user interfaces. Participants prefer think-aloud testing, but identify few problems with that technique that are not also found with MOT or cognitive walkthrough. MOT identifies more problems than the other techniques. Across experiments and measures of the utility of usability problems in systems design, MOT performs better than existing inspection techniques and is comparable to think-aloud testing.