Metaphors of human thinking for usability inspection and design

Published: 19 January 2008

Abstract

Usability inspection techniques are widely used, but few focus on users' thinking, and many are appropriate only for particular devices and use contexts. We present a new technique (MOT) that guides inspection by metaphors of human thinking. The metaphors concern habit, the stream of thought, awareness and associations, the relation between utterances and thought, and knowing. The main novelty of MOT is its psychological basis combined with its use of metaphors to stimulate inspection. The first of three experiments shows that usability problems uncovered with MOT are more serious and more complex to repair than problems found with heuristic evaluation. Problems found with MOT are also judged more likely to persist for expert users. The second experiment shows that MOT finds more problems than cognitive walkthrough and has a wider coverage of a reference collection of usability problems. Participants prefer using MOT over cognitive walkthrough; an important reason is the wider scope of MOT. The third experiment compares MOT, cognitive walkthrough, and think-aloud testing in the context of nontraditional user interfaces. Participants prefer using think-aloud testing, but identify few problems with that technique that are not also found with MOT or cognitive walkthrough. MOT identifies more problems than the other techniques. Across experiments and measures of usability problems' utility in systems design, MOT performs better than existing inspection techniques and is comparable to think-aloud testing.



Published in

ACM Transactions on Computer-Human Interaction, Volume 14, Issue 4
January 2008, 204 pages
ISSN: 1073-0516
EISSN: 1557-7325
DOI: 10.1145/1314683

        Copyright © 2008 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 July 2005
• Revised: 1 September 2006
• Accepted: 1 April 2007
• Published: 19 January 2008
