DOI: 10.1145/3424953.3426639
Research article

Comparing heuristic evaluation and MALTU model in interaction evaluation of ubiquitous systems

Published: 23 December 2020

ABSTRACT

Ubiquitous systems are made up of sensors and smart devices that communicate with each other to achieve certain goals. They have specific characteristics, such as transparency, attention, calmness, mobility, and context-awareness. Because Human-Computer Interaction (HCI) evaluation methods were proposed for traditional systems, these characteristics present challenges that may require adapting those methods. In this paper, we report an exploratory study that compares the suitability of Heuristic Evaluation and the MALTU Model for evaluating the Usability and User eXperience (UX) of ubiquitous systems. To this end, we applied both methods to evaluate a healthcare system and performed a comparative analysis. Our findings indicate that MALTU is suited to UX evaluation and is a low-cost approach compared to Heuristic Evaluation, which is suited to usability evaluation. Both methods were able to identify the problems that emerge from ubiquity, and they can provide better results when applied in combination. Furthermore, we provide recommendations for applying both methods to the Usability and UX evaluation of ubiquitous systems.


Published in

IHC '20: Proceedings of the 19th Brazilian Symposium on Human Factors in Computing Systems
October 2020
519 pages
ISBN: 9781450381727
DOI: 10.1145/3424953

        Copyright © 2020 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States

Acceptance Rates

IHC '20 paper acceptance rate: 60 of 155 submissions (39%)
Overall acceptance rate: 331 of 973 submissions (34%)
