ABSTRACT
Ubiquitous systems consist of sensors and smart devices that communicate with one another to achieve certain goals. They have specific characteristics, such as transparency, attention, calmness, mobility, and context-awareness. Since Human-Computer Interaction (HCI) evaluation methods were originally proposed for traditional systems, these characteristics pose challenges that may require adapting those methods. In this paper, we conducted an exploratory study comparing the suitability of Heuristic Evaluation and the MALTU Model for evaluating the Usability and User eXperience (UX) of ubiquitous systems. To this end, we applied both methods to evaluate a healthcare system and performed a comparative analysis. Our findings indicate that MALTU is suited to UX evaluation and is a low-cost approach compared to Heuristic Evaluation, which in turn is suited to usability evaluation. Both methods were able to identify the problems that emerge from ubiquity and can yield better results when applied together. Furthermore, we provide recommendations for applying both methods to the Usability and UX evaluation of ubiquitous systems.