Research article · DOI: 10.1145/3236454.3236489

Automatic GUI testing of desktop applications: an empirical assessment of the state of the art

Published: 16 July 2018

ABSTRACT

Testing software applications through their graphical user interface (GUI testing, for short) is both important, since it can reveal subtle and annoying bugs, and expensive, due to the myriad of possible GUI interactions. Recent attempts to automate GUI testing have produced several techniques that address the problem from different perspectives, sometimes focusing only on specific platforms, such as Android or the Web, and sometimes targeting only some aspects of GUI testing, such as test case generation or execution. Although GUI test case generation techniques for desktop applications were the first to be investigated, this area is still actively researched and its state of the art continues to expand.

In this paper we comparatively evaluate the state of the art in automatic GUI test case generation for desktop applications, presenting a set of experimental results obtained with the main GUI testing tools available for desktop applications. The paper surveys the state of the art in GUI testing, discusses differences, similarities, and complementarities among the techniques, experimentally compares their strengths and weaknesses, and pinpoints open problems that deserve further investigation.
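The test case generation step the abstract refers to can be illustrated with a minimal, hypothetical sketch: model the GUI as a map from windows (states) to the events each one offers, then random-walk over that model to produce bounded event sequences, each sequence being one test case. The names here (`GUI_MODEL`, `generate_test_case`) are illustrative assumptions, not part of any tool compared in the paper, and real tools work against a live GUI rather than a hand-written model.

```python
import random

# Hypothetical, simplified GUI model: each window (state) maps the events
# it offers (button clicks, menu choices) to the window they lead to.
GUI_MODEL = {
    "main":     {"open_settings": "settings", "new_doc": "editor"},
    "settings": {"ok": "main", "cancel": "main"},
    "editor":   {"save": "main", "close": "main"},
}

def generate_test_case(model, start="main", max_events=5, rng=random):
    """Random-walk test generation: repeatedly pick one of the events
    enabled in the current window, up to a length bound."""
    state, events = start, []
    for _ in range(max_events):
        enabled = model.get(state)
        if not enabled:
            break  # dead-end window: no further events to fire
        event = rng.choice(sorted(enabled))
        events.append(event)
        state = enabled[event]
    return events

if __name__ == "__main__":
    rng = random.Random(0)  # seeded for reproducible suites
    suite = [generate_test_case(GUI_MODEL, rng=rng) for _ in range(3)]
    for tc in suite:
        print(" -> ".join(tc))
```

Random exploration like this is cheap but blind; the techniques compared in the paper differ precisely in how they replace the random choice with smarter strategies (static analysis, learned models, search-based selection).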


    Published in
      ISSTA '18: Companion Proceedings for the ISSTA/ECOOP 2018 Workshops
      July 2018
      143 pages
      ISBN: 9781450359399
      DOI: 10.1145/3236454

      Copyright © 2018 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Acceptance Rates

      Overall acceptance rate: 58 of 213 submissions (27%)
