DOI: 10.1145/503209.503244
Article

Coverage criteria for GUI testing

Published: 01 September 2001

ABSTRACT

A widespread recognition of the usefulness of graphical user interfaces (GUIs) has established their importance as critical components of today's software. GUIs have characteristics different from traditional software, and conventional testing techniques do not directly apply to GUIs. This paper's focus is on coverage criteria for GUIs, important rules that provide an objective measure of test quality. We present new coverage criteria to help determine whether a GUI has been adequately tested. These coverage criteria use events and event sequences to specify a measure of test adequacy. Since the total number of permutations of event sequences in any non-trivial GUI is extremely large, the GUI's hierarchical structure is exploited to identify the important event sequences to be tested. A GUI is decomposed into GUI components, each of which is used as a basic unit of testing. A representation of a GUI component, called an event-flow graph, identifies the interaction of events within a component and intra-component criteria are used to evaluate the adequacy of tests on these events. The hierarchical relationship among components is represented by an integration tree, and inter-component coverage criteria are used to evaluate the adequacy of test sequences that cross components. Algorithms are given to construct event-flow graphs and an integration tree for a given GUI, and to evaluate the coverage of a given test suite with respect to the new coverage criteria. A case study illustrates the usefulness of the coverage report to guide further testing and an important correlation between event-based coverage of a GUI and statement coverage of its software's underlying code.
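To make the event-based criteria concrete, the following is a minimal sketch (in Python, not taken from the paper) of how event coverage and length-k event-sequence coverage could be computed over an event-flow graph; the class, function, and event names are illustrative assumptions, not an API or component defined by the authors.

class EventFlowGraph:
    """Directed graph over a GUI component's events: an edge (a, b) means
    that event b can be executed immediately after event a."""

    def __init__(self, events, follows):
        self.events = set(events)      # all events in the component
        self.follows = set(follows)    # ordered pairs (a, b): b may follow a

    def legal_sequences(self, k):
        """All legal event sequences of length k within the component."""
        seqs = {(e,) for e in self.events}
        for _ in range(k - 1):
            seqs = {s + (e,) for s in seqs for e in self.events
                    if (s[-1], e) in self.follows}
        return seqs


def event_coverage(graph, suite):
    """Fraction of the component's events exercised by the test suite."""
    executed = {e for test in suite for e in test}
    return len(executed & graph.events) / len(graph.events)


def sequence_coverage(graph, suite, k):
    """Fraction of legal length-k event sequences covered by the suite
    (k = 2 corresponds to covering pairwise event interactions)."""
    required = graph.legal_sequences(k)
    covered = {tuple(test[i:i + k]) for test in suite
               for i in range(len(test) - k + 1)}
    return len(required & covered) / len(required) if required else 1.0


if __name__ == "__main__":
    # Toy "Find" dialog component (hypothetical events, for illustration only).
    g = EventFlowGraph(
        events=["open_find", "type_text", "click_next", "close"],
        follows=[("open_find", "type_text"), ("type_text", "click_next"),
                 ("type_text", "close"), ("click_next", "click_next"),
                 ("click_next", "close")],
    )
    suite = [["open_find", "type_text", "click_next", "close"]]
    print(f"event coverage:      {event_coverage(g, suite):.0%}")
    print(f"2-sequence coverage: {sequence_coverage(g, suite, 2):.0%}")

A tool implementing the paper's inter-component criteria would additionally walk the integration tree and account for event sequences that cross component boundaries; that machinery is omitted from this sketch.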


Published in

ESEC/FSE-9: Proceedings of the 8th European Software Engineering Conference held jointly with the 9th ACM SIGSOFT International Symposium on Foundations of Software Engineering, September 2001, 329 pages.
ISBN: 1581133901, DOI: 10.1145/503209
Conference Chairs: A. Min Tjoa, Volker Gruhn

Also published in: ACM SIGSOFT Software Engineering Notes, Volume 26, Issue 5, Sept. 2001, 329 pages. ISSN: 0163-5948, DOI: 10.1145/503271

          Copyright © 2001 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ESEC/FSE-9 paper acceptance rate: 29 of 137 submissions (21%). Overall acceptance rate: 112 of 543 submissions (21%).
