Article
DOI: 10.1145/566172.566186

Efficient instrumentation for code coverage testing

Published: 01 July 2002

ABSTRACT

Evaluation of code coverage is the problem of identifying the parts of a program that did not execute in one or more of its runs. The traditional approach for code coverage tools is to use static code instrumentation. In this paper we present a new approach that dynamically inserts and removes instrumentation code to reduce the runtime overhead of code coverage. We also explore the use of dominator tree information to reduce the number of instrumentation points needed. Our experiments show that our approach reduces runtime overhead by 38-90% compared with purecov, a commercial code coverage tool. Our tool is fully automated and available for download from the Internet.
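The removal idea in the abstract can be sketched concretely. The paper's tool works by patching instrumentation in and out of a running, compiled binary; the fragment below is only a hypothetical illustration of the same idea in Python, using the interpreter's tracing hook rather than binary patching: a per-function probe records which lines execute and stops firing once every line of the function has been covered, so later calls pay no per-line tracing cost. None of this code is taken from the paper's implementation, and the run_test_suite entry point in the usage example is an assumed placeholder.

```python
# Hypothetical sketch: coverage probes that "remove themselves" once a
# function is fully covered. Not the paper's binary-patching implementation.
import dis
import sys
from collections import defaultdict

covered = defaultdict(set)   # code object -> line numbers seen so far
executable = {}              # code object -> all lines that hold instructions

def _lines(code):
    # Cache the set of instrumentable line numbers for a function.
    if code not in executable:
        executable[code] = {line for _, line in dis.findlinestarts(code)
                            if line is not None}
    return executable[code]

def tracer(frame, event, arg):
    code = frame.f_code
    if event == "call":
        # "Remove" the probe: once every line of this function has been
        # observed, later calls run without any per-line tracing.
        if covered[code] >= _lines(code):
            return None
        return tracer
    if event == "line":
        covered[code].add(frame.f_lineno)
    return tracer

def report():
    for code, lines in executable.items():
        missed = sorted(lines - covered[code])
        print(f"{code.co_name}: {len(covered[code])}/{len(lines)} "
              f"lines covered, missed {missed}")
```

Usage, assuming a hypothetical run_test_suite() that exercises the program under test:

```python
sys.settrace(tracer)
run_test_suite()
sys.settrace(None)
report()
```

The dominator-tree optimization mentioned in the abstract is complementary to this sketch: it reduces how many probe points are needed in the first place, whereas dynamic removal shortens how long each remaining probe stays active.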

Published in

ISSTA '02: Proceedings of the 2002 ACM SIGSOFT International Symposium on Software Testing and Analysis
July 2002, 248 pages
ISBN: 1581135629
DOI: 10.1145/566172

ACM SIGSOFT Software Engineering Notes, Volume 27, Issue 4
July 2002, 242 pages
ISSN: 0163-5948
DOI: 10.1145/566171

      Copyright © 2002 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

Acceptance Rates

ISSTA '02 Paper Acceptance Rate: 26 of 97 submissions, 27%
Overall Acceptance Rate: 58 of 213 submissions, 27%
