Research article
DOI: 10.1145/1555860.1555864

Using checklists to review static analysis warnings

Published: 19 June 2009

ABSTRACT

Static analysis tools find silly mistakes, confusing code, bad practices, and property violations. But software developers and organizations do not necessarily care about every such warning; their interest depends on how the warning affects code behavior and on other factors. In the past, we have tried to identify important warnings by asking users to rate them as severe, low impact, or not a bug. In this paper, we observe that a user's rating may depend on several underlying judgments: whether the warning is feasible, whether it changes code behavior, whether it occurs in deployed code, and other factors. To better model this, we ask users to review warnings using a checklist, which enables more detailed reviews. We find that reviews are consistent across users and across checklist questions, though some users may disagree about whether to fix or filter out certain bug classes.
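To make the checklist concrete, here is a minimal sketch, in Java (FindBugs targets Java code), of what a per-warning review record might look like. The field names are illustrative assumptions derived from the factors the abstract mentions; they are not the paper's actual checklist questions.

    // Hypothetical per-warning review record; field names are illustrative,
    // mirroring factors named in the abstract, not the paper's real checklist.
    enum Answer { YES, NO, UNSURE }

    class WarningReview {
        String reviewer;        // who performed the review
        String bugPattern;      // e.g., a FindBugs pattern id such as "NP_NULL_ON_SOME_PATH"
        Answer feasible;        // can the flagged situation actually occur at runtime?
        Answer changesBehavior; // would fixing the warning change observable behavior?
        Answer inDeployedCode;  // does the warning occur in shipped (deployed) code?
        Answer shouldFix;       // the reviewer's bottom-line recommendation
    }

Answering each question separately, rather than assigning a single severity rating, is what allows reviews to be compared across users question by question.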



Reviews

Reviewer: Serge Berger

Static analysis is important for detecting code defects. Most automatic tools based on static analysis have reached a certain level of maturity, making them trusted partners in code quality assurance. However, the tools are often limited in their ability to assign the right severity level to a code violation. This paper develops a checklist to correlate the intent of the program with the severity of the issue reported by a static analysis tool (here, FindBugs). Statistically, false positives are the most costly within the model. Ayewah and Pugh use the Pearson correlation coefficient to check the consistency of checklist responses, and the chi-square test to trace trends. The results are summarized in a few tables. Visualizing the results against the false positive ratio, for example with receiver operating characteristic (ROC) curves, would have demonstrated the points better and helped analyze statistically significant dependencies. The authors intend to continue researching the subject with a larger number of developers involved in the checklist activities, so the statistical characteristics of the model should improve.

(Online Computing Reviews Service)
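For readers unfamiliar with the statistics the review names, the following plain-Java sketch shows how the two measures are typically computed: Pearson correlation for the consistency of two reviewers' numeric responses (e.g., YES coded as 1, NO as 0), and the chi-square statistic for a contingency table of paired answers. This is a generic illustration, not the authors' analysis code.

    // Generic illustrations of the statistics mentioned in the review;
    // not the authors' analysis code.
    class ReviewStats {

        // Pearson correlation of two reviewers' responses, paired by warning.
        static double pearson(double[] x, double[] y) {
            int n = x.length;
            double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
            for (int i = 0; i < n; i++) {
                sx += x[i]; sy += y[i];
                sxx += x[i] * x[i]; syy += y[i] * y[i];
                sxy += x[i] * y[i];
            }
            double cov = sxy - sx * sy / n; // n times the covariance
            double vx  = sxx - sx * sx / n; // n times the variance of x
            double vy  = syy - sy * sy / n; // n times the variance of y
            return cov / Math.sqrt(vx * vy);
        }

        // Chi-square statistic of independence for a table of observed counts,
        // e.g., rows = reviewer A's answers, columns = reviewer B's answers.
        static double chiSquare(long[][] obs) {
            int rows = obs.length, cols = obs[0].length;
            long total = 0;
            long[] rowSum = new long[rows];
            long[] colSum = new long[cols];
            for (int i = 0; i < rows; i++)
                for (int j = 0; j < cols; j++) {
                    rowSum[i] += obs[i][j];
                    colSum[j] += obs[i][j];
                    total += obs[i][j];
                }
            double chi2 = 0;
            for (int i = 0; i < rows; i++)
                for (int j = 0; j < cols; j++) {
                    double expected = (double) rowSum[i] * colSum[j] / total;
                    chi2 += (obs[i][j] - expected) * (obs[i][j] - expected) / expected;
                }
            return chi2;
        }
    }

The chi-square value is then compared against a chi-square distribution with (rows - 1) * (cols - 1) degrees of freedom to judge whether the association between the two reviewers' answers is statistically significant.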

Published in

DEFECTS '09: Proceedings of the 2nd International Workshop on Defects in Large Software Systems (held in conjunction with the ACM SIGSOFT International Symposium on Software Testing and Analysis, ISSTA 2009)
June 2009, 34 pages
ISBN: 9781605586540
DOI: 10.1145/1555860

Copyright © 2009 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

