Abstract
We consider reading techniques a fundamental means of achieving high-quality software. Because research in this area is scarce, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with a family of reading techniques known as Perspective-Based Reading (PBR) and its application to requirements documents. The goal of PBR is to provide operational scenarios in which members of a review team read a document from a particular perspective, e.g., tester, developer, or user. Our assumption is that the combination of different perspectives provides better coverage of the document, i.e., uncovers a wider range of defects, than the same number of readers using their usual technique.
To test the effectiveness of PBR, we conducted a controlled experiment with professional software developers from the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC) Software Engineering Laboratory (SEL). The subjects read two types of documents, one generic in nature and the other from the NASA domain, using two reading techniques, a PBR technique and their usual technique. The results from these experiments, as well as the experimental design, are presented and analyzed. Teams applying PBR are shown to achieve significantly better coverage of documents than teams that do not apply PBR.
We thoroughly discuss the threats to validity so that external replications can benefit from the lessons learned and improve the experimental design if the constraints are different from those posed by subjects borrowed from a development organization.
Basili, V.R., Green, S., Laitenberger, O. et al. The empirical investigation of Perspective-Based Reading. Empirical Software Engineering 1, 133–164 (1996). https://doi.org/10.1007/BF00368702