
Integrating automated test generation into the WYSIWYT spreadsheet testing methodology

Published: 01 April 2006

Abstract

Spreadsheet languages, which include commercial spreadsheets and various research systems, have had a substantial impact on end-user computing. Research shows, however, that spreadsheets often contain faults. Thus, in previous work we presented a methodology that helps spreadsheet users test their spreadsheet formulas. Our empirical studies have shown that end users can use this methodology to test spreadsheets more adequately and efficiently; however, the process of generating test cases can still present a significant impediment. To address this problem, we have been investigating how to incorporate automated test case generation into our testing methodology in ways that support incremental testing and provide immediate visual feedback. We have used two techniques for generating test cases, one involving random selection and one involving a goal-oriented approach. We describe these techniques and their integration into our testing environment, and report results of an experiment examining their effectiveness and efficiency.
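The abstract names two generation strategies but, naturally, gives no code. As a rough, hypothetical illustration of the difference, the sketch below pairs pure random selection of inputs with a trivial goal-directed search that keeps proposing inputs until a chosen output (i.e., an as-yet-untested branch of a formula) is produced. The formula eval_grade and all helper names are invented for this sketch and are not taken from the paper; real goal-oriented generators, such as the chaining-style techniques the authors build on, are considerably more sophisticated.

```python
import random

# Hypothetical spreadsheet-style formula: the grade cell evaluates to
# "PASS" when the score cell is at least 60, otherwise "FAIL".
def eval_grade(score):
    return "PASS" if score >= 60 else "FAIL"

def random_inputs(n, lo=0, hi=100):
    """Random selection: draw n candidate input values uniformly from the domain."""
    return [random.randint(lo, hi) for _ in range(n)]

def goal_oriented_input(target, lo=0, hi=100, budget=1000):
    """Naive stand-in for a goal-oriented generator: keep proposing inputs
    until the formula yields the desired output (here, an untested branch)."""
    for _ in range(budget):
        candidate = random.randint(lo, hi)
        if eval_grade(candidate) == target:
            return candidate
    return None  # no satisfying input found within the budget

if __name__ == "__main__":
    print("random:", [(x, eval_grade(x)) for x in random_inputs(3)])
    print("goal-oriented, FAIL branch:", goal_oriented_input("FAIL"))
```

In a WYSIWYT-style setting, either strategy would only supply candidate inputs; the user (or an oracle) would still judge whether the resulting cell values are correct, so the sketch deliberately stops at input selection.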



Published in

ACM Transactions on Software Engineering and Methodology, Volume 15, Issue 2
April 2006, 104 pages
ISSN: 1049-331X
EISSN: 1557-7392
DOI: 10.1145/1131421

Copyright © 2006 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


