Abstract
Spreadsheet languages, which include commercial spreadsheets and various research systems, have had a substantial impact on end-user computing. Research shows, however, that spreadsheets often contain faults. Thus, in previous work we presented a methodology that helps spreadsheet users test their spreadsheet formulas. Our empirical studies have shown that end users can use this methodology to test spreadsheets more adequately and efficiently; however, the process of generating test cases can still present a significant impediment. To address this problem, we have been investigating how to incorporate automated test case generation into our testing methodology in ways that support incremental testing and provide immediate visual feedback. We have used two techniques for generating test cases, one involving random selection and one involving a goal-oriented approach. We describe these techniques and their integration into our testing environment, and report results of an experiment examining their effectiveness and efficiency.
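The abstract names the two generation techniques only at a high level. As a rough illustrative sketch (not the paper's implementation: the example formula, the names `tax_cell`, `random_generation`, and `goal_oriented_generation`, and the branch-distance function are all hypothetical stand-ins), the following Python contrasts random input selection with a goal-oriented search that minimizes a branch-distance function, in the spirit of the Korel-style chaining work this paper builds on:

```python
import random

# Hypothetical stand-in for a spreadsheet formula cell:
#   IF(income > 100, income * 0.1, 0)
# Each branch of the conditional is a coverage target, loosely analogous to
# the du-associations the WYSIWYT methodology asks users to cover.
def tax_cell(income: float) -> float:
    if income > 100:
        return income * 0.1   # branch T
    return 0.0                # branch F

def branch_taken(income: float) -> str:
    return "T" if income > 100 else "F"

def random_generation(target: str, trials: int = 1000,
                      lo: float = -1000.0, hi: float = 1000.0):
    """Random selection: sample inputs until one exercises the target branch."""
    for _ in range(trials):
        x = random.uniform(lo, hi)
        if branch_taken(x) == target:
            return x
    return None  # target not covered within the trial budget

def goal_oriented_generation(target: str, start: float = 0.0, steps: int = 200):
    """Goal-oriented search: minimize a branch-distance function that is
    zero exactly when the target branch of (income > 100) is taken."""
    def distance(x: float) -> float:
        # Distance to satisfying the predicate (for "T") or its negation (for "F").
        return max(0.0, 100.0 - x + 1e-6) if target == "T" else max(0.0, x - 100.0)

    x, step = start, 1.0
    for _ in range(steps):
        if distance(x) == 0.0:
            return x
        # Exploratory moves: keep whichever direction reduces the distance.
        best = min((x + step, x - step), key=distance)
        if distance(best) < distance(x):
            x, step = best, step * 2   # pattern move: accelerate while improving
        else:
            step /= 2                  # overshoot: shrink the step and retry
    return None

if __name__ == "__main__":
    for t in ("T", "F"):
        print(t, random_generation(t), goal_oriented_generation(t))
```

On predicates with narrow satisfying ranges, the distance-guided search typically reaches the target branch in far fewer formula evaluations than uniform sampling; for predicates that random sampling satisfies easily, the random generator is cheaper. That trade-off is the usual motivation for comparing the two approaches, as the paper's experiment does.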