DOI: 10.1145/2568225.2568300
Research Article

Exploring variability-aware execution for testing plugin-based web applications

Published: 31 May 2014

ABSTRACT

In plugin-based systems, plugin conflicts may occur when two or more plugins interfere with one another, changing their expected behaviors. Detecting plugin conflicts is highly challenging because the number of plugin combinations (i.e., configurations) grows exponentially. In this paper, we address the challenge of executing a test case over many configurations. Leveraging the fact that many executions of a test are similar, our variability-aware execution runs common code once; only when it encounters values that differ across configurations does the execution split to run each alternative separately. To evaluate the scalability of variability-aware execution in a large real-world setting, we built a prototype PHP interpreter called Varex and ran it on the popular WordPress blogging Web application. The results show that, while plugin interactions exist, there is a significant amount of sharing that allows variability-aware execution to scale to 2^50 configurations within seven minutes of running time. During our study, Varex detected two plugin conflicts: one was recently reported on the WordPress forum and the other had not been previously discovered.
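
To illustrate the idea sketched above, consider the following Python snippet. It is a minimal sketch of the concept only, not the actual Varex implementation (Varex is a PHP interpreter); the Choice class, the plugin names "seo" and "emoji", and the helper functions are hypothetical. A configuration-dependent value carries one alternative per relevant plugin combination; code that sees only concrete values runs once, and execution splits only when a Choice value is reached.

  # Minimal sketch of variability-aware execution (hypothetical names, not Varex code).
  class Choice:
      """A value that differs across configurations: maps enabled-plugin sets to alternatives."""
      def __init__(self, alternatives):
          self.alternatives = alternatives  # dict: frozenset of enabled plugins -> concrete value

  def apply_filter(f):
      # Common code: runs once per distinct concrete value, not once per configuration.
      return f("Hello World")

  def run_test(title_filter):
      # Shared test body: splits only when a configuration-dependent value is encountered.
      if isinstance(title_filter, Choice):
          return {cfg: apply_filter(f) for cfg, f in title_filter.alternatives.items()}
      return apply_filter(title_filter)

  # Two hypothetical plugins both hook the page title; the composed filter depends on
  # which plugins are enabled, so it is represented as a Choice value.
  title_filter = Choice({
      frozenset(): lambda t: t,                                  # no plugins enabled
      frozenset({"seo"}): lambda t: t.upper(),                   # only "seo" enabled
      frozenset({"seo", "emoji"}): lambda t: t.upper() + " :)",  # both enabled
  })

  print(run_test(title_filter))  # one result per distinct alternative, not per configuration

A real variability-aware interpreter tracks such conditional values at every program point and joins them back when alternatives converge; the sketch shows only a single split point.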

Published in

ICSE 2014: Proceedings of the 36th International Conference on Software Engineering
May 2014, 1139 pages
ISBN: 9781450327565
DOI: 10.1145/2568225

Copyright © 2014 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States



      Acceptance Rates

Overall Acceptance Rate: 276 of 1,856 submissions, 15%
