DOI: 10.1145/2151024.2151034

SimTester: a controllable and observable testing framework for embedded systems

Published: 03 March 2012

ABSTRACT

In software for embedded systems, the frequent use of interrupts for timing, sensing, and I/O processing can cause concurrency faults due to interactions between applications, device drivers, and interrupt handlers. Practitioners consider this type of fault among the most difficult to detect, isolate, and correct, in part because it is sensitive to execution interleavings and often occurs without leaving any observable incorrect output. As such, commonly used testing techniques that inspect program outputs to detect failures are often ineffective against it. To test for these concurrency faults, test engineers need to control interleavings so that they are deterministic, and they need to observe faults as they occur rather than relying on incorrect outputs to expose them.
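
To make this fault class concrete, the sketch below shows a lost-update data race between application code and a timer interrupt handler. It is a minimal, hypothetical C example written for this summary, not code from the paper: if the interrupt fires inside the marked window, one increment is silently lost, the program produces no immediately incorrect output, and the failure surfaces only under that one interleaving.

    #include <stdint.h>

    static volatile uint32_t tick_count;   /* shared between main code and the ISR */

    /* Hypothetical timer interrupt handler: fires asynchronously. */
    void timer_isr(void)
    {
        tick_count++;
    }

    /* Called periodically from the main loop. */
    uint32_t sample_and_reset(void)
    {
        uint32_t t = tick_count;   /* (1) read the shared counter */
        /* Race window: if timer_isr() runs here, its increment is
           overwritten by the store below and silently lost. */
        tick_count = 0;            /* (2) reset the shared counter */
        return t;
    }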

In this paper, we introduce SimTester, a framework that allows engineers to effectively test for subtle, non-deterministic concurrency faults by providing them with greater controllability and observability. To promote ease of adoption, we implemented the framework on a commercial virtual platform that is widely used to support hardware/software co-design. We then evaluated its effectiveness by using it to test for data races and deadlocks; the results show that the framework can detect these faults effectively and efficiently.
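
The following sketch illustrates the controllability/observability idea in C. The vp_* hooks and the addresses below are hypothetical stand-ins for a virtual platform's instrumentation API, written for this summary; they are assumptions, not SimTester's actual interface. Because a simulator sees every executed instruction, memory access, and interrupt line, a test can both force the interrupt at the exact instruction where a race window opens and record the conflicting accesses the moment they happen, without instrumenting the target's code.

    #include <stdint.h>

    /* Hypothetical virtual-platform hooks (assumed, for illustration only). */
    extern void vp_set_exec_breakpoint(uint32_t pc, void (*cb)(void));
    extern void vp_watch_memory(uint32_t addr, void (*cb)(int is_write));
    extern void vp_raise_irq(int irq_line);
    extern void vp_report(const char *msg);

    #define PC_RACE_WINDOW  0x000012f4u  /* hypothetical pc between read (1) and write (2) */
    #define TIMER_IRQ       3            /* hypothetical interrupt line of the timer */
    #define ADDR_SHARED     0x20000040u  /* hypothetical address of tick_count */

    /* Controllability: raise the timer interrupt deterministically at the
       instruction where the race window is open. */
    static void open_race_window(void)
    {
        vp_raise_irq(TIMER_IRQ);
    }

    /* Observability: log every access to the shared variable as it occurs,
       instead of waiting for an incorrect output that may never appear. */
    static void log_shared_access(int is_write)
    {
        vp_report(is_write ? "write to shared counter" : "read of shared counter");
    }

    void setup_race_test(void)
    {
        vp_set_exec_breakpoint(PC_RACE_WINDOW, open_race_window);
        vp_watch_memory(ADDR_SHARED, log_shared_access);
    }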

Published in

VEE '12: Proceedings of the 8th ACM SIGPLAN/SIGOPS Conference on Virtual Execution Environments
March 2012, 248 pages
ISBN: 9781450311762
DOI: 10.1145/2151024

Also published in ACM SIGPLAN Notices, Volume 47, Issue 7 (VEE '12), July 2012, 229 pages
ISSN: 0362-1340; EISSN: 1558-1160
DOI: 10.1145/2365864

Copyright © 2012 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 3 March 2012

Acceptance Rates

Overall acceptance rate: 80 of 235 submissions, 34%
