ABSTRACT
Program verification is a promising approach to improving program quality, because it can search all possible program executions for specific errors. However, the need to formally describe correct behavior or errors is a major barrier to the widespread adoption of program verification, since programmers historically have been reluctant to write formal specifications. Automating the process of formulating specifications would remove a barrier to program verification and enhance its practicality.

This paper describes specification mining, a machine learning approach to discovering formal specifications of the protocols that code must obey when interacting with an application program interface or abstract data type. Starting from the assumption that a working program is well enough debugged to reveal strong hints of correct protocols, our tool infers a specification by observing program execution and concisely summarizing the frequent interaction patterns as state machines that capture both temporal and data dependences. These state machines can be examined by a programmer, to refine the specification and identify errors, and can be utilized by automatic verification tools, to find bugs.

Our preliminary experience with the mining tool has been promising. We were able to learn specifications that not only captured the correct protocol, but also discovered serious bugs.
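To make the idea of summarizing observed interaction patterns as a state machine concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm, which learns probabilistic automata from traces). It builds a prefix-tree acceptor from traces of API calls on an imaginary file interface: states correspond to prefixes of observed call sequences, and a trace that leaves the learned machine signals a potential protocol violation.

```python
# Toy illustration of specification mining: learn a finite-state machine
# from execution traces of API calls, then check new traces against it.
# The trace data and the "open/read/write/close" protocol are invented
# for illustration; the real tool observes an instrumented program.

def build_pta(traces):
    """Build a prefix-tree acceptor. State 0 is the start state; each
    distinct prefix of an observed trace gets its own state."""
    delta = {}            # (state, event) -> next state
    accepting = set()     # states where some observed trace ended
    fresh = 1             # next unused state id
    for trace in traces:
        state = 0
        for event in trace:
            if (state, event) not in delta:
                delta[(state, event)] = fresh
                fresh += 1
            state = delta[(state, event)]
        accepting.add(state)
    return delta, accepting

def accepts(delta, accepting, trace):
    """Replay a trace through the learned machine; reject if it takes
    an unobserved transition or stops in a non-accepting state."""
    state = 0
    for event in trace:
        if (state, event) not in delta:
            return False  # transition never seen in training traces
        state = delta[(state, event)]
    return state in accepting

# Hypothetical traces gathered from runs of a working (debugged) program.
traces = [
    ["open", "read", "close"],
    ["open", "read", "read", "close"],
    ["open", "write", "close"],
]
delta, accepting = build_pta(traces)
print(accepts(delta, accepting, ["open", "read", "close"]))  # True
print(accepts(delta, accepting, ["read", "close"]))          # False: read before open
```

A real miner must also generalize (e.g., by merging states so that any number of repeated reads is accepted) and weight transitions by frequency so that rare, buggy interactions do not pollute the learned specification; this sketch accepts only exactly the observed sequences.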