
2002 | Book

The Essence of Computation

Complexity, Analysis, Transformation

Edited by: Torben Æ. Mogensen, David A. Schmidt, I. Hal Sudborough

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

By presenting state-of-the-art aspects of the theory of computation, this book commemorates the 60th birthday of Neil D. Jones, whose scientific career parallels the evolution of computation theory itself.
The 20 reviewed research papers, presented together with a brief survey of the work of Neil D. Jones, were written by scientists who have worked with him, in the roles of student, colleague, and, in one case, mentor. In accordance with the Festschrift's subtitle, the papers are organized in parts on computational complexity, program analysis, and program transformation.

Table of Contents

Frontmatter

Computational Complexity

General Size-Change Termination and Lexicographic Descent
Abstract
Size-change termination (SCT) is a general criterion for identifying recursive function definitions that are guaranteed to terminate. It extends and subsumes the simpler criterion of lexicographic descent in function calls, which in classical recursion theory is known as multiple recursion. Neil Jones has conjectured that the class of functions computable by size-change terminating programs coincides with the multiply-recursive functions. This paper proves the conjecture.
Amir M. Ben-Amram
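To make the subsumed criterion concrete, a minimal Haskell sketch (ours, not the paper's): Ackermann's function, the standard example of a multiply-recursive definition, whose recursive calls descend lexicographically.

```haskell
-- Ackermann's function: a standard multiply-recursive definition
-- (assumes non-negative arguments).  Every recursive call decreases
-- the argument pair (m, n) lexicographically, so lexicographic
-- descent proves termination; size-change termination subsumes this.
ack :: Integer -> Integer -> Integer
ack 0 n = n + 1
ack m 0 = ack (m - 1) 1                -- m decreases
ack m n = ack (m - 1) (ack m (n - 1))  -- outer: m decreases; inner: n decreases

main :: IO ()
main = print (ack 2 3)  -- 9
```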
Comparing Star and Pancake Networks
Abstract
Low dilation embeddings are used to compare similarities between star and pancake networks. The pancake network of dimension n, P_n, has n! nodes, one for each permutation, and an undirected edge between permutations (nodes) when some prefix reversal transforms one permutation into the other. The star network of dimension n, S_n, has n! nodes, one for each permutation, and an undirected edge between permutations (nodes) when the exchange of the first element with some other element transforms one permutation into the other. Comparisons with the burnt pancake network are also discussed. The burnt pancake network, BP_n, has 2^n · n! nodes, one for each signed permutation, and an undirected edge between signed permutations (nodes) when some prefix reversal transforms one signed permutation into the other, and all symbols in the reversed prefix change sign. Some of the embeddings shown are:
$$P_n \overset{\mathrm{dil}\,1}{\Rightarrow} S_{2n}, \qquad S_n \overset{\mathrm{dil}\,1}{\Rightarrow} P_{(n^3 - 4n^2 + 5n + 4)/2},$$
$$BP_n \overset{\mathrm{dil}\,1}{\Rightarrow} S_{2n}, \qquad S_n \overset{\mathrm{dil}\,2}{\Rightarrow} P_{2n - 2}.$$
Linda Morales, I. Hal Sudborough
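For concreteness, a small Haskell sketch (ours, with illustrative names) of the two edge relations defined above, as functions from a node to its neighbours.

```haskell
-- Neighbours in the pancake network P_n: reverse a prefix of length 2..n.
pancakeNeighbors :: [Int] -> [[Int]]
pancakeNeighbors p =
  [ reverse (take k p) ++ drop k p | k <- [2 .. length p] ]

-- Neighbours in the star network S_n: exchange the first element
-- with the i-th element of the remainder.
starNeighbors :: [Int] -> [[Int]]
starNeighbors (x : xs) =
  [ (xs !! (i - 1)) : take (i - 1) xs ++ x : drop i xs
  | i <- [1 .. length xs] ]
starNeighbors [] = []

main :: IO ()
main = do
  print (pancakeNeighbors [1, 2, 3])  -- [[2,1,3],[3,2,1]]
  print (starNeighbors    [1, 2, 3])  -- [[2,1,3],[3,2,1]]: for n = 3
                                      -- the two networks coincide
```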
Synchronization of Finite Automata: Contributions to an Old Problem
Abstract
In spite of its simple formulation, the problem of synchronizing a finite deterministic automaton is not yet properly understood. The present paper investigates this and related problems within the general framework of a composition theory for functions over a finite domain N with n elements. The notion of depth introduced in this connection is a good indication of the complexity of a given function, namely, the complexity with respect to the length of composition sequences in terms of functions belonging to a basic set. Our results show that the depth may vary considerably with the target function. We also establish criteria for the reachability of some target functions, notably constants. Properties of n such as primality or being a power of 2 turn out to be important, independently of the semantic interpretation. Most of the questions about depth, as well as about the comparison of different notions of depth, remain open. Our results show that the study of functions of several variables may also shed light on the case where all functions considered are unary.
Arto Salomaa
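As background on the synchronization problem itself, a hedged Haskell sketch (ours; the paper studies composition depth, not this algorithm): breadth-first search over subsets of states for a reset word, applied to Černý's automaton C_4.

```haskell
import qualified Data.Set as Set
import Data.List (nub, sort)

type Delta = Char -> Int -> Int

-- Breadth-first search over subsets of states for a word mapping the
-- whole state set to a singleton (a synchronizing, or reset, word).
synchronizingWord :: [Int] -> [Char] -> Delta -> Maybe String
synchronizingWord states sigma delta =
  go [(sort states, "")] (Set.singleton (sort states))
  where
    go [] _ = Nothing
    go ((ss, w) : rest) seen
      | length ss == 1 = Just (reverse w)
      | otherwise =
          let nexts = [ (nub (sort (map (delta c) ss)), c : w) | c <- sigma ]
              fresh = [ nw | nw@(ss', _) <- nexts, Set.notMember ss' seen ]
          in  go (rest ++ fresh) (foldr (Set.insert . fst) seen fresh)

-- Cerny's automaton C_4: 'a' shifts cyclically, 'b' moves only state 0.
-- BFS returns a shortest reset word; for C_4 its length is 9 = (4-1)^2.
main :: IO ()
main = print (synchronizingWord [0 .. 3] "ab" delta)
  where
    delta 'a' i = (i + 1) `mod` 4
    delta 'b' 0 = 1
    delta _   i = i
```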
Lambda Calculi and Linear Speedups
Abstract
The equational theories at the core of most functional programming languages are variations on the standard lambda calculus. The best known of these is the call-by-value lambda calculus, whose core is the value-beta computation rule (λx.M) V → M[V/x], where V is restricted to be a value rather than an arbitrary term. This paper investigates the transformational power of this core theory of functional programming. The main result is that the equational theory of the call-by-value lambda calculus cannot speed up (or slow down) programs by more than a constant factor. The corresponding result also holds for call-by-need, but we show that it does not hold for call-by-name: there are programs for which a single beta reduction can change the program's asymptotic complexity.
David Sands, Jörgen Gustavsson, Andrew Moran
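A worked reduction (our illustration, not from the paper) of why call-by-name can change costs where value-beta cannot. Under call-by-value the argument is evaluated once before substitution:

$$(\lambda x.\, x + x)\,(1 + 2) \;\to\; (\lambda x.\, x + x)\,3 \;\to_{\beta_v}\; 3 + 3 \;\to\; 6$$

Under call-by-name, full beta substitutes the unevaluated argument, duplicating its cost:

$$(\lambda x.\, x + x)\,(1 + 2) \;\to_{\beta}\; (1 + 2) + (1 + 2) \;\to\; 3 + (1 + 2) \;\to\; 3 + 3 \;\to\; 6$$

Nesting such duplications n deep makes the call-by-name cost grow exponentially in n while the call-by-value cost stays linear; call-by-need avoids the blow-up by sharing the argument.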

Program Analysis

Design and Implementation of a Special-Purpose Static Program Analyzer for Safety-Critical Real-Time Embedded Software
Abstract
We report on a successful preliminary experience in the design and implementation of a special-purpose Abstract-Interpretation-based static program analyzer for the verification of safety-critical embedded real-time software. The analyzer is both precise (zero false alarms in the considered experiment) and efficient (less than one minute of analysis for 10,000 lines of code). Even though it is based on a simple interval analysis, many features have been added to obtain the desired precision: expansion of small arrays, widening with several thresholds, loop unrolling, trace partitioning, and relations between loop counters and other variables. The efficiency of the tool comes mainly from a clever representation of abstract environments based on balanced binary search trees.
Bruno Blanchet, Patrick Cousot, Radhia Cousot, Jérôme Feret, Laurent Mauborgne, Antoine Miné, David Monniaux, Xavier Rival
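A minimal Haskell sketch (ours; the analyzer itself is far richer and implemented quite differently) of two ingredients named above: the interval domain and widening with thresholds.

```haskell
-- Bounds and intervals over the integers.
data Bound    = NegInf | Fin Integer | PosInf deriving (Eq, Ord, Show)
data Interval = Iv Bound Bound                deriving (Eq, Show)

-- Least upper bound of two intervals.
join :: Interval -> Interval -> Interval
join (Iv a b) (Iv c d) = Iv (min a c) (max b d)

-- Widening with thresholds (assumed sorted ascending): an unstable
-- bound is relaxed only to the next enclosing threshold, not straight
-- to infinity, which is often enough to bound loop counters.
widen :: [Integer] -> Interval -> Interval -> Interval
widen ts (Iv a b) (Iv c d) = Iv lo hi
  where
    lo | c >= a    = a
       | otherwise = last (NegInf : [Fin t | t <- ts, Fin t <= c])
    hi | d <= b    = b
       | otherwise = head ([Fin t | t <- ts, Fin t >= d] ++ [PosInf])

main :: IO ()
main = print (widen [0, 100, 10000] (Iv (Fin 0) (Fin 1)) (Iv (Fin 0) (Fin 2)))
-- Iv (Fin 0) (Fin 100): the upper bound jumps only to the threshold 100.
```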
Meta-circular Abstract Interpretation in Prolog
Abstract
We give an introduction to the meta-circular approach to the abstract interpretation of logic programs. This approach is particularly useful for prototyping and for introductory classes on abstract interpretation. Students can immediately write, adapt, and experiment with working dataflow analysers. We use a simple meta-circular interpreter, based on a "non-ground T_P" semantics, as a generic analysis engine. Instantiating the engine is a matter of providing an appropriate domain of approximations, together with definitions of "abstract" unification and disjunction. Small changes of the interpreter let us vary both what can be "observed" by an analyser, and how fixed point computation is done. Amongst the dataflow analyses used to exemplify this approach are a parity analysis, groundness dependency analysis, call patterns, depth-k analysis, and a "pattern" analysis to establish most specific generalisations of calls and success sets.
Michael Codish, Harald Søndergaard
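The paper works in Prolog; since our illustrations here are in Haskell, the following is only a sketch of the parity domain named in the abstract, with abstract addition standing in for the "abstract operations" an analysis engine would be instantiated with.

```haskell
-- The parity domain: Bot <= Even, Odd <= Top.
data Parity = Bot | Even | Odd | Top deriving (Eq, Show)

-- Abstraction of a concrete integer.
alpha :: Integer -> Parity
alpha n = if even n then Even else Odd

-- Abstract addition, sound with respect to concrete (+):
-- alpha (x + y) is always described by addP (alpha x) (alpha y).
addP :: Parity -> Parity -> Parity
addP Bot  _    = Bot
addP _    Bot  = Bot
addP Even Even = Even
addP Odd  Odd  = Even
addP Even Odd  = Odd
addP Odd  Even = Odd
addP _    _    = Top

main :: IO ()
main = print (addP (alpha 3) (alpha 5))  -- Even: odd + odd is even
```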
Flow Analysis: Games and Nets
Abstract
This paper presents a graph-based formulation of control-flow analysis using results from game semantics and proof-nets. Control-flow analysis aims to give a conservative prediction of the flow of control in a program. In our analysis, terms are represented by proof-nets and control-flow analysis amounts to the analysis of computation paths in the proof-net. We focus on a context-free analysis known in the literature as 0-CFA, and develop an algorithm for the analysis. The algorithm for 0-CFA performs dynamic transitive closure of a graph that is based on the judgement associated with the proof-net. Correctness of the algorithm relies on the correspondence between proof-nets and certain kinds of strategies in game semantics.
Chris Hankin, Rajagopal Nagarajan, Prahladavaradan Sampath
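As background on 0-CFA itself, here is the standard constraint-based formulation as a naive fixpoint in Haskell (our sketch; the paper instead derives a dynamic transitive closure from the proof-net).

```haskell
import qualified Data.Map as M
import qualified Data.Set as S

-- Labelled lambda terms: every subterm carries an integer label.
data Term = Var Int String | Lam Int String Term | App Int Term Term

label :: Term -> Int
label (Var l _)   = l
label (Lam l _ _) = l
label (App l _ _) = l

-- Flow cache: for each label (Left l) or variable (Right x), the set
-- of lambda abstractions (identified by label) that may flow there.
type Cache = M.Map (Either Int String) (S.Set Int)

get :: Either Int String -> Cache -> S.Set Int
get = M.findWithDefault S.empty

add :: Either Int String -> S.Set Int -> Cache -> Cache
add = M.insertWith S.union

-- All lambda abstractions, indexed by label.
lams :: Term -> M.Map Int (String, Term)
lams (Var _ _)   = M.empty
lams (Lam l x b) = M.insert l (x, b) (lams b)
lams (App _ f a) = M.union (lams f) (lams a)

subterms :: Term -> [Term]
subterms t = t : case t of
  Var _ _   -> []
  Lam _ _ b -> subterms b
  App _ f a -> subterms f ++ subterms a

-- One propagation pass over all subterms; for each application, every
-- abstraction flowing to the operator binds the argument's flows and
-- returns its body's flows.
step :: M.Map Int (String, Term) -> [Term] -> Cache -> Cache
step ls ts c0 = foldl one c0 ts
  where
    one c (Var l x)   = add (Left l) (get (Right x) c) c
    one c (Lam l _ _) = add (Left l) (S.singleton l) c
    one c (App l f a) = foldl flow c (S.toList (get (Left (label f)) c))
      where
        flow c' lamL = case M.lookup lamL ls of
          Just (x, b) -> add (Left l) (get (Left (label b)) c')
                           (add (Right x) (get (Left (label a)) c') c')
          Nothing     -> c'

analyse :: Term -> Cache
analyse t = fix (step (lams t) (subterms t)) M.empty
  where fix f x = let y = f x in if y == x then y else fix f y

-- (\x. x) (\y. y); label 0 is the whole application.
main :: IO ()
main = do
  let t = App 0 (Lam 1 "x" (Var 2 "x")) (Lam 3 "y" (Var 4 "y"))
  print (S.toList (get (Left 0) (analyse t)))  -- [3]: \y.y may flow out
```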
On Extracting Static Semantics
Abstract
We examine the problem of automatically extracting a static semantics from a language’s semantic definition. Traditional approaches require manual construction of static and dynamic semantics, followed by a proof that the two are consistent. As languages become more complex, the static analyses also become more complex, and consistency proofs have typically been challenging. We need to find techniques for automatically constructing static analyses that are provably correct. We study the issues of developing such techniques and propose avenues of research in this area. We find that significant advances are required before realizing the practical extraction of static semantics from language definitions.
John Hannan
Foundations of the Bandera Abstraction Tools
Abstract
Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking.
The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera’s data abstraction mechanism which is used to reduce the cardinality (and the program’s state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
John Hatcliff, Matthew B. Dwyer, Corina S. Păsăreanu, Robby
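A sketch of the kind of data abstraction described (ours, not Bandera's actual abstraction-specification machinery): collapsing integers to signs, with an abstract operation that returns a set of possible results.

```haskell
-- Collapsing Int to a three-token abstract domain shrinks the state
-- space a model checker must explore.
data Sign = Neg | Zero | Pos deriving (Eq, Show)

absInt :: Integer -> Sign
absInt n
  | n < 0     = Neg
  | n == 0    = Zero
  | otherwise = Pos

-- Abstract increment: a set of possible results, since Neg + 1 may be
-- Neg or Zero; a model checker branches nondeterministically on each.
incS :: Sign -> [Sign]
incS Neg  = [Neg, Zero]
incS Zero = [Pos]
incS Pos  = [Pos]

main :: IO ()
main = print (incS (absInt (-5)))  -- [Neg,Zero]
```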
Types in Program Analysis
Abstract
This paper surveys type-based program analysis with an emphasis on the polyvariant program analyses that can be obtained using conjunctive (or intersection) types and parametric polymorphism. In particular, we show 1) how binding-time analysis and strictness analysis are variations of a common framework based on conjunctive types, 2) that the standard abstract interpretation used for strictness analysis is equivalent to the type-based analysis, and 3) that the conjunctive strictness analysis can be made more modular by blending conjunctions with parametric polymorphism.
Thomas Jensen
Flow Logic: A Multi-paradigmatic Approach to Static Analysis
Abstract
Flow logic is an approach to static analysis that separates the specification of when an analysis estimate is acceptable for a program from the actual computation of the analysis information. It allows one not only to combine a variety of programming paradigms but also to link up with state-of-the-art developments in classical approaches to static analysis, in particular data flow analysis, constraint-based analysis and abstract interpretation. This paper gives a tutorial on flow logic and explains the underlying methodology; the multi-paradigmatic approach is illustrated by a number of examples including functional, imperative, object-oriented and concurrent constructs.
Hanne Riis Nielson, Flemming Nielson
Structure-Preserving Binary Relations for Program Abstraction
Abstract
An abstraction is a property-preserving contraction of a program's model into a smaller one that is suitable for automated analysis. An abstraction must be sound, and ideally, complete. Soundness and completeness arguments are intimately connected to the abstraction process, and approaches based on homomorphisms and Galois connections are commonly employed to define abstractions and prove their soundness and completeness. This paper develops Mycroft and Jones's proposal that an abstraction should be stated as a form of structure-preserving binary relation. Mycroft-Jones-style relations are defined, developed, and employed in characterizations of the homomorphism and Galois-connection approaches to abstraction.
David A. Schmidt

Program Transformation

Principles of Inverse Computation and the Universal Resolving Algorithm
Abstract
We survey fundamental concepts in inverse programming and present the Universal Resolving Algorithm (URA), an algorithm for inverse computation in a first-order, functional programming language. We discuss the principles behind the algorithm, including a three-step approach based on the notion of a perfect process tree, and demonstrate our implementation with several examples. We explain the idea of a semantics modifier for inverse computation which allows us to perform inverse computation in other programming languages via interpreters.
Sergei Abramov, Robert Glück
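To fix the specification that URA solves, a naive generate-and-test sketch (ours; the names are illustrative, and URA itself searches a perfect process tree of the program rather than enumerating the input domain).

```haskell
-- Inverse computation: given f and an output y, find the inputs x
-- with f x == y.  This brute force only works on a finite, explicitly
-- enumerated domain; URA avoids any such enumeration.
invert :: Eq b => (a -> b) -> [a] -> b -> [a]
invert f domain y = [x | x <- domain, f x == y]

main :: IO ()
main = print (invert (^ 2) [-10 .. 10] 49 :: [Integer])  -- [-7,7]
```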
A Symmetric Approach to Compilation and Decompilation
Abstract
Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of whether to perform an evaluation or to emit code that represents this evaluation.
We substantiate this observation with two source interpreters and two target interpreters. We first consider a source language of arithmetic expressions and a target language for a stack machine, and then the λ-calculus and the SECD-machine language. In each case, we prove that the target-to-source compiler is a left inverse of the source-to-target compiler, i.e., that it is a decompiler.
In the context of partial evaluation, the binding-time shift of going from a source interpreter to a compiler is classically referred to as a Futamura projection. By symmetry, it seems logical to refer to the binding-time shift of going from a target interpreter to a compiler as a Futamura embedding.
Mads Sig Ager, Olivier Danvy, Mayer Goldberg
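A minimal Haskell sketch (ours, not the paper's development) of the first example pair: the decompiler is a target interpreter that emits source terms instead of computing values, and it is a left inverse of the compiler.

```haskell
-- Source language: arithmetic expressions; target: stack-machine code.
data Exp  = Lit Int | Add Exp Exp deriving (Eq, Show)
data Inst = Push Int | IAdd       deriving (Eq, Show)

-- Source-to-target compiler.
compile :: Exp -> [Inst]
compile (Lit n)     = [Push n]
compile (Add e1 e2) = compile e1 ++ compile e2 ++ [IAdd]

-- Target-to-source compiler: a stack-machine interpreter whose stack
-- holds expressions rather than numbers -- it emits code representing
-- each evaluation instead of performing it.
decompile :: [Inst] -> Exp
decompile = go []
  where
    go [e]            []            = e
    go st             (Push n : is) = go (Lit n : st) is
    go (e2 : e1 : st) (IAdd   : is) = go (Add e1 e2 : st) is
    go _              _             = error "ill-formed code"

main :: IO ()
main = do
  let e = Add (Lit 1) (Add (Lit 2) (Lit 3))
  print (compile e)
  print (decompile (compile e) == e)  -- True: decompile is a left inverse
```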
The Abstraction and Instantiation of String-Matching Programs
Abstract
We consider a naive, quadratic string matcher testing whether a pattern occurs in a text; we equip it with a cache mediating its access to the text; and we abstract the traversal policy of the pattern, the cache, and the text. We then specialize this abstracted program with respect to a pattern, using the off-the-shelf partial evaluator Similix.
Instantiating the abstracted program with a left-to-right traversal policy yields the linear-time behavior of Knuth, Morris and Pratt’s string matcher. Instantiating it with a right-to-left policy yields the linear-time behavior of Boyer and Moore’s string matcher.
Torben Amtoft, Charles Consel, Olivier Danvy, Karoline Malmkjær
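The paper's starting point in a few lines of Haskell (our rendering; the paper's matcher is additionally equipped with a cache and an abstracted traversal policy before being specialized with Similix).

```haskell
import Data.List (isPrefixOf, tails)

-- The naive quadratic matcher: try the pattern at every text position.
-- Specializing a cached version of such a matcher with respect to the
-- pattern can yield the linear-time behavior of KMP (left-to-right
-- traversal) or Boyer-Moore (right-to-left traversal).
naiveMatch :: Eq a => [a] -> [a] -> Bool
naiveMatch pat txt = any (pat `isPrefixOf`) (tails txt)

main :: IO ()
main = print (naiveMatch "aab" "aaaab")  -- True
```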
WSDFU: Program Transformation System Based on Generalized Partial Computation
Abstract
Generalized Partial Computation (GPC) is a program transformation method utilizing partial information about input data and auxiliary functions as well as the logical structure of a source program. GPC uses both an inference engine such as a theorem prover and a classical partial evaluator to optimize programs. Therefore, GPC is more powerful than classical partial evaluators but harder to implement and control. We have implemented an experimental GPC system called WSDFU (Waseda Simplify-Distribute-Fold-Unfold). This paper demonstrates the power of the program transformation system as well as its theorem prover and discusses some future works.
Yoshihiko Futamura, Zenjiro Konishi, Robert Glück
Homeomorphic Embedding for Online Termination of Symbolic Methods
Abstract
Well-quasi orders in general, and homeomorphic embedding in particular, have gained popularity to ensure the termination of techniques for program analysis, specialisation, transformation, and verification. In this paper we survey and discuss this use of homeomorphic embedding and clarify the advantages of such an approach over one using well-founded orders. We also discuss various extensions of the homeomorphic embedding relation. We conclude with a study of homeomorphic embedding in the context of metaprogramming, presenting some new (positive and negative) results and open problems.
Michael Leuschel
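A Haskell rendering of the core relation (ours; it assumes ground first-order terms with fixed-arity symbols, while the paper also treats variables and extensions).

```haskell
-- First-order terms: a symbol applied to arguments (a constant is a
-- symbol with no arguments).
data Term = T String [Term] deriving (Eq, Show)

-- Homeomorphic embedding: s embeds in t either by diving into some
-- argument of t, or by coupling equal root symbols and embedding the
-- arguments pointwise.
embeds :: Term -> Term -> Bool
embeds s (T g ts) = dive || couple
  where
    dive   = any (embeds s) ts
    couple = case s of
      T f ss -> f == g && length ss == length ts
                       && and (zipWith embeds ss ts)

main :: IO ()
main = do
  let a   = T "a" []
      f x = T "f" [x]
      g x y = T "g" [x, y]
  print (embeds (f a) (g a (f (f a))))  -- True: dive, then couple
  print (embeds (f a) a)                -- False
```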
Simple Driving Techniques
Abstract
Driving was introduced as a program transformation technique by Valentin Turchin in papers around 1980. It was intended for the programming language REFAL and used in metasystem transitions based on supercompilation. In this paper we present one version of driving for a more conventional Lisp-like language. Our aim is to extract a simple notion of driving and show that even in this tamed form it has much of the power of more general notions of driving. Our driving technique may be used to simplify functional programs that use function composition, and will often be able to remove intermediate data structures used in computations.
Mads Rosendahl
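A before/after Haskell sketch (ours, not from the paper) of the effect described: the driven version of the composition no longer builds the intermediate list.

```haskell
-- Before: the composition materializes the list produced by map.
sumDoubled :: [Int] -> Int
sumDoubled xs = sum (map (* 2) xs)

-- After: the kind of program a driving transformation can produce;
-- the intermediate list never exists.
sumDoubled' :: [Int] -> Int
sumDoubled' []       = 0
sumDoubled' (x : xs) = 2 * x + sumDoubled' xs

main :: IO ()
main = print (sumDoubled [1, 2, 3], sumDoubled' [1, 2, 3])  -- (12,12)
```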
Demonstrating Lambda Calculus Reduction
Abstract
We describe lambda calculus reduction strategies, such as call-by-value, call-by-name, normal order, and applicative order, using big-step operational semantics. We show how to simply and efficiently trace such reductions, and use this in a web-based lambda calculus reducer available at http://www.dina.kvl.dk/~sestoft/lamreduce/.
Peter Sestoft
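A small big-step sketch in Haskell (ours; the paper's reducer traces reductions on the terms themselves, whereas this sketch uses environments and closures) contrasting two of the strategies named above.

```haskell
-- Closed lambda terms, evaluated big-step; the two strategies differ
-- only in what gets bound in the environment.
data Term  = Var String | Lam String Term | App Term Term
data Value = Clo String Term Env
type Env   = [(String, Thunk)]
data Thunk = ByName Term Env   -- unevaluated, re-evaluated per use
           | ByValue Value     -- already evaluated

evalN :: Env -> Term -> Value            -- call-by-name
evalN env (Var x)   = case lookup x env of
  Just (ByName t e) -> evalN e t
  Just (ByValue v)  -> v
  Nothing           -> error ("free variable: " ++ x)
evalN env (Lam x b) = Clo x b env
evalN env (App f a) = case evalN env f of
  Clo x b e -> evalN ((x, ByName a env) : e) b

evalV :: Env -> Term -> Value            -- call-by-value
evalV env (Var x)   = case lookup x env of
  Just (ByValue v)  -> v
  Just (ByName t e) -> evalV e t
  Nothing           -> error ("free variable: " ++ x)
evalV env (Lam x b) = Clo x b env
evalV env (App f a) = case evalV env f of
  Clo x b e -> let v = evalV env a        -- argument forced first
               in  evalV ((x, ByValue v) : e) b

-- (\x. \y. y) omega: call-by-name discards the diverging argument;
-- evalV [] on the same term would loop.
main :: IO ()
main =
  let w     = Lam "z" (App (Var "z") (Var "z"))
      omega = App w w
      k2    = Lam "x" (Lam "y" (Var "y"))
      Clo y _ _ = evalN [] (App k2 omega)
  in  putStrLn ("call-by-name result: \\" ++ y ++ "." ++ y)
```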
From Type Inference to Configuration
Abstract
A product line is a set of products and features with constraints on which subsets are available. Numerous configurators have been made available by product line vendors on the internet, in which procurers can experiment with the different options, e.g. how the selection of one product or feature entails or precludes the selection of another product or feature. We explore an approach to configuration inspired by type inference technology. The main contributions of the paper are a formalization of the configuration problem that includes aspects related to the interactive dialogue between the user and the system, a result stating that the configuration problem thus formalized has at least exponential complexity, and some techniques for computing approximate solutions more efficiently. While a substantial number of papers precede the present one in formalizing configuration as a constraint satisfaction problem, few address the aspects concerning interactivity between the user and the system.
Morten Heine Sørensen, Jens Peter Secher
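A toy Haskell sketch (ours; the features and constraints are invented) of configuration as constraint satisfaction, including the brute force whose exponential cost motivates the paper's approximation techniques.

```haskell
import Control.Monad (replicateM)

-- A toy product line over three features with two constraints:
-- selecting A entails B, and B precludes C.
valid :: [Bool] -> Bool
valid [a, b, c] = (not a || b) && not (b && c)
valid _         = False

-- All complete configurations consistent with a partial selection
-- (Just v = decided, Nothing = still open).  An interactive
-- configurator would use this to tell the user which remaining
-- choices are entailed or precluded.
complete :: [Maybe Bool] -> [[Bool]]
complete partial =
  [ cfg | cfg <- replicateM (length partial) [False, True]
        , and (zipWith ok partial cfg)
        , valid cfg ]
  where ok (Just v) w = v == w
        ok Nothing  _ = True

main :: IO ()
main = print (complete [Just True, Nothing, Nothing])
-- [[True,True,False]]: choosing A forces B and precludes C.
```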
Backmatter
Metadata
Title
The Essence of Computation
Edited by
Torben Æ. Mogensen
David A. Schmidt
I. Hal Sudborough
Copyright year
2002
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-36377-4
Print ISBN
978-3-540-00326-7
DOI
https://doi.org/10.1007/3-540-36377-7