
1995 | Book

TAPSOFT '95: Theory and Practice of Software Development

6th International Joint Conference CAAP/FASE, Aarhus, Denmark, May 22–26, 1995, Proceedings

Edited by: Peter D. Mosses, Mogens Nielsen, Michael I. Schwartzbach

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

This volume presents the proceedings of the Sixth International Joint Conference on the Theory and Practice of Software Development, TAPSOFT '95, held in Aarhus, Denmark in May 1995. TAPSOFT '95 celebrates the 10th anniversary of this conference series, started in Berlin in 1985 to bring together theoretical computer scientists and software engineers (researchers and practitioners) with a view to discussing how formal methods can usefully be applied in software development.
The volume contains seven invited papers, among them one by Vaughan Pratt on the recently revealed bug in the Pentium chip, and 44 revised full papers selected from a total of 147 submissions. In addition, the TAPSOFT '95 proceedings contain 10 tool descriptions.

Table of contents

Frontmatter
A decade of TAPSOFT
Aspects of progress and prospects in theory and practice of software development

The relationship between the theory and practice of software development, against the background of the driving forces of the 1970s and 1980s, was the main topic of the first TAPSOFT conference in 1985. After a decade of TAPSOFT, the intention of this survey is not so much to give a complete review of the TAPSOFT conferences as to discuss the general background and to focus on specific aspects of theory and practice which seem to be typical for TAPSOFT: the support of software development by algebraic methods, techniques and tools, in particular the corresponding activities at TU Berlin. The survey shows that the kinds of progress in the decades before and after TAPSOFT'85 are quite different: before 1985 the focus was more on the development of new concepts, while consolidation and attempts to adapt to practical needs dominated after 1985. Finally, the expectations for the future of theory and practice of software development are discussed against the background of the driving forces of the 1990s, in the hope that TAPSOFT will be able to meet these requirements.

Hartmut Ehrig, Bernd Mahr
Theory and practice of software development
Stages in a debate

Starting from the experience gained in organizing TAPSOFT'85, the paper discusses the place of formal methods in software development. It distinguishes two notions of theory: the mathematical science of computation and the treatment of computing as a human activity. An adequate software theory needs to take both theoretical perspectives into account. The paper therefore explores the borderline between formalization and human activity in several directions: the role and scope of formalized procedures, the relation between formal models and situated use, the process of learning in software development, and the ways computer programs become effective in use. Fundamental assumptions underlying formal methods and their relation to emancipatory approaches such as participatory design are discussed. The paper closes by calling for a dialogical framework for further pursuing these questions.

Christiane Floyd
Rational spaces and set constraints

Set constraints are inclusions between expressions denoting sets of ground terms. They have been used extensively in program analysis and type inference. In this paper we investigate the topological structure of the spaces of solutions to systems of set constraints. We identify a family of topological spaces called rational spaces, which formalize the notion of a topological space with a regular or self-similar structure, such as the Cantor discontinuum or the space of runs of a finite automaton. We develop the basic theory of rational spaces and derive, from topological principles, generalizations and proofs of some results in the literature on set constraints.

Dexter Kozen
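
For orientation, a small hypothetical example (not taken from the paper): over the signature {zero, succ}, the set constraint X ⊇ {zero} ∪ succ(X) has as its least solution the set of all ground numerals zero, succ(zero), succ(succ(zero)), …; the solution spaces of such systems are the objects whose topology the paper studies.
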
Formal methods and social context in software development

Formal methods have not been accepted to the extent that many computing scientists had hoped. This paper explores some reasons for that fact, and proposes some ways to make progress. One major problem has been that formal methods have not taken sufficient account of the social context of computer systems. For example, social context causes a continuous evolution of requirements for large complex systems. This implies that designs, specifications and code must also evolve with requirements, and that traceability is important. We discuss a traceability technique called hyper-requirements. To better understand social context, we discuss ethnomethodology, a branch of sociology, and situated abstract data types, which help bridge the gap between the technical and the social. These attempt to provide a scientific basis for requirements capture. Some case studies are briefly described. We distinguish between small, large and huge grain formal methods, arguing that small grain methods do not scale up. This motivates our discussions of software composition and a new paradigm of “Domain Specific Formal Methods.”

Joseph A. Goguen, Luqi
Testing can be formal, too

The paper presents a theory of program testing based on formal specifications. The formal semantics of the specifications is the basis for a notion of an exhaustive test set. Under some minimal hypotheses on the program under test, the success of this test set is equivalent to the satisfaction of the specification. The selection of a finite subset of the exhaustive test set can be seen as the introduction of further hypotheses on the program, called selection hypotheses. Several examples of commonly used selection hypotheses are presented. Another problem is the observability of the results of a program with respect to its specification: contrary to common belief, the use of a formal specification is not always sufficient to decide whether a test execution is a success. As soon as the specification deals with more abstract entities than the program, program results may appear in a form which is not obviously equivalent to the specified results. A solution to this problem is proposed in the case of algebraic specifications.

Marie-Claude Gaudel
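
As a toy illustration of the idea (a hypothetical stack specification, not an example from the paper): derive tests from an algebraic axiom and cut the exhaustive test set down with a selection hypothesis bounding term depth.

    # Minimal sketch: test an implementation against the axiom
    # pop(push(s, x)) = s over ground terms of bounded depth
    # (a "regularity" selection hypothesis on the program under test).
    def push(s, x):
        return s + (x,)

    def pop(s):
        return s[:-1]

    ground_stacks = [(), (1,), (1, 2)]  # finite subset of the exhaustive test set
    for s in ground_stacks:
        for x in (0, 1):
            assert pop(push(s, x)) == s  # a test succeeds iff the axiom holds here
    print("all selected tests passed")
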
Anatomy of the Pentium bug

The Pentium computer chip's division algorithm relies on a table from which five entries were inadvertently omitted, with the result that 1738 single precision dividend-divisor pairs yield relative errors whose most significant bit is uniformly distributed from the 14th to the 23rd (least significant) bit. This corresponds to a rate of one error every 40 billion random single precision divisions. The same general pattern appears at double precision, with an error rate of one in every 9 billion divisions or 75 minutes of division time. These rates assume randomly distributed data. The distribution of the faulty pairs themselves, however, is far from random, with the effect that if the data is so nonrandom as to be just the constant 1, then random calculations started from that constant produce a division error once every few minutes, and these errors will sometimes propagate through many more steps. A much higher rate yet is obtained when dividing small (<100) integers “bruised” by subtracting one millionth, where every 400 divisions will see a relative error of at least one in a million. The software engineering implications of the bug include the observations that the method of exercising reachable components cannot detect reachable components mistakenly believed unreachable, and that hand-checked proofs build false confidence.

Vaughan Pratt
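
A minimal sketch of the “bruised integer” experiment described above, assuming nothing beyond the abstract: divide small bruised integers and compare hardware division against exact rational arithmetic. On a flawed Pentium roughly one division in 400 would be flagged; on a correct FPU none are.

    # Flag bruised divisions p/q whose relative error is at least 1e-6,
    # using exact rational arithmetic as the reference.
    from fractions import Fraction

    BRUISE = Fraction(1, 1_000_000)
    THRESHOLD = Fraction(1, 1_000_000)

    flagged = trials = 0
    for p in range(1, 100):
        for q in range(1, 100):
            xp, xq = Fraction(p) - BRUISE, Fraction(q) - BRUISE
            hw = float(xp) / float(xq)  # division performed by the FPU
            ref = xp / xq               # exact quotient
            if abs(Fraction(hw) - ref) / ref >= THRESHOLD:
                flagged += 1
            trials += 1
    print(f"{flagged} of {trials} bruised divisions off by at least 1e-6")
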
Rational mechanics and natural mathematics

Chu spaces have found applications in computer science, mathematics, and physics. They enjoy a useful categorical duality analogous to that of lattice theory and projective geometry. As natural mathematics Chu spaces borrow ideas from the natural sciences, particularly physics, while as rational mechanics they cast Hamiltonian mechanics in terms of the interaction of body and mind. This paper addresses the chief stumbling block for Descartes' 17th-century philosophy of mind-body dualism: how can the fundamentally dissimilar mental and physical planes causally interact with each other? We apply Cartesian logic to reject not only divine intervention, preordained synchronization, and the eventual mass retreat to monism, but also an assumption Descartes himself somehow neglected to reject: that causal interaction within these planes is an easier problem than between them. We use Chu spaces and residuation to derive all causal interaction, both between and within the two planes, from a uniform and algebraically rich theory of between-plane interaction alone. Lifting the two-valued Boolean logic of binary relations to the complex-valued fuzzy logic of quantum mechanics transforms residuation into a natural generalization of the inner product operation of a Hilbert space, and demonstrates that this account of causal interaction is of essentially the same form as the Heisenberg-Schrödinger quantum-mechanical solution to analogous problems of causal interaction in physics.

Vaughan Pratt
First-order logic on finite trees

We present effective criteria for first-order definability of regular tree languages. It is known that over words the absence of modulo counting (the “noncounting property”) characterizes the expressive power of first-order logic (McNaughton, Schützenberger), whereas there exist non-counting regular tree languages which are not first-order definable. We present new conditions on regular tree languages (more precisely, on tree automata) which imply nondefinability in first-order logic. One method is based on tree homomorphisms, which allow one to deduce the nondefinability of one tree language from that of another. Additionally, we introduce a structural property of tree automata (the so-called Λ-∨-patterns) which also causes tree languages to be undefinable in first-order logic. Finally, it is shown that this notion does not yet give a complete characterization of first-order logic over trees. The proofs rely on the method of Ehrenfeucht-Fraïssé games.

Andreas Potthoff
Decidability of equivalence for deterministic synchronized tree automata

Synchronized tree automata allow limited communication between computations in independent subtrees of the input. This enables them to verify, for instance, the equality of two unary subtrees of unlimited size. The class of tree languages recognized by synchronized tree automata is strictly included in the context-free tree languages. As our main result we show that equivalence of tree languages recognized by deterministic synchronized tree automata can be effectively decided. This contrasts with the earlier undecidability result for the equivalence problem for nondeterministic synchronized tree automata.

Kai Salomaa
The equivalence problem for letter-to-letter bottom-up tree transducers is solvable

Letter-to-letter bottom-up tree transducers are investigated in this paper. By encoding the tree transformations they define as relabelings, we establish the decidability of equivalence for this class of tree transducers. Some extensions are then given.

Yves Andre, Francis Bossut
πI: A symmetric calculus based on internal mobility
Davide Sangiorgi
Complete inference systems for weak bisimulation equivalences in the π-calculus

Proof systems for weak bisimulation equivalences in the π-calculus are presented, and their soundness and completeness are shown. The proofs of the completeness results rely on the notion of symbolic bisimulation. Two versions of the π-calculus are investigated, one without and the other with the mismatch construction. For each version of the calculus, proof systems for both late and early weak bisimulation equivalences are studied; thus there are four proof systems in all. These proof systems are related in a natural way: the proof systems for early and late equivalences differ only in the inference rule for the input prefix, while the proof system for the version of the π-calculus with mismatch is obtained by adding a single inference rule to the system for the version without it.

Huimin Lin
Reasoning about higher-order processes

We address the specification and verification problem for process calculi such as Chocs, CML and Facile where processes or functions are transmissible values. Our work takes place in the context of a static treatment of restriction and of a bisimulation-based semantics. As a paradigmatic and simple case we concentrate on (Plain) Chocs. We show that Chocs bisimulation can be characterized by an extension of Hennessy-Milner logic including a constructive implication, or function space constructor. This result is a non-trivial extension of the classical characterization result for labelled transition systems. In the second part of the paper we address the problem of developing a proof system for the verification of process specifications. Building on previous work for CCS we present a sound proof system for a Chocs sub-calculus not including restriction. We present two completeness results: one for the full specification language using an infinitary system, and one for a special class of so-called well-described specifications using a finitary system.

Roberto M. Amadio, Mads Dam
Confluence of processes and systems of objects

An extension to the theory of confluence in the process calculus CCS is presented. The theory is generalized to an extension of the π-calculus. This calculus is used to provide semantics by translation for a parallel object-oriented programming language. The confluence theory is applied to prove the indistinguishability, in an arbitrary program context, of two class definitions which generate binary tree data structures, one of which allows concurrent operations.

Xinxin Liu, David Walker
An algebraic approach to temporal logic

The sequential calculus is an algebraic calculus, intended for reasoning about phenomena with a duration and their sequencing. It can be specialized to various domains used for reasoning about programs and systems, including Tarski's calculus of binary relations, Kleene's regular expressions, Hoare's CSP and Dijkstra's regularity calculus. In this paper we use the sequential calculus as a tool for algebraizing temporal logics. We show that temporal operators are definable in terms of sequencing, and we show how a specific logic may be selected by introducing additional axioms. All axioms of the complete proof system for discrete linear temporal logic (given in [9]) are obtained as theorems of sequential algebra. Our work embeds temporal logic into an algebra naturally equipped with sequencing constructs, and in which recursion is definable. This could be a first step towards a design calculus for transforming temporal specifications by stepwise refinement into executable programs. This paper is an extended abstract of a technical report [5] containing full proofs (available by ftp); most proofs have been omitted and simplifying assumptions were made to ease the presentation.

Burghard von Karger
On behavioural abstraction and behavioural satisfaction in higher-order logic

The behavioural semantics of specifications with higher-order formulae as axioms is analyzed. A characterization of behavioural abstraction via behavioural satisfaction of formulae in which the equality symbol is interpreted as indistinguishability, due to Reichel and recently generalized to the case of first-order logic by Bidoit et al., is further generalized to the higher-order case. The fact that higher-order logic is powerful enough to express the indistinguishability relation is used to characterize behavioural satisfaction in terms of ordinary satisfaction, and to develop new methods for reasoning about specifications under behavioural semantics.

Martin Hofmann, Donald Sannella
Assumption/guarantee specifications in linear-time temporal logic (extended abstract)

Previous work on assumption/guarantee specifications typically reasons about relevant properties at the semantic level or defines a special-purpose logic. We feel it is beneficial to formulate such specifications in a more widely used formalism. Specifically, we adopt the linear-time temporal logic (LTL) of Manna and Pnueli. We find that, with past temporal operators, LTL admits a succinct syntactic formulation of assumption/guarantee specifications. This contrasts, in particular, with the work by Abadi and Lamport using TLA, where working at the syntactic level is more complicated. Our composition rules are derived entirely within LTL and can also handle internal variables. We had to overcome a number of technical problems in this pursuit, in particular the problem of extracting the safety closure of a temporal formula. As a by-product, we identify general conditions under which the safety closure can be expressed in a succinct way that facilitates syntactic manipulation.

Bengt Jonsson, Yih-Kuen Tsay
Fine hierarchy of regular ω-languages

By applying descriptive set theory we obtain several facts on the fine structure of regular ω-languages considered by K. Wagner. We present quite different, shorter proofs of his main results and obtain new results. Our description of the fine structure is new, very clear and automata-free. We also prove a closure property of the fine structure under Boolean operations. Our results demonstrate deep interconnections between descriptive set theory and the theory of ω-languages.

Victor Selivanov
Computing the Wadge degree, the Lifschitz degree, and the Rabin index of a regular language of infinite words in polynomial time

Based on a detailed graph-theoretical analysis, Wagner's fundamental results of 1979 are turned into efficient algorithms to compute the Wadge degree, the Lifschitz degree, and the Rabin index of a regular ω-language: the two former can be computed in time O(f²qb + k log k) and the latter in time O(f²qb), if the language is represented by a deterministic Muller automaton over an alphabet of cardinality b, with f accepting sets, q states, and k strongly connected components.

Thomas Wilke, Haiseung Yoo
Semi-trace morphisms and rational transductions

We investigate trace and semi-trace morphisms from an algebraic point of view, by means of rational transductions. The main result is a characterization of (semi-)trace morphisms which are equivalent to some rational transduction. From this result we easily characterize context-free trace morphisms.

Pierre-André Wacrenier
Nonfinite axiomatizability of shuffle inequalities

There is a set of inequations t ≤ t′ whose models are precisely the algebras in the variety of ordered algebras generated by the algebras L = (P, ·, ⊗, 1), where P consists of all subsets of the free monoid Σ*, B·C = {uv : u ∈ B, v ∈ C}, and B ⊗ C is the shuffle product of the two languages. We show that there is no finite set of such inequations.

Stephen L. Bloom, Zoltán Ésik
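
For readers unfamiliar with the shuffle operation, a minimal sketch of its standard definition (an illustration, not code from the paper); B ⊗ C lifts this word-level operation to languages by taking the union over all pairs of words:

    # All interleavings of the words u and v that keep the letters of
    # each word in their original relative order.
    def shuffle(u, v):
        if not u:
            return {v}
        if not v:
            return {u}
        return ({u[0] + w for w in shuffle(u[1:], v)} |
                {v[0] + w for w in shuffle(u, v[1:])})

    print(sorted(shuffle("ab", "c")))  # ['abc', 'acb', 'cab']
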
On the category of Petri net computations

We introduce the notion of strongly concatenable process as a refinement of concatenable processes [3] which can be expressed axiomatically via a functor Q[_] from the category of Petri nets to an appropriate category of symmetric strict monoidal categories, in the precise sense that, for each net N, the strongly concatenable processes of N are isomorphic to the arrows of Q[N]. In addition, we identify a coreflection right adjoint to Q[_] and characterize its replete image, thus yielding an axiomatization of the category of net computations.

Vladimiro Sassone
High undecidability of weak bisimilarity for Petri nets

It is shown that the problem whether two labelled place/transition Petri nets (with initial markings) are weakly bisimilar is highly undecidable: it resides at least at level ω of the hyperarithmetical hierarchy, while on the other hand it belongs to Σ¹₁ (the first level of the analytical hierarchy). This contrasts with the Π⁰₁-completeness of the same problem for trace (language) equivalence. Relations to similar problems for the process algebra BPP (Basic Parallel Processes) are also discussed.

Petr Jančar
Polynomial algorithms for the synthesis of bounded nets

The so-called synthesis problem for nets, which consists in deciding whether a given graph is isomorphic to the case graph of some net, and then constructing the net, has been solved in the literature for various types of nets, ranging from elementary nets to Petri nets. The common principle for the synthesis is the idea of regions in graphs, representing possible extensions of places in nets. However, no practical algorithm has been defined so far for the synthesis. We give here explicit algorithms solving in polynomial time the synthesis problem for bounded nets from regular languages or from finite automata.

Eric Badouel, Luca Bernardinello, Philippe Darondeau
Semi-completeness of hierarchical and super-hierarchical combinations of term rewriting systems

In this paper, we study modular aspects of hierarchical and super-hierarchical combinations of term rewriting systems. In particular, a sufficient condition for modularity of semi-completeness of hierarchical and super-hierarchical combinations is proposed. We first establish modularity of weak normalization for this class (defined by the sufficient condition) and modularity of semi-completeness for a class of crosswise independent unions. From these results, we obtain modularity of semi-completeness for a class of hierarchical and super-hierarchical combinations. Our results generalize the semi-completeness results of Ohlebusch [14] and Middeldorp and Toyama [13]. The notion of crosswise independent unions is a generalization of both constructor-sharing unions and Plump's crosswise disjoint unions.

M. R. K. Krishna Rao
Lazy narrowing: Strong completeness and eager variable elimination (extended abstract)

Narrowing is an important method for solving unification problems in equational theories that are presented by confluent term rewriting systems. Because narrowing is a rather complicated operation, several authors have studied calculi in which narrowing is replaced by simpler inference rules. This paper is concerned with one such calculus. Contrary to what has been stated in the literature, we show that the calculus lacks strong completeness, so selection functions to cut down the search space are not applicable. We prove completeness of the calculus and establish an interesting connection between its strong completeness and the completeness of basic narrowing. We also address the eager variable elimination problem. It is known that many redundant derivations can be avoided if the variable elimination rule, one of the inference rules of our calculus, is given precedence over the other inference rules. We prove the completeness of a restricted variant of eager variable elimination in the case of orthogonal term rewriting systems.

Satoshi Okui, Aart Middeldorp, Tetsuo Ida
On the expressive power of algebraic graph grammars with application conditions

In this paper we introduce positive, negative and conditional application conditions for the single and the double pushout approach to graph transformation. To give the reader some intuition about how the formalism can be used for specification, we consider consistency and an interesting representation for specific conditions, namely (conditional) equations. Using a graph grammar notion without nonterminal graphs, i.e. one in which each derivation step leads to a graph of the generated language, we prove a hierarchy: graph grammars over rules with positive application conditions are as powerful as those over rules without any extra application condition. Introducing negative application conditions makes the formalism more powerful. Graph grammars over rules with conditional application conditions are on top of the hierarchy.

Annika Wagner
Generated models and the ω-rule: The nondeterministic case

A language for specifying nondeterministic operations which generalizes the equational specification language is introduced. Then, various notions of generated multimodels are discussed and sufficient conditions for the existence of quasi-initial semantics of nondeterministic specifications are given. Two calculi are introduced: NEQ and NIP. The former is sound and complete with respect to the class of all multimodels. The latter is an extension of the former with the ω-rule. It is sound and complete with respect to one of the classes of the generated multimodels. The calculi reduce to the respective deterministic calculi whenever the specification involves only deterministic operations.

Michal Walicki, Sigurd Meldal
CPO models for a class of GSOS languages

In this paper, we present a general way of giving denotational semantics to a class of languages equipped with an operational semantics that fits the GSOS format of Bloom, Istrail and Meyer. The canonical model used for this purpose will be Abramsky's domain of synchronization trees, and the denotational semantics automatically generated by our methods will be guaranteed to be fully abstract with respect to the finitely observable part of the bisimulation preorder. In the process of establishing the full abstraction result, we also obtain several general results on the bisimulation preorder (including a complete axiomatization for it), and give a novel operational interpretation of GSOS languages.

Luca Aceto, Anna Ingólfsdóttir
Statecharts, transition structures and transformations

Statecharts are state-transition machines endowed with hierarchy on states and parallelism on transitions. It is shown that a statechart is described by a pair of relations over transitions (a transition structure), one describing causality and the other a notion of asymmetric independence. A statechart can be effectively constructed from its transition structure. Transition structures corresponding to a subclass of statecharts are characterized. Natural notions of morphisms among transition structures allow one to define classes of statechart transformations which preserve behaviour.

Adriano Peron
An imperative object calculus

We develop an imperative calculus of objects. Its main type constructor is the one for object types, which incorporate variance annotations and Self types. A subtyping relation between object types supports object subsumption. The type system for objects relies on unusual but beneficial assumptions about the possible subtypes of an object type. With the addition of polymorphism, the calculus can express classes and inheritance.

Martín Abadi, Luca Cardelli
A refinement of import/export declarations in modular logic programming and its semantics

Encapsulation constructs with import/export declarations are the structuring facility offered in most commercial Prolog systems. However, real-life applications have been shown to require a finer information exchange between encapsulated pieces of code. In this paper, a refinement of import/export declarations for modules of logic programs is presented. It offers a stricter form of communication between the modules and a larger variety of visibility states for their predicates, with the standard approaches as special cases. Model-theoretic, fixpoint and operational semantics of this module system are given and proved equivalent. Instead of resorting to other logics, all these semantics extend those of Horn clause logic, using concepts already common there. In addition, the module system has been naturally translated to Horn clause logic by distinguishing the predicates within a module according to the module's interface declarations; a form of equivalence with the other semantics of the system is given. The employed transformation has also provided a basis for a preprocessor-based implementation of the module system.

Isambo Karali, Constantin Halatsis
Strictness and totality analysis with conjunction

We extend the strictness and totality analysis of [12] by allowing conjunction at all levels rather than at the top-level. We prove the strictness and totality analysis correct with respect to a denotational semantics and finally construct an algorithm for inferring the strictness and totality properties.

Kirsten Lackner Solberg
Generic techniques for source-level debugging and dynamic program slicing

Algebraic specifications have been used successfully as a formal basis for software development. This paper discusses how the origin and dynamic dependence relations implicitly defined by an algebraic specification can be used to define powerful language-specific tools. In particular, the generation of tools for source-level debugging and dynamic program slicing from specifications of interpreters will be addressed.

Frank Tip
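
A minimal sketch of the slicing side (hypothetical dependence data, not the paper's machinery): once the dynamic dependence relation is available, a dynamic slice is just a backward closure over it.

    # Toy dynamic slice: the statements on which the value computed by a
    # given statement depends, via backward closure over dynamic
    # dependence edges (stmt -> statements whose values it read).
    deps = {1: set(), 2: set(), 3: {1}, 4: {3}}

    def dyn_slice(stmt):
        out, work = {stmt}, [stmt]
        while work:
            for d in deps[work.pop()]:
                if d not in out:
                    out.add(d)
                    work.append(d)
        return out

    print(sorted(dyn_slice(4)))  # [1, 3, 4]: statement 2 is not in the slice
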
Reasoning with executable specifications

Starting from the specification of a small imperative programming language and the description of two program transformations on this language, we formally prove the correctness of these transformations. The formal specifications are given in a single format, and can be compiled both into executable tools and into collections of definitions for reasoning in a theorem prover. This work is a case study of an environment integrating executable tool generation and formal reasoning about the generated tools.

Yves Bertot, Ranan Fraer
Calculating software generators from solution specifications

We have successfully demonstrated an automated transformation system that compiles practical software modules from the semantic specification of a domain-specific application design language. The integrated suite of transformation and translation tools represents a new level of design automation for software. Although there is much more that can be done to further improve the performance of generated code, the prototype system demonstrates the feasibility of this approach. The implementation of type-parametric theorems as transformation tactics for HOT has not been done before. It remains to be seen whether algebra-specific transformations can be incorporated in the same tool by referring to a database of algebraic laws. In the current system, algebra-specific transformations are performed by term rewriting, which is an entirely different paradigm.

Richard B. Kieburtz, Francoise Bellegarde, Jef Bell, James Hook, Jeffrey Lewis, Dino Oliva, Tim Sheard, Lisa Walton, Tong Zhou
Comparing flow-based binding-time analyses

Binding-time analyses based on flow analysis have been presented by Bondorf, Consel, Bondorf and Jørgensen, and Schwartzbach and the present author. The analyses are formulated in radically different ways, making comparison non-trivial. In this paper we demonstrate how to compare such analyses. We prove that the first and the fourth analyses can be specified by constraint systems of a particular form, enabling direct comparison. As corollaries, we get that Bondorf's analysis is more conservative than ours, that both analyses can be performed in cubic time, and that the core of Bondorf's analysis is correct. Our comparison is of analyses that apply to the pure λ-calculus.

Jens Palsberg
Can you trust your data?

A new program analysis is presented, and two compile-time methods for this analysis are given. The analysis attempts to answer the question: “Given some trustworthy and some untrustworthy input, can we trust the value of a given variable after execution of some code?” The analyses are based on an abstract interpretation framework and a constraint generation framework, respectively. The analyses are proved safe with respect to an instrumented semantics. We explicitly deal with a language with pointers and the ensuing aliasing problems. The constraint-based analysis is related directly to the abstract interpretation, and therefore indirectly to the instrumented semantics.

Peter Ørbæk
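
A minimal sketch of the question the analysis answers, as a toy trust analysis over straight-line assignments (hypothetical code, ignoring the pointers and aliasing that the paper handles):

    # A variable is trusted only if everything its value depends on is trusted.
    def analyze(stmts, trust):
        """stmts: list of (lhs, rhs_vars) assignments, in execution order."""
        for lhs, rhs_vars in stmts:
            trust[lhs] = all(trust.get(v, False) for v in rhs_vars)
        return trust

    env = analyze(
        [("key", ["password", "salt"]), ("banner", ["salt"])],
        {"password": False, "salt": True},  # untrustworthy / trustworthy inputs
    )
    print(env["key"], env["banner"])  # False True: key cannot be trusted
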
Static and dynamic processor allocation for higher-order concurrent languages

Starting from the process algebra for Concurrent ML we develop two program analyses that facilitate the intelligent placement of processes on processors. Both analyses are obtained by augmenting an inference system for counting the number of channels created, the number of input and output operations performed, and the number of processes spawned by the execution of a Concurrent ML program. One analysis provides information useful for making a static decision about processor allocation; to this end it accumulates the communication cost for all processes with the same label. The other analysis provides information useful for making a dynamic decision about processor allocation; to this end it determines the maximum communication cost among processes with the same label. We prove the soundness of the inference system and the two analyses and demonstrate how to implement them; the latter amounts to transforming the syntax-directed inference problems to instances of syntax-free equation solving problems.

Hanne Riis Nielson, Flemming Nielson
Mechanized inductive proof of properties of a simple code optimizer

We demonstrate how mechanical proofs of properties of a simple code generator and a partial evaluator can be done by term rewriting induction. We obtain proofs that the code generator is correct and that the partial evaluator produces equivalent, optimal, shorter code. We treat a case of disequations and show how comparisons can be handled adequately.

Alfons Geser
Describing a Signal Analyzer in the process algebra PMC — A case study

In this paper we take a look at real-time systems from an implementation-oriented perspective. We are interested in the formal description of genuinely distributed systems whose correct functional behaviour depends on real-time constraints. The question of how to combine real-time with distributed processing in a clean and satisfactory way is the object of our investigation. The approach we wish to advance is based on PMC, an asynchronous process algebra with multiple clocks. The keywords here are ‘asynchrony’ as the essential feature of distributed computation and the notion of a ‘clock’ as an elementary real-time mechanism. We base the discussion on an actual industrial product: the Brüel & Kjær 2145 Vehicle Signal Analyzer, an instrument for measuring and analyzing noise generated by cars and other machines with rotating objects. We present an extension of PMC with ML-style value passing and demonstrate its use on a simplified version of the Brüel & Kjær Signal Analyzer.

Henrik Reif Andersen, Michael Mendler
A gentle introduction to specification engineering using a case study in telecommunications

Software development based on formal methods is the only way to obtain provably correct software. This calls for a method for developing complex systems in intuitive steps. A suitable solution is the transformational approach, where verified semantics-preserving transformation rules are used to get from a first verified specification to the desired system. A problem is that for most industrial applications the system development never terminates, because requirements change and new functionalities have to be added to the system. This paper describes a new approach for the development of extensible specifications in small intuitive steps. New transformation rules are introduced that guarantee that intermediate results of the development can be used for further steps.

Stephan Kleuker
Precise interprocedural dataflow analysis with applications to constant propagation

This paper concerns interprocedural dataflow-analysis problems in which the dataflow information at a program point is represented by an environment (i.e., a mapping from symbols to values), and the effect of a program operation is represented by a distributive environment transformer. We present an efficient dynamic-programming algorithm that produces precise solutions. The method is applied to solve precisely and efficiently two (decidable) variants of the interprocedural constant-propagation problem: copy constant propagation and linear constant propagation. The former interprets program statements of the form x := 7 and x := y. The latter also interprets statements of the form x := 5*y + 17.

Mooly Sagiv, Thomas Reps, Susan Horwitz
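
A three-line hypothetical example (not from the paper) of what each variant can conclude:

    x = 7           # copy constant propagation: x is the constant 7
    y = x           # copy: y is the constant 7
    z = 5 * y + 17  # only linear constant propagation interprets this
                    # linear transformer, concluding z == 52 at compile time
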
Formal specification and prototyping of a program specializer

This paper reports on the use of formal specifications in the development of a software maintenance tool for specializing imperative programs, which have become very complex due to extensive modifications. The tool is specified in terms of inference rules and operates by induction on the abstract syntax. The correctness of these rules is proved using rule induction. A Prolog prototype has been derived for Fortran programs, using the Centaur programming environment.

Sandrine Blazy, Philippe Facon
Proving the correctness of recursion-based automatic program transformations

This paper shows how the Improvement Theorem, a semantic condition for the total correctness of program transformation on higher-order functional programs, has practical value in proving the correctness of automatic techniques, including deforestation and supercompilation. This is aided by a novel formulation (and generalisation) of deforestation-like transformations, which also greatly adds to the modularity of the proof with respect to extensions of both the language and the transformation rules.

David Sands
Reactive system specification and refinement

This paper describes formal approaches to reactive and real-time system specification and development, using a process of systematic translation from statechart descriptions of a system into a specification language utilising real time logic (RTL), and refinement within this language. Alternative implementation strategies using synchronisation constraints and synchronisation code are also provided, together with examples of development using the approach. The approach provides a unitary formalism which combines statecharts, RTL and temporal logic. Animation and proof tools are also briefly described.

K. Lano
Measuring concurrency of regular distributed computations

In this paper we present a concurrency measure that is especially adapted to distributed programs that exhibit regular run-time behaviours, including many programs that are obtained by automatic parallelization of sequential code. This measure is based on the antichain lattice of the partial order that models the distributed execution under consideration. We show the conditions under which the measure is computable on an infinite execution that is the repetition of a finite pattern. There, the measure can be computed by considering only a bounded number of patterns, the bound being at most the number of processors.

Cyrille Bareau, Benoit Caillaud, Claude Jard, Rene Thoraval
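
A minimal sketch of the measure's basic ingredient (a hypothetical four-event execution, not one from the paper): the antichains of the happens-before partial order, i.e. the sets of pairwise concurrent events.

    # Brute-force enumeration of antichains of a small partial order.
    from itertools import combinations

    events = ["a1", "a2", "b1", "b2"]
    before = {("a1", "a2"), ("b1", "b2"), ("a1", "b2")}  # transitively closed

    def comparable(x, y):
        return (x, y) in before or (y, x) in before

    antichains = [S for r in range(len(events) + 1)
                  for S in combinations(events, r)
                  if all(not comparable(x, y) for x, y in combinations(S, 2))]
    print(len(antichains))  # 8 antichains, including the empty one
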
Non-speculative and upward invocation of continuations in a parallel language

A method of preserving the sequential semantics in parallel programs with first-class continuations is to invoke continuations non-speculatively. This method, which prevents a continuation from being invoked as long as its invocation can infringe the sequential semantics, reduces parallelism by the severe conditions that it imposes, especially on upward uses. In this paper, we present new conditions for invoking continuations in an upward way and both preserving the sequential semantics and providing parallelism. This new approach is formalised in the PCKS-machine, which is proved to be correct by showing that it has the same observational equivalence theory as the sequential semantics.

Luc Moreau
A model inference system for generic specification with application to code sharing

This paper presents a model inference system to control the instantiation of generic modules. Generic parameters are specified by properties, which represent classes of modules sharing some common features. Just as type checking consists in verifying that an expression is well typed, model checking allows one to detect whether a (possibly generic) instantiation of a generic module is valid, i.e. whether the instantiating module is a model of the parameterizing property. Equality of instances can be derived from a canonical representation of modules. Finally, we show how the code of generic modules can be shared among all instances of modules.

Didier Bert, Catherine Oriat
Relations as abstract datatypes: An institution to specify relations between algebras

One way to view the execution state of an imperative program is as a many-sorted algebra. Program variables are (constant) functions and their types are sorts. The execution of a program defines a relation between the state of the program (algebra) before and after the execution of the program. In this paper we define an institution for the specification of relations between structures of some base institution (e.g., the institution of equational logic or first-order predicate logic). Sets of structures over a common signature, i.e. abstract datatypes, in this institution denote relations between structures of the base institution. This makes it possible to apply a rich repertoire of existing techniques for specifying abstract datatypes to the specification of relations. This paper tries to narrow the gap between algebraic specification languages like Clear, ASL or Act-One and model-theoretic specification languages like Z, VDM-SL or the Larch Interface Language.

Hubert Baumeister
Performance-oriented formal specifications — the LotoTis approach

The paper presents LotoTis, a performance-oriented extension of Lotos. LotoTis allows us to specify performance-oriented behavior via quantified time, quantified nondeterminism, quantified parallelism, and action monitoring. It offers a set of refinement rules from Lotos to LotoTis, and thereby supports the standards-conformant development of performance-oriented specifications from existing Lotos specifications.

Ina Schieferdecker
Signal: A formal design environment for real-time systems
EP-ATR Project
The META-Frame: An environment for flexible tool management
Bernhard Steffen, Tiziana Margaria, Andreas Claßen
STeP: The Stanford Temporal Prover
Zohar Manna, Nikolaj Bjørner, Anca Browne, Edward Chang, Michael Colón, Luca de Alfaro, Harish Devarajan, Arjun Kapur, Jaejin Lee, Henny Sipma, Tomás Uribe
The HOL-UNITY verification system

The HOL-UNITY verification system consists of a collection of tools for specifying and verifying UNITY programs and their properties. All the tools interface with the theorem prover HOL for proving the properties of UNITY programs. In this way HOL-UNITY supports the mechanised proving of correctness for parallel programs.

Flemming Andersen, Ulla Binau, Karsten Nyblad, Kim Dam Petersen, Jimmi S. Pettersson
PLATO: A tool to assist programming as term rewriting and theorem proving
Andre J. Sampaio, Armando M. Haeberer, Claudio T. Prates, Cristina D. Ururahy, Marcelo F. Frias, Neudson C. Albuquerque
LOFT: A tool for assisting selection of test data sets from algebraic specifications
Bruno Marre
The SMoLCS ToolSet
Egidio Astesiano, Gianna Reggio, Franco Morando
The Asf+Sdf Meta-environment: documentation tools for free
M. G. J. van den Brand, E. Visser
The B-Toolkit demonstration
B-Core (UK) Ltd.
Object Oriented Semantics Directed Compiler Generation: A prototype
Luiz Carlos Castro Guedes, Edward Hermann Haeusler, José Lucas Rangel
Backmatter
Metadata
Titel
TAPSOFT '95: Theory and Practice of Software Development
Edited by
Peter D. Mosses
Mogens Nielsen
Michael I. Schwartzbach
Copyright year
1995
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-49233-7
Print ISBN
978-3-540-59293-8
DOI
https://doi.org/10.1007/3-540-59293-8