
About this Book

The Marktoberdorf Summer Schools on Informatics were started in 1970, with the intention of convening every second or third year a group of top researchers in computing, devoted to presenting their most recent results to an elite of advanced students - young and most promising people - and prepared to stand their questions, criticism and suggestions. The themes of these Advanced Study Institutes under the sponsorship of the NATO Scientific Affairs Division varied slightly over the years, oscillating more or less around Programming Methodology, as the following list shows:

1970 Data Structures and Computer Systems
1971 Program Structures and Fundamental Concepts of Programming
1973 Structured Programming and Programmed Structures
1975 Language Hierarchies and Interfaces
1978 Program Construction
1981 Theoretical Foundations of Programming Methodology
1984 Control Flow and Data Flow: Concepts of Distributed Programming
1986 Logic of Programming and Calculi of Discrete Design
1988 Constructive Methods in Computing Science
1989 Logic, Algebra, and Computation

Logic, Algebra, and Computation is the theme of the summer school to which this volume is devoted. It is the tenth in succession, but it is also the first in a new series (the "blue" series) that is intended to alternate in future with the traditional arrangement (the "red" series); in fact the tenth summer school in the "red" series, with the title "Programming and Mathematical Method" and held in 1990, was the occasion for celebrating both its serial number and the twenty years of Marktoberdorf Summer Schools altogether.



Theory of Computation and Specification over Abstract Data Types, and its Applications

The theory of computable functions on abstract data types is outlined. The problems of extending the theory to establish the scope and limits of the specification and verification of functions are described. Applications of this general mathematical theory to hardware design, program verification and logic programming are discussed.
J. V. Tucker

Fundamentals of Deductive Program Synthesis

This is an introduction to program synthesis, the derivation of a program to meet a given specification. It focuses on the deductive approach, in which the derivation task is regarded as a problem of proving a mathematical theorem.
Zohar Manna, Richard Waldinger

Notes on resolution

These notes are a revised and extended version of [18]. They offer a brief but reasonably complete account of the main ideas underlying the resolution formulation of first order predicate logic, with computational issues receiving the lion’s share of attention. The unification algorithm is discussed in considerable detail, with logical formulas regarded simply as certain data structures. The aim is to explain resolution in general in such a way that the important special case of Horn clause resolution can be properly understood within the broader setting. Horn clause resolution is the theoretical framework for the kind of logic programming which is done by users of PROLOG. However, although many of the ideas we shall discuss are concretely realized in various versions of that programming language, we shall not be explicitly concerned with it. Surface details differ markedly from version to version and often obscure the relatively simple underlying conceptual system. We therefore try to present that system directly.
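The notes treat logical formulas simply as data structures, with the unification algorithm at the centre. As a minimal sketch of that idea (the term representation and function names below are our own illustrative assumptions, not Robinson's notation): variables are capitalised strings, and compound terms are `(functor, args)` tuples.

```python
# A sketch of first-order unification over terms-as-data-structures.
# Variables: strings starting with an uppercase letter.
# Compound terms and constants: (functor, [args]) tuples, e.g. ("a", []).

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings until a non-variable or an unbound variable.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    # Occurs check: does variable v appear in term t under subst?
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, subst) for a in t[1])
    return False

def unify(s, t, subst=None):
    # Return a most general unifier (as a dict) or None on failure.
    if subst is None:
        subst = {}
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if is_var(s):
        if occurs(s, t, subst):
            return None  # no finite unifier exists
        return {**subst, s: t}
    if is_var(t):
        return unify(t, s, subst)
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s[1]) == len(t[1])):
        for a, b in zip(s[1], t[1]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# Unify f(X, g(a)) with f(b, g(Y)): yields {X: b, Y: a}.
mgu = unify(("f", ["X", ("g", [("a", [])])]),
            ("f", [("b", []), ("g", ["Y"])]))
```

Horn clause resolution, as used in PROLOG, is then one resolution step driven by exactly this operation on the selected literal.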
J. A. Robinson

Introduction to Pure λ-Calculus

These course notes introduce pure λ-calculus, the basic formal system underlying functional programming languages. Section 1 discusses notations for functions in mathematics and introduces Church’s λ-notation. Section 2 introduces de Bruijn’s abstract syntax for λ-terms, and gives the basic definitions constructively as executable CAML programs. Section 3 introduces the notions of a redex and of β-reduction, and gives a table of some commonly used notations for λ-terms. Section 4 gives a proof, using Tait’s technique of parallel reduction, that β-reduction satisfies the diamond property; the Church-Rosser theorem follows as a corollary. A systematic proof method based on annotated terms is introduced here. Section 5 proves the theorem of finite developments, and Section 6 proves its corollary, the standardisation theorem, with a method due to Klop. Section 7 introduces the notion of a Böhm tree. Section 8 presents some variations of the theory of β-reduction of pure λ-K-calculus. Section 9 proves Böhm’s theorem and discusses its semantic consequences. Section 10 contains a short guide to the literature.
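The de Bruijn-style definitions of Sections 2 and 3 can be rendered executably, as the notes do in CAML. The following is a toy rendering in Python under our own assumed constructor names (`Var`, `Lam`, `App`): variables are nameless indices, and one-step β-reduction contracts a root redex by substitution with index shifting.

```python
# De Bruijn terms: Var(n) is the variable bound by the n-th enclosing λ,
# Lam(body) is an abstraction, App(f, a) an application.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    n: int

@dataclass(frozen=True)
class Lam:
    body: object

@dataclass(frozen=True)
class App:
    f: object
    a: object

def shift(t, d, cutoff=0):
    # Add d to every free variable index (those >= cutoff).
    if isinstance(t, Var):
        return Var(t.n + d) if t.n >= cutoff else t
    if isinstance(t, Lam):
        return Lam(shift(t.body, d, cutoff + 1))
    return App(shift(t.f, d, cutoff), shift(t.a, d, cutoff))

def subst(t, j, s):
    # Replace variable j in t by s, adjusting indices under binders.
    if isinstance(t, Var):
        return s if t.n == j else t
    if isinstance(t, Lam):
        return Lam(subst(t.body, j + 1, shift(s, 1)))
    return App(subst(t.f, j, s), subst(t.a, j, s))

def beta(t):
    # Contract the β-redex (λ.b) a at the root, if present.
    if isinstance(t, App) and isinstance(t.f, Lam):
        return shift(subst(t.f.body, 0, shift(t.a, 1)), -1)
    return t

# Example: the identity λ.0 applied to a free variable reduces to it.
assert beta(App(Lam(Var(0)), Var(3))) == Var(3)
```

The confluence and standardisation results of Sections 4 to 6 are statements about the relation generated by closing `beta` under arbitrary subterm positions.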
Gérard Huet


The aim of this paper is to present a central technique from proof theory, Gentzen’s normalization for natural deduction systems, and to discuss some of its applications.
Helmut Schwichtenberg

Computability — Logical and Recursive Complexity

The basis of this short course is the strong analogy between programs and proofs (of their specifications). The main theme is the classification of computable number-theoretic functions according to the logical complexity of their formal specification or termination proofs. A significant sub-branch of mathematical logic has grown around this theme since the 1950’s and new ideas are presently giving rise to further developments. The methods employed are chiefly those from proof theory, particularly “normalization” as presented in the accompanying lectures of Helmut Schwichtenberg, and “ordinal assignments”. Since program-termination corresponds to well-foundedness of computation trees, it is hardly surprising that transfinite ordinals and their constructive representations play a crucial role, measuring the logical complexity of programs and of the functions which they compute.
Stanley S. Wainer

Reflecting the Open-Ended Computation System of Constructive Type Theory

The computation system of constructive type theory is open-ended so that theorems about computation will hold for a broad class of extensions to the system. We show that despite this openness it is possible to completely reflect the computation system into the language in a clear way by adding simple primitive concepts that anticipate the reflection. This work provides a hook for developing methods to modify the built-in evaluator and to treat the issues of intensionality and computational complexity in programming logics and provides a basis for reflecting the deductive apparatus of type theory.
Robert L. Constable, Stuart F. Allen, Douglas J. Howe

Some Lectures on Modal Logic

An exposition of some modal logics useful for teaching and research in computer science. §1. Preface §2. Propositional modal logic §3. Modal frames §4. Propositional tableaux §5. Modal axioms and their frame semantics §6. Modal predicate tableaux with constant domains §7. Autoepistemic logic §8. Nonmonotonic reasoning §9. Classical concurrent dynamic logic §10. Intuitionistic concurrent dynamic logic.
Anil Nerode

Formal Approaches to Concurrency

An introduction to the two main approaches to the theory of concurrent distributed systems is given: the Milner/Hoare theory of CCS and TCSP, and the theory of Petri nets. We define a general abstract programming language GAP which encompasses practically all aspects of CCS, TCSP and related formalisms, and give a step failures semantics for it. We introduce place/transition nets and show how GAP can be modelled by Petri nets. Moreover, we report on other semantic notions for abstract programming languages and Petri nets, and on techniques for the modular construction and refinement of Petri nets.
Wilfried Brauer

The Family of Concurrent Logic Programming Languages

Concurrent logic languages are high-level programming languages for parallel and distributed systems that offer a wide range of both known and novel concurrent programming techniques. Being logic programming languages, they preserve many advantages of the abstract logic programming model, including the logical reading of programs and computations, the convenience of representing data-structures with logical terms and manipulating them using unification, and the amenability to meta-programming. Operationally, their model of computation consists of a dynamic set of concurrent processes, communicating by instantiating shared logical variables, synchronizing by waiting for variables to be instantiated, and making nondeterministic choices, possibly based on the availability of values of variables.
This paper surveys the family of concurrent logic programming languages within a uniform operational framework. It demonstrates the expressive power of even the simplest language in the family, and investigates how varying the basic synchronization and control constructs affect the expressiveness and efficiency of the resulting languages.
In addition, the paper reports on techniques for sequential and parallel implementation of languages in this family, mentions their applications to date, and relates these languages to the abstract logic programming model, to the programming language Prolog, and to other concurrent computational models and programming languages.
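The operational model described above, processes that communicate by instantiating shared logical variables and synchronize by waiting for instantiation, can be caricatured outside any actual concurrent logic language. The sketch below, with our own illustrative name `LVar`, models a shared logical variable as a single-assignment cell on which readers suspend:

```python
# A single-assignment "logical variable": bound at most once; readers
# block until some process instantiates it. Illustrative only - this is
# not the semantics of any particular concurrent logic language.
import threading

class LVar:
    def __init__(self):
        self._bound = threading.Event()
        self._value = None

    def bind(self, value):
        # A logical variable may be instantiated only once.
        if self._bound.is_set():
            raise RuntimeError("variable already instantiated")
        self._value = value
        self._bound.set()

    def read(self):
        self._bound.wait()  # suspend until the variable is instantiated
        return self._value

def producer(x):
    x.bind(42)  # instantiate the shared variable

def consumer(x, out):
    out.append(x.read())  # suspends until x is bound

x, out = LVar(), []
c = threading.Thread(target=consumer, args=(x, out))
p = threading.Thread(target=producer, args=(x,))
c.start(); p.start()
c.join(); p.join()
# out == [42]
```

Nondeterministic choice in the languages surveyed corresponds roughly to committing to whichever guard's variables become instantiated first.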
Ehud Shapiro

