
1999 | Book

Logic-Based Program Synthesis and Transformation

8th International Workshop, LOPSTR’98 Manchester, UK, June 15–19, 1998 Selected Papers

Edited by: Pierre Flener

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science

About this book

This book presents the thoroughly refereed post-workshop proceedings of the 8th International Workshop on Logic-Based Program Synthesis and Transformation, LOPSTR'98, held in Manchester, UK, in June 1998. The 16 revised full papers presented were carefully reviewed and selected, during three rounds of inspection, from an initial total of 36 extended abstracts submitted. Also included are eight short papers. Among the topics covered are logic specification, mathematical program construction, logic programming, computational logics, inductive program synthesis, constraint logic programs, and mathematical foundations.

Table of contents

Frontmatter
Attempto Controlled English — Not Just Another Logic Specification Language
Abstract
The specification language Attempto Controlled English (ACE) is a controlled natural language, i.e. a subset of standard English with a domain-specific vocabulary and a restricted grammar. The restriction of full natural language to a controlled subset is essential for ACE to be suitable for specification purposes. The main goals of this restriction are to reduce ambiguity and vagueness inherent in full natural language and to make ACE computer processable. ACE specifications can be unambiguously translated into logic specification languages, and can be queried and executed. In brief, ACE allows domain specialists to express specifications in familiar natural language and combine this with the rigour of formal specification languages.
Norbert E. Fuchs, Uta Schwertel, Rolf Schwitter
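As an illustration of the translation idea in the abstract above, here is a toy sketch in Python; the sentence pattern, predicate names, and output syntax are invented for illustration and are not the actual ACE grammar or its logical output.

    import re

    def translate(sentence):
        """Translate one fixed controlled-English pattern into a
        first-order formula (toy illustration only)."""
        m = re.fullmatch(r"Every (\w+) (\w+)s an? (\w+)\.", sentence)
        if not m:
            raise ValueError("sentence is outside the controlled subset")
        noun1, verb, noun2 = m.groups()
        return f"forall X ({noun1}(X) -> exists Y ({noun2}(Y) & {verb}(X,Y)))"

    # "Every customer enters a card." =>
    # forall X (customer(X) -> exists Y (card(Y) & enter(X,Y)))
    print(translate("Every customer enters a card."))

Restricting the grammar is what makes such an unambiguous mapping possible; the full ACE system of course covers far more than one sentence pattern.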
A Step Towards a Methodology for Mercury Program Construction: A Declarative Semantics for Mercury
Abstract
Declarative methodologies for logic program construction have been proposed for Prolog. They roughly consist of 1) building a purely logical version of the program based on a clear declarative semantics and 2) performing a number of checks about modes, types, termination, and multiplicity. We plan to define a similar methodology for Mercury. This choice is motivated by the fact that types, modes, and multiplicity must be explicitly specified in Mercury, allowing the compiler to perform the second step above. In order to propose a methodology for performing the first step, we need a declarative semantics for Mercury, which has not yet been explicitly defined. The goal of this paper is to propose such a semantics, aiming at simplicity and naturalness. We chose to define the semantics with respect to a unique interpretation domain, called the “universe”, which is a kind of higher-order version of the Herbrand universe. Based on this simple domain, the denotation of terms and goals is naturally defined, as are the models of programs. Although the declarative semantics is primarily introduced to improve “manual” program construction by programmers, it could also be used in a synthesis context.
Dante Baldan, Baudouin Le Charlier, Christophe Leclère, Isabelle Pollet
Pragmatics in the Synthesis of Logic Programs
Abstract
Many of the systems which we, and those who have worked with us, have built were intended to make it easier for people with particular backgrounds to construct and understand logic programs. A major issue when designing this sort of system is pragmatics: from the many logically equivalent ways of describing a program we must identify styles of description which make particular tasks easier to support. The first half of this paper describes three ways in which we have attempted to understand the pragmatics of particular domains using well known methods from computational logic. These are: design using parameterisable components; synthesis by incremental addition of program slices; and meta-interpretation. These are helpful in structuring designs but do not necessarily provide guidance in design lifecycles, where less detailed designs are used to guide the description of more detailed designs. The second half of this paper summarises an example of this form of guidance.
David Robertson, Jaume Agustí
Using Decision Procedures to Accelerate Domain-Specific Deductive Synthesis Systems
Abstract
This paper describes a class of decision procedures that we have found useful for efficient, domain-specific deductive synthesis, and a method for integrating this type of procedure into a general-purpose refutation-based theorem prover. We suggest that this is a large and interesting class of procedures and show how to integrate these procedures to accelerate a general-purpose theorem prover doing deductive synthesis. While much existing research on decision procedures has been either in isolation or in the context of interfacing procedures to non-refutation-based theorem provers, this appears to be the first reported work on decision procedures in the context of refutation-based deductive synthesis, where witnesses must be found.
Jeffrey Van Baalen, Steven Roach
Synthesis of Programs in Abstract Data Types
Abstract
In this paper we propose a method for program synthesis from constructive proofs based on a particular proof strategy that we call dischargeable set construction. This proof strategy allows us to build proofs in which active patterns (sequences of applications of rules with proper computational content) can be distinguished from correctness patterns (concerning correctness properties of the algorithm implicitly contained in the proof). The synthesis method associates with every active pattern of the proof a program schema (in an imperative language) translating only the computational content of the proof. One of the main features of our method is that it can be applied to a variety of theories formalizing ADTs and classes of ADTs. Here we discuss the method and the computational content of some principles of particular interest in the context of some classes of ADTs.
Alessandro Avellone, Mauro Ferrari, Pierangelo Miglioli
OOD Frameworks in Component-Based Software Development in Computational Logic
Abstract
Current Object-oriented Design (OOD) methodologies tend to focus on objects as the unit of reuse, but it is increasingly recognised that frameworks, or groups of interacting objects, are a better unit of reuse. Thus, in next-generation Component-based Development (CBD) methodologies, we can expect components to be frameworks rather than objects. In this paper, we describe a preliminary attempt at a formal semantics for OOD frameworks in CBD in computational logic.
Kung-Kiu Lau, Mario Ornaghi
The Use of Renaming in Composing General Programs
Abstract
Most modern computing systems consist of large numbers of software components that interact with each other. Correspondingly, the capability of re-using and composing existing software components is of primary importance in this scenario. In this paper we analyse the role of renaming as a key ingredient of component-based programming. More precisely, a meta-level renaming operation is introduced in the context of a logic-based program composition setting which features a number of other composition operations over general logic programs, that is, logic programs possibly containing negative premises. Several examples are presented to illustrate the increased knowledge representation capabilities of logic programming for non-monotonic reasoning. The semantics of programs and program compositions is defined in terms of three-valued logic by extending the three-valued semantics for logic programs proposed by Fitting [10]. A computational interpretation of program compositions is formalised by means of an equivalence preserving transformation of arbitrary program compositions into standard general programs.
Antonio Brogi, Simone Contiero, Franco Turini
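A minimal sketch of the renaming idea from the abstract above, in Python; the clause encoding and the example predicates are assumptions for illustration, not the paper's meta-level operations or semantics.

    def rename_atom(atom, mapping):
        """Atom = (predicate, args); rename its predicate if mapped."""
        pred, args = atom
        return (mapping.get(pred, pred), args)

    def rename_program(program, mapping):
        """Apply a predicate renaming to every head and body atom."""
        return [(rename_atom(head, mapping),
                 [rename_atom(b, mapping) for b in body])
                for head, body in program]

    # Hypothetical example: two modules both export result/1; renaming
    # them apart lets their union compose without unintended clashes.
    p = [(("result", ("X",)), [("quick_sort", ("X",))])]
    q = [(("result", ("X",)), [("merge_sort", ("X",))])]
    composed = (rename_program(p, {"result": "result_quick"})
                + rename_program(q, {"result": "result_merge"}))
    print(composed)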
Inductive Synthesis of Logic Programs by Composition of Combinatory Program Schemes
Abstract
Based on a variable-free combinatory form of definite clause logic programs, we outline a methodology and a supporting program environment, CombInduce, for inducing well-moded logic programs from examples. The combinators comprise fold combinators for recursion on lists. The combinator form is distinguished by enabling piecewise composition of semantically meaningful program elements according to the compositional semantics principle. The principle of combining programs from combinators admits induction of programs without appealing to most specific generalization or predicate invention, in contrast to prevailing ILP approaches. Moreover, the combinator form avoids confusing object variables and metavariables in the applied metalogic program environment. In addition, useful algebraic rewriting rules can be formulated conveniently with the combinators.
Andreas Hamfelt, Jørgen Fischer Nilsson
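A small sketch of the combinator idea in Python (functional stand-ins, not the CombInduce environment itself): relations on lists are assembled from a fold combinator rather than written with explicit recursion, so an inducer can search over compositions of such pieces.

    from functools import reduce

    def foldr(f, unit, xs):
        """Right fold: the single recursion combinator used below."""
        return reduce(lambda acc, x: f(x, acc), reversed(xs), unit)

    # 'length' and 'append' as fold instances: no named recursion, and
    # pieces compose while preserving meaning (compositional semantics).
    length = lambda xs: foldr(lambda _, n: n + 1, 0, xs)
    append = lambda xs, ys: foldr(lambda x, acc: [x] + acc, ys, xs)

    assert length([1, 2, 3]) == 3
    assert append([1, 2], [3]) == [1, 2, 3]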
Specialising Logic Programs with Respect to Call/Post Specifications
Abstract
In this paper we present a program specialisation method which, given a call/post specification, transforms a logic program into a weakly call-correct one satisfying the post-condition. The specialisation is applied to specialised partially correct programs. This notion is based on the definition of a specialised derivation, which is intended to describe program behaviour whenever some properties of procedure calls are assumed. Top-down and bottom-up semantics of specialised derivations are recalled.
Annalisa Bossi, Sabina Rossi
Generalization in Hierarchies of Online Program Specialization Systems
Abstract
In recent work, we proposed a simple functional language S-graph-n to study metaprogramming aspects of self-applicable online program specialization. The primitives of the language provide support for multiple encodings of programs. An important component of online program specialization is the termination strategy. In this paper we show that such a representation has the great advantage of simplifying generalization of multiply encoded data. After developing and formalizing the basic metaprogramming concepts, we extend two basic methods to multiply encoded data: most specific generalization and the homeomorphic embedding relation. Examples and experiments with the initial design of an online specializer illustrate their use in hierarchies of online program specializers.
Robert Glück, John Hatcliff, Jesper Jørgensen
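For reference, a sketch of plain most specific generalization on first-order terms in Python (the standard msg only; the paper's treatment of multiply encoded data is not reproduced here).

    import itertools

    _fresh = itertools.count()

    def msg(s, t, seen=None):
        """Most specific generalization: equal structure is kept,
        mismatches become variables (same mismatch, same variable).
        Terms: strings are constants, (functor, args) are compound."""
        seen = {} if seen is None else seen
        if (isinstance(s, tuple) and isinstance(t, tuple)
                and s[0] == t[0] and len(s[1]) == len(t[1])):
            return (s[0], tuple(msg(a, b, seen)
                                for a, b in zip(s[1], t[1])))
        if s == t:
            return s
        if (s, t) not in seen:
            seen[(s, t)] = f"V{next(_fresh)}"
        return seen[(s, t)]

    # f(a, g(a)) and f(b, g(b)) generalize to f(V0, g(V0)).
    print(msg(("f", ("a", ("g", ("a",)))),
              ("f", ("b", ("g", ("b",))))))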
Improving Homeomorphic Embedding for Online Termination
Abstract
Well-quasi orders in general, and homeomorphic embedding in particular, have gained popularity as a means of ensuring the online termination of program analysis, specialisation, and transformation techniques. It has recently been shown that the homeomorphic embedding relation is strictly more powerful than a large class of involved well-founded approaches. In this paper we provide some additional investigations of the power of homeomorphic embedding. However, we also illustrate that the homeomorphic embedding relation suffers from several inadequacies in contexts where logical variables arise. We therefore present new, extended homeomorphic embedding relations to remedy this problem.
Michael Leuschel
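A sketch of the pure homeomorphic embedding test in Python (first-order terms only; the extended relations for logical variables proposed in the paper are not reproduced here).

    def embedded(s, t):
        """True iff s is homeomorphically embedded in t.
        Terms: strings are constants, (functor, args) are compound."""
        # Diving: s embeds in some argument of t.
        if isinstance(t, tuple) and any(embedded(s, a) for a in t[1]):
            return True
        # Coupling: same functor and arity, argumentwise embedding.
        if (isinstance(s, tuple) and isinstance(t, tuple)
                and s[0] == t[0] and len(s[1]) == len(t[1])):
            return all(embedded(a, b) for a, b in zip(s[1], t[1]))
        return s == t

    # p(a) embeds in p(f(a)): such a growing term is the warning sign an
    # online technique uses to stop unfolding and ensure termination.
    assert embedded(("p", ("a",)), ("p", (("f", ("a",)),)))
    assert not embedded(("p", (("f", ("a",)),)), ("p", ("a",)))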
Successes in Logic Programs
Abstract
In this paper we study how to verify that a pure Prolog program has solutions for a given query. The detailed analysis of the failure/success behaviour of a program is necessary when dealing with transformation and verification of pure Prolog programs. In a previous work [10] we defined the class of noFD programs and queries, which are characterized statically. We proved that a noFD query cannot have finitely failing derivations in a noFD program. Now, by introducing the concept of a set of exhaustive tests, we define the larger class of successful predicates. We prove that a terminating noFD query for successful predicates has at least one successful derivation. Moreover, we propose some techniques based on program transformations for simplifying the verification of the successful condition.
Annalisa Bossi, Nicoletta Cocco
Inferring and Compiling Termination for Constraint Logic Programs
Abstract
This paper presents an automated method that deals with termination of constraint logic programs in two steps. First, from the text of a program, the method infers a set of potentially terminating classes using abstract interpretation and Boolean mu-calculus. By “potentially”, we roughly mean that for each of these classes, one can find a static order over the literals of the clauses of the program to ensure termination. Then, given a terminating class among those computed in the first step, the second step consists of a “compilation” (or transformation) of the original program into another one by reordering literals. For this new program, the universal left-termination of any query of the considered class is guaranteed. The method has been implemented.
Sébastien Hoarau, Fred Mesnard
Strictness Analysis as Finite-Domain Constraint Solving
Abstract
It has become popular to express dataflow analyses in logical form. In this paper we investigate a new approach to the analysis of functional programs, based on synthesis of constraint logic programs. We sketch how the language Toupie, originally designed with logic program analysis as one objective, lends itself also to sophisticated strictness analysis. Strictness analysis is straightforward in the simplest case, that of analysing a first-order functional language using just two strictness values, namely divergence and “don’t know”. Mycroft’s classical translation immediately yields perfectly valid Boolean constraint logic programs, which, when run, provide the desired strictness information. However, more sophisticated analysis requires more complex domains of strictness values. We recast Wadler’s classical analysis over a 2n-point domain as finite-domain constraint solving. This approach has several advantages. First, the translation is relatively simple. We translate a recursive function definition into one or two constraint program clauses, in a manner which naturally extends Mycroft’s translation for the 2-point case, whereas the classical approach translates the definition of an n-place function over lists into 4n mutually recursive equations. Second, the resulting program produces relational information, allowing one, for example, to ask which combinations of properties of the input will produce a given output. Third, the approach allows us to leverage established technology for solving finite-domain constraints as well as for finding fixed points. Finally, the use of (disjunctive) constraints can yield higher precision in the analysis of some programs.
Tihomir Gabrić, Kevin Glynn, Harald Søndergaard
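As a tiny worked instance of the 2-point case mentioned above (an assumed example in Python, with a naive fixed-point loop standing in for the Boolean constraint solver):

    # Analyse f(x, y) = if x == 0 then y else f(x - 1, y).
    # Abstract value True means "definitely diverges"; the abstract 'if'
    # diverges when the condition does, or when both branches do.
    def f_abs(x_div, y_div, f_prev):
        return x_div or (y_div and f_prev(x_div, y_div))

    # Kleene iteration from the bottom of the domain (diverges everywhere);
    # on this finite domain the iterates reach the least fixed point.
    f_cur = lambda x, y: True
    for _ in range(3):
        f_cur = lambda x, y, p=f_cur: f_abs(x, y, p)

    print(f_cur(True, False))   # True: f is strict in x
    print(f_cur(False, True))   # True: f is strict in y
    print(f_cur(False, False))  # False: defined inputs may terminate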
Invariant Discovery via Failed Proof Attempts
Abstract
We present a framework for automating the discovery of loop invariants based upon failed proof attempts. The discovery of suitable loop invariants represents a bottleneck for automatic verification of imperative programs. Using the proof planning framework we reconstruct standard heuristics for developing invariants. We relate these heuristics to the analysis of failed proof attempts allowing us to discover invariants through a process of refinement.
Jamie Stark, Andrew Ireland
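To make the target of such heuristics concrete, here is a standard textbook-style example in Python (an assumption for illustration, not the paper's proof-planning machinery), with the loop invariant written as a runtime assertion:

    def checked_sum(xs):
        """Sum a list while checking the loop invariant at each iteration."""
        total, i = 0, 0
        while i < len(xs):
            assert total == sum(xs[:i])  # invariant: total is the prefix sum
            total += xs[i]
            i += 1
        # Invariant plus exit condition (i == len(xs)) give the postcondition.
        assert total == sum(xs)
        return total

    assert checked_sum([3, 1, 4]) == 8

Finding an invariant like `total == sum(xs[:i])` automatically is exactly the bottleneck the paper's failed-proof analysis is aimed at.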
Preventing Instantiation Errors and Loops for Logic Programs with Multiple Modes Using block Declarations
Abstract
This paper presents several verification methods for logic programs with delay declarations. It is shown how type and instantiation errors related to built-ins can be prevented, and how termination can be ensured. Three features are distinctive of this work: it is assumed that predicates can be used in several modes; it is shown that block declarations, which are a very simple delay construct, are sufficient to ensure the desired properties; the selection rule is taken into account, assuming it to be the rule of most Prolog implementations. The methods can be used both to verify existing programs and to assist in writing new programs.
Jan-Georg Smaus, Pat Hill, Andy King
Algorithms for Synthesizing Reactive Systems: A Perspective
Abstract
Starting in the early 1980s, a number of algorithmic techniques have been proposed for synthesizing the synchronization kernel of reactive systems from high-level temporal logic specifications. These techniques are based on nontrivial results in logic and automata theory and appear to be quite powerful. Nevertheless, essentially none of this work has found its way to practical use. This paper reviews the main results from this area and tries to identify the main reasons that explain their nonexploitation.
Pierre Wolper
Schema-Guided Synthesis of CLP Programs
Abstract
This work is inspired by D.R. Smith’s research on synthesising global search (GS) programs (in the Refine language) from first-order logic specifications (also in Refine) [8,9,10]. We concentrate on synthesising constraint logic programs (CLP) [6] instead. We thus only have to synthesise code that (incrementally) poses the constraints, because the actual constraint propagation and pruning are performed by the CLP system. Here we tackle only the family of decision assignment problems; the families of optimisation assignment problems, decision permutation problems, and optimisation permutation problems are covered in [4].
Hamza Zidoum, Pierre Flener, Brahim Hnich
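A small Python sketch of the division of labour described above (a hypothetical graph-colouring instance; naive enumeration stands in for a real CLP(FD) solver): the synthesised code only poses constraints, while search and pruning belong to the solver.

    import itertools

    def solve(variables, domain, constraints):
        """Find an assignment satisfying all posed constraints."""
        for values in itertools.product(domain, repeat=len(variables)):
            assignment = dict(zip(variables, values))
            if all(c(assignment) for c in constraints):
                return assignment
        return None

    # Decision assignment problem: colour three mutually adjacent regions.
    neighbours = [("A", "B"), ("B", "C"), ("A", "C")]
    constraints = [lambda a, x=x, y=y: a[x] != a[y] for x, y in neighbours]
    print(solve(["A", "B", "C"], ["red", "green", "blue"], constraints))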
Abstract: Proof Planning with Program Schemas
Abstract
Schema-based program synthesis and transformation techniques tend to be either pragmatic, designed for carrying out real program transformation or synthesis operations but lacking the logical basis needed to ensure the correctness of the programs they synthesise or transform, or rigorous, with strong theoretical foundations but generating proof obligations which are difficult to satisfy.
Julian Richardson
Logical Synthesis of Imperative O.O. Programs
Abstract
The Ω logic is designed to specify, to reason about, and to synthesize imperative programs in an object-oriented language, namely C++. There exist many systems in which computing-by-proof is possible, but they are based on more or less classical logics and produce λ-expressions. We make a distinction between programming and computing. Functional programs obviously fall into the second category. Even if imperative programs can be modelled with functions, thanks to denotational semantics, this is not realistic. We are interested in the first category: we want to synthesize programs that perform actions.
Patrick Bellot, Bernard Robinet
Mathematical Foundations for Program Transformations
Abstract
Traditional programming paradigms revolve around mapping a single requirements specification into a program. As less and less software is developed from scratch, and more and more is developed from existing software artifacts, this traditional paradigm is growing less and less predominant. Paradigms that are gaining ground include:
  • Program adaptation, whereby a program is modified to satisfy a specification that it does not originally satisfy; this occurs in adaptive software maintenance and in white box software reuse.
  • Software Incrementation, whereby a program is modified to have an additional functional feature, while preserving its current functional properties.
  • Software Merging, whereby two (typically) similar software products are merged into a single product that cumulates the functional features of each; this arises typically when the two products are obtained by software incrementation from a common base version.
  • Software Composition, whereby various existing reusable assets are composed together (by means of application-specific code) to produce a prespecified software application; this arises in component-based software development.
R. Ben Ayed, J. Desharnais, M. Frappier, A. Mili
An Exhaustive-Search Method Using Layered Streams Obtained Through a Meta-Interpreter for Chain Programs
Abstract
Okumura and Matsumoto have published examples [1] of logic programs using a data structure they call a “layered stream,” especially suited for completely traversing a search space in a deterministic manner [2,4]. A layered stream is a representation of a list of lists in which common heads of adjacent sublists have been factored. This factorization allows the pruning of several branches of the search space in constant time and is also a source of parallelism [3, p. 147]. The published examples include the N-queens problem and “instant insanity” (a precursor of Rubik’s cube). Each such deterministic, layered-stream program supposedly performs exhaustive search over the space generated by some nondeterministic, naive program. However, a method for converting a nondeterministic program into a corresponding exhaustive-search, layered-stream version has not yet been proposed, as far as we know. Layered streams have thus remained difficult to use [3, p. 408], since the programmer must be concerned both with the factorization of heads and with exhaustive search. We build upon the work of Okumura and Matsumoto by showing how to translate nondeterministic programs into exhaustive-search, layered-stream form. Our method is restricted to generate-and-test programs fitting a certain schema.
David A. Rosenblueth
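A sketch of the data structure itself in Python (an assumed encoding for illustration; the paper's translation method is not reproduced): factoring shared heads means pruning one head discards every solution that extends it at once.

    def layer(lists):
        """List of equal-length lists -> layered stream
        [(head, layered tails), ...] with common heads factored."""
        if not lists or all(not l for l in lists):
            return []
        groups = {}
        for l in lists:
            groups.setdefault(l[0], []).append(l[1:])
        return [(head, layer(tails)) for head, tails in groups.items()]

    # Three partial solutions, two sharing the prefix [1, 2]:
    print(layer([[1, 2, 3], [1, 2, 4], [2, 3, 5]]))
    # [(1, [(2, [(3, []), (4, [])])]), (2, [(3, [(5, [])])])]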
Bottom-Up Specialisation of Logic Programs
Abstract
Partial deduction is an important transformation technique for logic programs, capable of removing inefficiencies from programs [4,5]. As an on-line specialisation technique, it is based on an evaluation mechanism for logic programs. The input to a typical partial deducer is a program and a partially instantiated query. The instantiated part represents the information with respect to which one would like to specialise; the uninstantiated part represents the information not yet known. Therefore, all classical partial deduction techniques use top-down evaluation (or SLD-resolution) to evaluate the program parts that depend on the known input and generate a new program that computes its result using only the remainder of the input. Since the new program has fewer computations to perform, it will in general be more efficient.
Wim Vanhoof, Danny De Schreye, Bern Martens
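A hedged functional analogue in Python (partial evaluation of a power function with a known exponent, illustrating the principle of consuming the known input at specialisation time; it is not SLD-based partial deduction itself):

    def specialise_power(n):
        """Unfold power(x, n) for a known n; return the residual program."""
        expr = "1"
        for _ in range(n):
            expr = f"x * {expr}"   # one unfolding step per known iteration
        return f"def power_{n}(x):\n    return {expr}"

    code = specialise_power(3)
    print(code)        # residual program: x * x * x * 1, no loop left
    namespace = {}
    exec(code, namespace)
    assert namespace["power_3"](2) == 8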
Myrtle: A Set-Oriented Meta-Interpreter Driven by a “Relational” Trace for Deductive Databases Debugging
Abstract
Deductive databases manage large quantities of data and, in general, do so in a set-oriented way. The existing explanation systems for deductive databases [6,4,1] give information in the form of forests of proof trees. Although proof trees are often useful, this representation is not sufficient. We propose a tracing technique which consists of integrating a “relational” trace with an instrumented meta-interpreter using substitution sets. The relational trace efficiently gives precise information about data extraction from the relational database. The meta-interpreter manages substitution sets and gives explanations of the deduction. The expensive aspects of meta-interpretation are reduced by the use of the trace, which avoids many calculations. The flexibility of meta-interpretation is preserved: it allows different trace profiles to be produced easily.
Sarah Mallet, Mireille Ducassé
Backmatter
Metadata
Title
Logic-Based Program Synthesis and Transformation
Edited by
Pierre Flener
Copyright year
1999
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-48958-0
Print ISBN
978-3-540-65765-1
DOI
https://doi.org/10.1007/3-540-48958-4