
2006 | Book

Algebra, Meaning, and Computation

Essays dedicated to Joseph A. Goguen on the Occasion of His 65th Birthday

Edited by: Kokichi Futatsugi, Jean-Pierre Jouannaud, José Meseguer

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

Joseph Goguen is one of the most prominent computer scientists worldwide. His numerous research contributions span many topics and have changed the way we think about many concepts. Our views about data types, programming languages, software specification and verification, computational behavior, logics in computer science, semiotics, interface design, multimedia, and consciousness, to mention just some of the areas, have all been enriched in fundamental ways by his ideas.

This Festschrift volume - published to honor Joseph Goguen on his 65th birthday on June 28, 2006 - includes 32 refereed papers by leading researchers in the different areas spanned by Joseph Goguen's work. The papers address a broad variety of topics: meaning, meta-logic, specification and composition, behavior and formal languages, and models, deduction, and computation.

The papers were presented at a Symposium in San Diego, California, USA in June 2006. Both the Festschrift volume and the Symposium allow the articulation of a retrospective and prospective view of a range of related research topics by key members of the research community in computer science and other fields connected with Joseph Goguen's work.

Table of Contents

Frontmatter

Meaning

Sync or Swarm: Musical Improvisation and the Complex Dynamics of Group Creativity

This essay draws on participant observation, ethnographic interviews, phenomenological inquiry, and recent insights from the study of swarm intelligence and complex networks to illuminate the dynamics of collective musical improvisation. Throughout, it argues for a systems understanding of creativity—a view that takes seriously the notion that group creativity is not simply reducible to individual psychological processes—and it explores interconnections between the realm of musical performance, community activities, and pedagogical practices. Lastly, it offers some reflections on the ontology of art and on the role that music plays in human cognition and evolution, concluding that improvising music together allows participants and listeners to explore complex and emergent forms of social order.

David Borgo
My Friend Joseph Goguen

A personal account of how Joseph Goguen and I came to work together and of the influence that Tibetan Buddhism had on us and on our collaboration. A brief discussion of some neurological experiments using meditators and how Goguen’s work connects Buddhism, computing, and cognition.

Rod Burstall
Metalogic, Qualia, and Identity on Neptune’s Great Moon: Meaning and Mathematics in the Works of Joseph A. Goguen and Samuel R. Delany

The works of Joseph A. Goguen and Samuel R. Delany address a wide array of "big" issues in philosophy: identity and qualitative experience, semiotic representation, and the divergence between meaning in formal systems of understanding and in everyday lived experience. This essay attempts to draw out some of the parallels between the works of these two authors, in particular regarding metalogic, qualia, and identity, using illustrative examples from the works of both authors. Their works exhibit parallel dual strands: (1) a desire to rigorously and precisely map out these fundamental issues, and (2) a desire to acknowledge and embrace the ambiguities of phenomenological experience and its divergence from any formalizable theory. In the end, addressing such a wide range of issues has required both authors to develop and adopt new discourse strategies ranging from rational argumentation to mathematics, and from religious and philosophical commentary to speculative (science) fiction and poetry.

D. Fox Harrell

Meta-Logic

Quantum Institutions

The exogenous approach to enriching any given base logic for probabilistic and quantum reasoning is brought into the realm of institutions. The theory of institutions helps capture the precise relationships between the logics that are obtained; furthermore, it helps analyze some of the key design decisions and opens the way to making the approach more useful and, at the same time, more abstract.

Carlos Caleiro, Paulo Mateus, Amilcar Sernadas, Cristina Sernadas
Jewels of Institution-Independent Model Theory

This paper is dedicated to Joseph Goguen, my beloved teacher and friend, on the occasion of his 65th birthday. It is a survey of institution-independent model theory as it stands today, the true form of abstract model theory, which is based on the concept of an institution. Institution theory was co-fathered by Joseph Goguen and Rod Burstall in the late 1970s. In the final part we discuss some philosophical roots of institution-independent methodologies.

Răzvan Diaconescu
Semantic Web Languages – Towards an Institutional Perspective

The Semantic Web (SW) is viewed as the next generation of the Web that enables intelligent software agents to process and aggregate data autonomously. Ontology languages provide basic vocabularies to semantically mark up data on the SW. The number of SW languages has grown in recent years. These languages, such as RDF, RDF Schema (RDFS), the OWL suite of languages, the OWL−− suite, and SWRL, are based on different semantics, such as RDFS-based, description logic-based, and Datalog-based semantics. The relationship among the various semantics poses a challenge for the SW community in making the languages interoperable. Institutions provide a means of reasoning about software specifications regardless of the logical system, which makes them an ideal candidate for representing and reasoning about the various languages of the Semantic Web. In this paper, we construct institutions for the SW languages and use institution morphisms to relate them. We show that the RDF framework together with the RDF serializations of SW languages forms an indexed institution. This allows the use of Grothendieck institutions to combine Web ontologies described in various languages.

Dorel Lucanu, Yuan Fang Li, Jin Song Dong
Institutional 2-cells and Grothendieck Institutions

We propose to use Grothendieck institutions based on 2-categorical diagrams as a basis for heterogeneous specification. We prove a number of results about colimits and (some weak variants of) exactness. This framework can also be used for obtaining proof systems for heterogeneous theories involving institution semi-morphisms.

Till Mossakowski
Some Varieties of Equational Logic

The application of ideas from universal algebra to computer science has long been a major theme of Joseph Goguen's research, perhaps even the major theme. One strand of this work concerns algebraic datatypes. Recently there has been some interest in what one may call algebraic computation types. As we will show, these are also given by equational theories, if one only understands the notion of equational logic in somewhat broader senses than usual.

Gordon Plotkin
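
The following Haskell sketch illustrates the general idea behind algebraic computation types, under the assumption (not stated in the abstract above) that exceptions are a representative example: a computation type freely generated by operations of an equational theory. The names Exc and safeDiv and the exception representation are hypothetical choices for this illustration, not material from the chapter.

```haskell
-- Illustrative sketch only: exceptions viewed as the computation type
-- freely generated by nullary "raise" operations subject to no equations.
data Exc e a = Return a | Raise e
  deriving Show

-- The free-algebra reading makes Exc e a monad: Return is the unit, and
-- bind substitutes computations for returned values while leaving the
-- operation Raise untouched, exactly as substitution acts on free terms.
instance Functor (Exc e) where
  fmap f (Return a) = Return (f a)
  fmap _ (Raise e)  = Raise e

instance Applicative (Exc e) where
  pure = Return
  Return f <*> x = fmap f x
  Raise e  <*> _ = Raise e

instance Monad (Exc e) where
  Return a >>= k = k a
  Raise e  >>= _ = Raise e

-- A small usage example: integer division that raises on a zero divisor.
safeDiv :: Int -> Int -> Exc String Int
safeDiv _ 0 = Raise "division by zero"
safeDiv x y = Return (x `div` y)
```

Richer theories, such as binary choice with associativity, commutativity and idempotence, give other familiar computation types in the same way.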
Complete Categorical Deduction for Satisfaction as Injectivity

Birkhoff (quasi-)variety categorical axiomatizability results have fascinated many scientists by their elegance, simplicity and generality. The key factor leading to their generality is that equations, conditional or not, can be regarded as special morphisms or arrows in a special category, where their satisfaction becomes injectivity, a simple and abstract categorical concept. A natural and challenging next step is to investigate complete deduction within the same general and elegant framework. We present a categorical deduction system for equations as arrows and show that, under appropriate finiteness requirements, it is complete for satisfaction as injectivity. A straightforward instantiation of our results yields complete deduction for several equational logics, in which conditional equations can be derived as well at no additional cost, as opposed to the typical method using the theorems of constants and of deduction. To our knowledge, this is a new result in equational logics.

Grigore Roşu

Specification and Composition

Extension Morphisms for CommUnity

Superpositions are useful relationships between programs or components in component-based approaches to software development. We study the application of invasive superposition morphisms between components in the architecture design language CommUnity. This kind of morphism allows us to characterise component extension relationships and, in particular, serves an important purpose for enhancing components to implement certain aspects, in the sense of aspect-oriented software development. We show how this kind of morphism combines with the regulative superposition and refinement morphisms on which CommUnity relies, and illustrate the need for and usefulness of extension morphisms for the implementation of aspects, in particular certain fault-tolerance-related aspects, by means of a case study.

Nazareno Aguirre, Tom Maibaum, Paulo Alencar
Non-intrusive Formal Methods and Strategic Rewriting for a Chemical Application

The concept of formal islands allows adding formal features to existing programming languages, features that can be compiled later on into the host language itself, therefore inducing no dependency on the formal language. We illustrate this approach with the TOM system, which provides matching, normalization and strategic rewriting, and we give a formal island implementation for the simulation of a chemical reactor.

Oana Andrei, Liliana Ibanescu, Hélène Kirchner
From OBJ to ML to Coq

Rigorous program development is notoriously difficult because it involves many aspects, among them specification, programming, verification, code reuse, maintenance, and version management. Moreover, these tasks are interdependent, requiring going back and forth between them. In this paper, we are interested in language features and languages that help make the user's life easier when developing programs satisfying their specifications.

Jacek Chrząszcz, Jean-Pierre Jouannaud
Weak Adhesive High-Level Replacement Categories and Systems: A Unifying Framework for Graph and Petri Net Transformations

Adhesive high-level replacement (HLR) systems have recently been introduced as a new categorical framework for graph transformation in the double pushout (DPO) approach. They combine the well-known concept of HLR systems with the concept of adhesive categories introduced by Lack and Sobociński.

While graphs, typed graphs, attributed graphs and several other variants of graphs together with corresponding morphisms are adhesive HLR categories, such that the categorical framework of adhesive HLR systems can be applied, this has been claimed also for Petri nets. In this paper we show that this claim is wrong for place/transition nets and algebraic high-level nets, although several results of the theory for adhesive HLR systems are known to be true for the corresponding Petri net transformation systems.

In fact, we are able to define a weaker version of adhesive HLR categories, called weak adhesive HLR categories, which is still sufficient to show all the results known for adhesive HLR systems. This concept includes not only all kinds of graphs mentioned above, but also place/transition nets, algebraic high-level nets and several other kinds of Petri nets. For this reason weak adhesive HLR systems can be seen as a unifying framework for graph and Petri net transformations.

Hartmut Ehrig, Ulrike Prange
From OBJ to Maude and Beyond

The OBJ algebraic specification language and its Eqlog and FOOPS multiparadigm extensions are revisited from the perspective of the Maude language design. A common thread is the quest for ever more expressive computational logics, on which executable formal specifications of increasingly broader classes of systems can be based. Several recent extensions, beyond Maude itself, are also discussed.

José Meseguer
Constructive Action Semantics in OBJ

Goguen and Malcolm specify semantics of programming languages in OBJ. Here, we consider how the extensibility and reusability of their specifications could be improved. We propose using the notation and modular structure of the Constructive Action Semantics framework in OBJ, and give a simple illustration. The reader is assumed to be familiar with OBJ.

Peter D. Mosses
Horizontal Composability Revisited

We recall the contribution of Goguen and Burstall's 1980 CAT paper and its powerful influence on theories of specification implementation that were emerging at about the same time, via the introduction of the notions of vertical and horizontal composition of implementations. We then give a different view of implementation which we believe provides a more adequate reflection of the rather subtle interplay between implementation, specification structure and program structure.

Donald Sannella, Andrzej Tarlecki
Composition by Colimit and Formal Software Development

Goguen emphasized long ago that colimits are how to compose systems [7]. This paper corroborates and elaborates Goguen’s vision by presenting a variety of situations in which colimits can be mechanically applied to support software development by refinement. We illustrate the use of colimits to support automated datatype refinement, algorithm design, aspect weaving, and security policy enforcement.

Douglas R. Smith

Behaviour and Formal Languages

Proving Behavioral Refinements of COL-specifications

The COL institution (constructor-based observational logic) has been introduced as a formal framework to specify both generation- and observation-oriented properties of software systems. In this paper we consider behavioral refinement relations between COL-specifications taking into account implementation constructions. We propose a general strategy for proving the correctness of such refinements by reduction to (standard) first-order theorem proving with induction. Technically our strategy relies on appropriate proof rules and on a lifting construction to encode the reachability and observability notions of the COL institution.

Michel Bidoit, Rolf Hennicker
The Reactive Engine for Modular Transducers

This paper explains the design of the second release of the Zen toolkit [5–7]. It presents a notion of reactive engine which simulates finite-state machines represented as shared aums [8]. We show that it yields a modular interpreter for finite-state machines described as local transducers. For instance, in the manner of Berry and Sethi, we define a compiler of regular expressions into a scheduler for the reactive engine, chaining through aums labeled with phases associated with the letters of the regular expression. This gives a modular composition scheme for general finite-state machines.

Many variations of this basic idea may be put to use according to circumstances. The simplest one is when aums are reduced to dictionaries, i.e. to (minimalized) acyclic deterministic automata recognizing finite languages. One may then proceed to add supplementary structure to the aum algebra, namely non-determinism, loops, and transduction. Such additional choice points require fitting some additional control to the reactive engine. Further parameters are required for some functionalities. For instance, the local word access stack is handy as an argument to the output routine in the case of transducers. Internal virtual addresses demand the full local state access stack for their interpretation.

A characteristic example is provided: a complete analyser for compound substantives. It is an abstraction from a modular version of the Sanskrit segmenter presented in [9]. This improved segmenter uses a regular relation condition relating the phases of morphology generation and enforcing the correct geometry of morphemes. Thus we obtain compound nouns from iic*.(noun+iic.ifc), where iic and ifc are respectively the prefix and suffix substantival forms for compound formation.

Gérard Huet, Benoît Razet
A Bialgebraic Review of Deterministic Automata, Regular Expressions and Languages

This paper reviews the classical theory of deterministic automata and regular languages from a categorical perspective. The basis is formed by Rutten's description of the Brzozowski automaton structure in a coalgebraic framework. We enlarge the framework to a so-called bialgebraic one, by including algebras together with suitable distributive laws connecting the algebraic and coalgebraic structure of regular expressions and languages. This culminates in a reformulated proof, via finality, of Kozen's completeness result. It yields a complete axiomatisation of observational equivalence (bisimilarity) on regular expressions. We suggest that this situation is paradigmatic for (theoretical) computer science as the study of “generated behaviour”.

Bart Jacobs
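
As a hedged companion to the abstract above, the following Haskell sketch shows the standard Brzozowski construction it builds on: every regular expression carries a deterministic automaton structure, with nullability as the output and derivatives as the transitions. This is the textbook construction, not code from the chapter; the type and function names are hypothetical.

```haskell
-- Minimal sketch of the Brzozowski automaton structure on regular
-- expressions: each expression is a state, `nullable` says whether the
-- state accepts, and `deriv` gives the transition on a character.
data Re
  = Empty          -- the empty language
  | Eps            -- the language containing only the empty word
  | Chr Char
  | Alt Re Re      -- union
  | Seq Re Re      -- concatenation
  | Star Re
  deriving Show

nullable :: Re -> Bool
nullable Empty     = False
nullable Eps       = True
nullable (Chr _)   = False
nullable (Alt r s) = nullable r || nullable s
nullable (Seq r s) = nullable r && nullable s
nullable (Star _)  = True

deriv :: Char -> Re -> Re
deriv _ Empty     = Empty
deriv _ Eps       = Empty
deriv c (Chr d)   = if c == d then Eps else Empty
deriv c (Alt r s) = Alt (deriv c r) (deriv c s)
deriv c (Seq r s)
  | nullable r    = Alt (Seq (deriv c r) s) (deriv c s)
  | otherwise     = Seq (deriv c r) s
deriv c (Star r)  = Seq (deriv c r) (Star r)

-- Acceptance: run the coalgebra along the word, then test nullability.
-- Example: accepts (Star (Chr 'a')) "aaa" == True
accepts :: Re -> String -> Bool
accepts r = nullable . foldl (flip deriv) r
```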
Sheaves and Structures of Transition Systems

We present a way of viewing labelled transition systems as sheaves: these can be thought of as systems of observations over a topology, with the property that consistent local observations can be pasted together into global observations. We show how this approach extends to hierarchical structures of labelled transition systems, where behaviour is taken as a limit construction. Our examples show that this is particularly effective when transition systems have structured states.

Grant Malcolm
Uniform Functors on Sets

This paper studies uniformity conditions for endofunctors on sets following Aczel [1], Turi [21], and others. The “usual” functors on sets are uniform in our sense, and assuming the Anti-Foundation Axiom AFA, a uniform functor H has the property that its greatest fixed point H* is a final coalgebra whose structure is the identity map. We propose a notion of uniformity whose definition involves notions from recent work in coalgebraic recursion theory: completely iterative monads and completely iterative algebras (cias). Among our new results is one which states that for a uniform H, the entire set-theoretic universe V is a cia: the structure is the inclusion of HV into the universe V itself.

Lawrence S. Moss
An Algebraic Approach to Regular Sets

In recent years there has been increasing interest in regular sets over different kinds of elements. The introduction of XML has led to investigations of regular sets of both ranked and unranked trees, and also of attributed unranked trees.

The aim of this short note is to introduce a uniform notion of regularity. If instantiated for strings, ranked trees and unranked trees, it coincides with the existing concepts, and it can easily be extended to arbitrary data types. This leads to a natural notion of regularity for different kinds of attributed unranked trees, and also to regular sets of structured elements which have not yet been investigated. The approach takes advantage of the freeness constraints and parametric abstract data types offered by the algebraic specification language Casl.
Horst Reichel

Models, Deduction, and Computation

Elementary Algebraic Specifications of the Rational Complex Numbers

From the range of techniques available for algebraic specifications we select a core set of features which we define to be the elementary algebraic specifications. These include equational specifications with hidden functions and sorts, and initial algebra semantics. We give an elementary equational specification of the field operations and conjugation operator on the rational complex numbers ℚ(i) and discuss some open problems.

Jan A. Bergstra, John V. Tucker
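
For readers who want a concrete handle on the structure being specified, here is a small Haskell sketch of the field operations and conjugation on the Gaussian rationals ℚ(i). It is only an executable illustration, under the convention (an assumption for this sketch) that the inverse of zero is zero so that all operations are total; it is not the chapter's equational specification, and all names are hypothetical.

```haskell
-- Executable illustration of the field operations and conjugation on Q(i),
-- represented as pairs of rationals (real part, imaginary part).
data QI = QI Rational Rational
  deriving (Eq, Show)

add, mul :: QI -> QI -> QI
add (QI a b) (QI c d) = QI (a + c) (b + d)
mul (QI a b) (QI c d) = QI (a*c - b*d) (a*d + b*c)   -- (a+bi)(c+di)

conj :: QI -> QI
conj (QI a b) = QI a (negate b)

-- Multiplicative inverse via the norm a^2 + b^2; the inverse of zero is
-- set to zero by convention, keeping the operation total.
inv :: QI -> QI
inv (QI a b)
  | n == 0    = QI 0 0
  | otherwise = QI (a / n) (negate b / n)
  where n = a*a + b*b
```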
From Chaos to Undefinedness

The semantic and logical treatment of recursion and of recursive definitions in computer science, in particular in requirements specification, in programming languages and in related formalisms such as the λ-calculus or recursively defined functions, is one of the key issues of the semantic theory of programming and programming languages. As was recognised already in the early days of the theory of programming, there are several options for formalising and giving a theory of the semantics of recursive function declarations. In different branches of computer science, logic, and mathematics, various techniques for dealing with the semantics of recursion have been developed and established. We outline, compare, and briefly discuss the advantages and disadvantages of these different possibilities, illustrate them with a simple running example, and relate these approaches to each other.

Manfred Broy
Completion Is an Instance of Abstract Canonical System Inference

Abstract canonical systems and inference (ACSI) were introduced to formalize the intuitive notions of good proof and good inference appearing typically in first-order logic or in Knuth-Bendix like completion procedures.

Since this abstract framework is intended to be generic, it is of fundamental interest to show its adequacy to represent the main systems of interest. This has been done for ground completion (where all equational axioms are ground) but was still an open question for the general completion process.

By showing that standard completion is an instance of the ACSI framework, we close the question. For this purpose, two proof representations, proof terms and proofs by replacement, are compared in order to build a proof ordering that provides an instantiation adapted to the abstract canonical system framework.

Guillaume Burel, Claude Kirchner
Eliminating Dependent Pattern Matching

This paper gives a reduction-preserving translation from Coquand's dependent pattern matching [4] into a traditional type theory [11] with universes, inductive types and relations, and the axiom K [22]. This translation serves as a proof of termination for structurally recursive pattern matching programs, provides an implementable compilation technique in the style of functional programming languages, and demonstrates the equivalence with a more easily understood type theory.

Healfdene Goguen, Conor McBride, James McKinna
Iterative Lexicographic Path Orders

We relate Kamin and Lévy's original presentation of lexicographic path orders (LPO), using an inductive definition, to a presentation, which we will refer to as iterative lexicographic path orders (ILPO), based on Bergstra and Klop's definition of recursive path orders by way of an auxiliary term rewriting system.

Jan Willem Klop, Vincent van Oostrom, Roel de Vrijer
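
To make the starting point of the abstract above concrete, here is a Haskell sketch of the usual inductive (Kamin–Lévy style) definition of the lexicographic path order on first-order terms. It is a generic textbook rendering, not the chapter's iterative presentation; the precedence parameter gtF and all other names are hypothetical.

```haskell
-- First-order terms over string-named variables and function symbols.
data Term = Var String | App String [Term]
  deriving (Eq, Show)

-- Free variables occurring in a term.
vars :: Term -> [String]
vars (Var x)    = [x]
vars (App _ ts) = concatMap vars ts

-- lpoGt gtF s t decides s >_lpo t, parameterised by a strict precedence
-- gtF on function symbols.
lpoGt :: (String -> String -> Bool) -> Term -> Term -> Bool
lpoGt _   (Var _)      _ = False            -- a variable is greater than nothing
lpoGt gtF s@(App f ss) t = case t of
  Var x -> x `elem` vars s                  -- s > x  iff  x occurs in s
  App g ts
    | any (\si -> si == t || lpoGt gtF si t) ss -> True  -- some argument of s already covers t
    | f `gtF` g -> all (lpoGt gtF s) ts                  -- head of s has strictly higher precedence
    | f == g    -> all (lpoGt gtF s) ts && lexGt ss ts   -- equal heads: lexicographic on arguments
    | otherwise -> False
  where
    lexGt (x:xs) (y:ys) | x == y    = lexGt xs ys
                        | otherwise = lpoGt gtF x y
    lexGt _      _      = False
```

A concrete precedence is supplied per signature, for instance lpoGt (\f g -> f == "mul" && g == "add") to make multiplication dominate addition.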
A Functorial Framework for Constraint Normal Logic Programming

The semantic constructions and results for definite programs do not extend when dealing with negation. The main problem is related to a well-known problem in the area of algebraic specification: if we fix a constraint domain as a given model, its free extension by means of a set of Horn clauses defining a set of new predicates is semicomputable. However, if the language of the extension is richer than Horn clauses, its free extension (if it exists) is not necessarily semicomputable. In this paper we present a framework that allows us to deal with these problems in a novel way. This framework is based on two main ideas: a reformulation of the notion of constraint domain and a functorial presentation of our semantics. In particular, the semantics of a logic program P is defined in terms of three functors $({\mathcal {OP}}_{P},{\mathcal {ALG}}_{P},{\mathcal {LOG}}_{P})$ that apply to constraint domains and provide the operational, the least fixpoint and the logical semantics of P, respectively. The idea is that the application of ${\mathcal {OP}}_{P}$ to a specific constraint solver provides the operational semantics of P that uses this solver; the application of ${\mathcal {ALG}}_{P}$ to a specific domain provides the least fixpoint of P over this domain; and the application of ${\mathcal {LOG}}_{P}$ to a theory of constraints provides the logical theory associated to P. We prove that these three functors are in some sense equivalent.

Paqui Lucio, Fernando Orejas, Edelmira Pasarella, Elvira Pino
A Stochastic Theory of Black-Box Software Testing

We introduce a mathematical framework for black-box software testing of functional correctness, based on concepts from stochastic process theory. This framework supports the analysis of two important aspects of testing, namely: (i) coverage, probabilistic correctness and reliability modelling, and (ii) test case generation. Our model corrects some technical flaws in previous models of probabilistic correctness found in the literature. It also provides insight into the design of new testing strategies, which can be more efficient than random testing.

Karl Meinke
Some Tips on Writing Proof Scores in the OTS/CafeOBJ Method

The OTS/CafeOBJ method is an instance of the proof score approach to systems analysis, which has been mainly developed by researchers in the OBJ community. We describe some tips on writing proof scores in the OTS/CafeOBJ method and use a mutual exclusion protocol to exemplify them. We also argue the soundness of proof scores in the OTS/CafeOBJ method.

Kazuhiro Ogata, Kokichi Futatsugi
Drug Interaction Ontology (DIO) and the Resource-Sensitive Logical Inferences

In this paper, we propose a formulation for inference rules in Drug Interaction Ontology (DIO). Our formulation for inference rules is viewed from the standpoint of process-description. The relations in DIO are now described as resource-sensitive linear logical implications. The compositional reasoning on certain drug-interactions discussed in our previous work on DIO is represented as a construction of a linear logical proof. As examples of our formulation, we use some anti-cancer drug interactions.

Mitsuhiro Okada, Yutaro Sugimoto, Sumi Yoshikawa, Akihiko Konagaya
Backmatter
Metadata
Title
Algebra, Meaning, and Computation
Edited by
Kokichi Futatsugi
Jean-Pierre Jouannaud
José Meseguer
Copyright year
2006
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-35464-2
Print ISBN
978-3-540-35462-8
DOI
https://doi.org/10.1007/11780274