2017 | Book

Concurrency, Security, and Puzzles

Essays Dedicated to Andrew William Roscoe on the Occasion of His 60th Birthday


About this book

This festschrift was written in honor of Andrew William (Bill) Roscoe on the occasion of his 60th birthday, and features tributes by Sir Tony Hoare, Stephen Brookes, and Michael Wooldridge. Bill Roscoe is an international authority in process algebra, and has been the driving force behind the development of the FDR refinement checker for CSP. He is also world renowned for his pioneering work in analyzing security protocols, modeling information flow, human-interactive security, and much more.
Many of these areas are reflected in the 15 invited research articles in this festschrift, and in the presentations at the "BILL-60" symposium held in Oxford, UK, on January 9 and 10, 2017.

Table of Contents

Stealthy Protocols: Metrics and Open Problems
This paper surveys methods that could be used to support stealthy communication over wired and wireless networks, together with techniques for evaluating them. By stealthy communication we mean communication using channels that guarantee that the nature of the communication, or even the fact that communication is taking place at all, is hidden. Although stealthy communication and information hiding have been studied from a number of different points of view, e.g. image steganography, network covert channels, and covert wireless communication, little has been done to tie these different threads together and to see how the different branches of stealthy communication research can inform each other. In this paper we take the first steps toward remedying this deficiency. We identify open problems, point out gaps, and indicate directions for further research.
Olga Chen, Catherine Meadows, Gautam Trivedi
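As a toy illustration of one family of channels this survey covers, the sketch below encodes bits in inter-packet delays, the classic timing-based covert channel. The function names and the `base`/`delta` timing parameters are purely illustrative assumptions, not taken from the paper.

```python
def encode(bits, base=0.05, delta=0.05):
    """Encode each bit as an inter-packet delay: 0 -> base seconds, 1 -> base+delta."""
    return [base + delta * b for b in bits]

def decode(delays, base=0.05, delta=0.05):
    """Recover bits by thresholding the observed delays at the midpoint."""
    threshold = base + delta / 2
    return [1 if d > threshold else 0 for d in delays]

# A passive observer who knows the scheme recovers the message exactly;
# real channels must also contend with network jitter, which this sketch ignores.
bits = [1, 0, 1, 1, 0]
assert decode(encode(bits)) == bits
```

Evaluating such a channel, as the paper discusses, means measuring both its capacity and how detectable the modulated delays are to a warden.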
A Specification Theory of Real-Time Processes
This paper presents an assume-guarantee specification theory (also known as an interface theory [11]) for the modular synthesis and verification of real-time processes with critical timing constraints. Four operations, namely conjunction, disjunction, parallel composition, and quotient, are defined over specifications, drawing inspiration from classic specification theories such as the refinement calculus [4, 19]. We show that a congruence (or pre-congruence) characterised by a trace-based semantics [14] captures exactly the notion of substitutivity (or refinement) between specifications.
Chris Chilton, Marta Kwiatkowska, Faron Moller, Xu Wang
Towards Verification of Cyber-Physical Systems with UTP and Isabelle/HOL
In this paper, we outline our vision for building verification tools for Cyber-Physical Systems based on Hoare and He’s Unifying Theories of Programming (UTP) and interactive proof technology in Isabelle/HOL. We describe our mechanisation and explain some of the design decisions that we have taken to get a convenient and smooth implementation. In particular, we describe our use of lenses to encode state. We illustrate our work with an example UTP theory and describe the implementation of three foundational theories: designs, reactive processes, and the hybrid relational calculus. We conclude by reflecting on how tools are linked by unifying theories.
Simon Foster, Jim Woodcock
FDR: From Theory to Industrial Application
FDR is the best-known verification tool for CSP. Since its early beginnings in the 1980s, it has developed into one of the world’s fastest model checking tools. Over the years, FDR has made a significant impact across academic subject areas, most notably in cyber-security, as well as across industrial domains such as high-tech manufacturing, telecommunications, aerospace, and defence. In honour of Bill Roscoe’s 60th birthday, this paper provides a brief history of FDR, together with a collection of notable examples of FDR’s practical impact in these areas.
Thomas Gibson-Robinson, Guy Broadfoot, Gustavo Carvalho, Philippa Hopcroft, Gavin Lowe, Sidney Nogueira, Colin O’Halloran, Augusto Sampaio
Information Flow, Distributed Systems, and Refinement, by Example
Non-interference is one of the foundational notions of security, stretching back to Goguen and Meseguer [3]. Roughly, a set of activities C is non-interfering with a set D if any possible behavior at D is compatible with anything that could have occurred at C. One also speaks of “no information flow” from C to D in this case. Many hands further developed the idea and its variants (e.g. [12, 15]), which also flourished within the process calculus context [1, 2, 6, 13]. A.W. Roscoe contributed a characteristically distinctive idea to this discussion, in collaboration with J. Woodcock and L. Wulf. The idea was that a system is secure for flow from C to D when, after hiding behaviors at the source C, the destination D experiences the system as deterministic [8, 11]. In the CSP tradition, a process is deterministic if, whenever it can refuse an event a after engaging in a sequence t of events, it always refuses a after engaging in t [9].
Joshua D. Guttman
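The determinism-based criterion described above can be sketched for finite, divergence-free transition systems: hide the high-level events (making them internal), then check whether any event is both performable after some trace and refusable from a stable state reached by that same trace. The encoding below is a hypothetical illustration of that check, not FDR's actual algorithm.

```python
from collections import deque

def tau_closure(states, trans):
    """Close a set of states under internal (tau) moves; tau is labelled None."""
    stack, seen = list(states), set(states)
    while stack:
        s = stack.pop()
        for (a, t) in trans.get(s, []):
            if a is None and t not in seen:
                seen.add(t)
                stack.append(t)
    return frozenset(seen)

def is_deterministic(trans, init):
    """CSP-style determinism check for a finite, divergence-free LTS.

    trans maps each state to a list of (event, target) pairs; event None is tau
    (e.g. a hidden high-level action). The process is nondeterministic iff after
    some trace an event is performable somewhere yet refusable at a stable state.
    """
    start = tau_closure({init}, trans)
    seen, queue = {start}, deque([start])
    while queue:
        macro = queue.popleft()
        # events performable from some state reachable after the current trace
        enabled = {a for s in macro for (a, _) in trans.get(s, []) if a is not None}
        for s in macro:
            moves = trans.get(s, [])
            stable = all(a is not None for (a, _) in moves)  # no pending tau
            if stable and enabled - {a for (a, _) in moves}:
                return False  # some event both performable and refusable here
        for a in enabled:
            nxt = tau_closure(
                {t for s in macro for (e, t) in trans.get(s, []) if e == a}, trans)
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# Hypothetical system: a high action has been hidden (None). A low observer who
# sees 'l' refused learns that the hidden action occurred -- information flows.
insecure = {'s0': [(None, 's1'), ('l', 's2')], 's1': [], 's2': []}
# Here 'l' stays available regardless of the hidden action, so the low view is
# deterministic and nothing is leaked.
secure = {'s0': [(None, 's1'), ('l', 's2')], 's1': [('l', 's3')], 's2': [], 's3': []}
```

Under Roscoe, Woodcock, and Wulf's criterion, `is_deterministic(secure, 's0')` holds while `is_deterministic(insecure, 's0')` fails, matching the intuition that the insecure system leaks whether the hidden event happened.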
Abstractions for Transition Systems with Applications to Stubborn Sets
Partial order reduction covers a range of techniques based on eliminating unnecessary transitions when generating a state space. Abstractions, on the other hand, replace sets of states of a system with abstract representatives in order to create a smaller state space. This article explores how stubborn sets and abstraction can be combined. We give examples to build intuition and expand on some recent results. We provide a classification of abstractions and give some novel results on what is needed to combine abstraction and partial order reduction in a sound way.
Henri Hansen
A Hybrid Relational Modelling Language
Hybrid systems are usually composed of physical components with continuous variables and discrete control components, and the system state evolves over time according to interacting laws of discrete and continuous dynamics. Combinations of computation and control can lead to very complicated system designs. We treat hybrid models more explicitly by proposing a hybrid relational calculus, in which both clocks and signals are present to coordinate the activities of parallel components of hybrid systems. This paper proposes a hybrid relational modelling language with a set of novel combinators that support complex combinations of both testing and signal-reaction behaviours to model the physical world and its interaction with the control program. We provide a denotational semantics (based on the hybrid relational calculus) for the language, and explore healthiness conditions that deal with time and signals as well as the status of the program. A number of small examples are given throughout the paper to demonstrate the usage of the language and its semantics.
He Jifeng, Li Qin
What Makes Petri Nets Harder to Verify: Stack or Data?
We show how the yardstick construction of Stockmeyer, also developed as counter bootstrapping by Lipton, can be adapted and extended to obtain new lower bounds for the coverability problem for two prominent classes of systems based on Petri nets: Ackermann-hardness for unordered data Petri nets, and Tower-hardness for pushdown vector addition systems.
Ranko Lazić, Patrick Totzke
Analysing Lock-Free Linearizable Datatypes Using CSP
We consider how we can use the process algebra CSP and the model checker FDR in order to obtain assurance about the correctness of concurrent datatypes. In particular, we perform a formal analysis of a concurrent queue based on a linked list of nodes. We model the queue in CSP and analyse it using FDR. We capture two important properties using CSP, namely linearizability and lock-freedom.
Gavin Lowe
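The linearizability property analysed in this paper can be sketched by brute force: a concurrent history of queue operations is linearizable if some total order of the operations, consistent with their real-time precedence, replays correctly against a sequential FIFO queue. The Python sketch below is an illustrative stand-in for the CSP/FDR analysis (its history format and names are invented here), and is practical only for tiny histories.

```python
from itertools import permutations

def linearizable(history):
    """Brute-force linearizability check against a sequential FIFO queue.

    history: list of (name, op, arg, result, invoke, respond) tuples, where
    op is 'enq' or 'deq', and invoke/respond are timestamps. An operation
    that responded before another was invoked must precede it.
    """
    def respects_realtime(order):
        for i, a in enumerate(order):
            for b in order[i + 1:]:
                if b[5] < a[4]:  # b responded before a was invoked, yet ordered after a
                    return False
        return True

    for order in permutations(history):
        if not respects_realtime(order):
            continue
        q, ok = [], True
        for (_, op, arg, result, _, _) in order:
            if op == 'enq':
                q.append(arg)
            else:  # 'deq'
                got = q.pop(0) if q else None
                if got != result:
                    ok = False
                    break
        if ok:
            return True  # found a valid linearization
    return False

# Two overlapping enqueues may linearize in either order, so observing 2 first is fine.
ok_hist = [('A', 'enq', 1, None, 0, 3), ('B', 'enq', 2, None, 1, 4),
           ('C', 'deq', None, 2, 5, 6)]
# Here only 1 was ever enqueued, so a dequeue returning 2 cannot be linearized.
bad_hist = [('A', 'enq', 1, None, 0, 1), ('B', 'deq', None, 2, 2, 3)]
```

The CSP encoding in the paper avoids this factorial enumeration: FDR checks a refinement between the concurrent implementation and a sequential specification, and lock-freedom is captured as a separate behavioural property.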
Discrete Random Variables Over Domains, Revisited
We revisit the construction of discrete random variables over domains from [15] and show how Hoare’s “normal termination” symbol \(\checkmark \) can be used to achieve a more expressive model. The result is a natural model of flips of a coin that supports discrete and continuous (sub)probability measures. This defines a new random variables monad on BCD, the category of bounded complete domains, that can be used to augment semantic models of demonic nondeterminism with probabilistic choice. It is the second such monad, the first being Barker’s monad for randomized choice [3]. Our construction differs from Barker’s monad, because the latter requires the source of randomness to be shared across multiple users. The monad presented here allows each user to access a source of randomness that is independent of the sources of randomness available to other users. This requirement is useful, e.g., in models of crypto-protocols.
Michael Mislove
A Demonic Lattice of Information
Landauer and Redmond’s Lattice of Information was an early and influential formalisation of the pure structure of security [8]: a partial order was defined for information flow from a hidden state. In modern terms we would say that more security refines less security. In Landauer’s deterministic setting [op. cit.], the refinement order is a lattice.
Very recently [3, 9] a similar approach has been taken to purely probabilistic systems, and there too a refinement order can be defined; but it is not a lattice [12].
In between deterministic and probabilistic is demonic, where behaviour is not deterministic but also not quantifiable. We show that our own earlier approach to this [15, 16] fits into the same pattern as deterministic and probabilistic, and illustrate that with results concerning compositionality, testing, soundness and completeness. Finally, we make some remarks about source-level reasoning.
Carroll Morgan
A Brief History of Security Protocols
The universe seethes with protocols, from the interactions of elementary particles (Feynman diagrams) to the power plays of nation states. Security protocols allow remote entities to interact safely, and as such they form the glue that holds the information society together. In this chapter we give an overview of the evolution of security protocols, from the Needham-Schroeder Secret Key protocol to quantum and post-quantum cryptography, and the tools and techniques used to understand and analyse them with particular emphasis on the major and seminal role played by Bill Roscoe in this history.
Peter Y. A. Ryan
More Stubborn Set Methods for Process Algebras
Six stubborn set methods for computing reduced labelled transition systems are presented. Two of them preserve the traces, and one is tailored for on-the-fly verification of safety properties. The rest preserve tree failures, fair testing equivalence, or divergence traces. Two of the methods are entirely new; three build on recent ideas, with the adaptation to the process-algebraic setting with non-deterministic actions being new; and one is recent but slightly generalized. Most of the methods address problems in earlier solutions to the so-called ignoring problem. The correctness of each method is proven, and efficient implementation is discussed.
Antti Valmari
A Branching Time Model of CSP
I present a branching time model of CSP that is finer than all other models of CSP proposed thus far. It is obtained by taking a semantic equivalence from the linear time – branching time spectrum, namely divergence-preserving coupled similarity, and showing that it is a congruence for the operators of CSP. This equivalence belongs to the bisimulation family of semantic equivalences, in the sense that on transition systems without internal actions it coincides with strong bisimilarity. Nevertheless, enough of the equational laws of CSP remain to obtain a complete axiomatisation for closed, recursion-free terms.
Rob van Glabbeek
Virtualization Based Development
Virtualization involves replacing all elements of an embedded development environment, including paper specifications, target hardware, test instruments, and plant and equipment, with software representations, for an all-in-software path to creating working executable versions of an embedded system or of any of its components or sub-systems. A virtualization based development process leverages virtualization in all phases of development. It enables acceleration by allowing development tasks to start earlier and by proceeding at a higher pace and on a wider front, for example through automation of development and test, optimization of design, increased test scope and coverage, and calibration procedures that are not feasible with real hardware implementations. This paper outlines the concepts and some of the applications of virtualization based development, defines the technology, tool and process requirements, and introduces VLAB™ as a tool and operating environment for virtualization based development of embedded software and systems. To make these concepts as concrete and easily understood as possible, we focus the scope of the paper on their application to the development of automotive controller modules (ECUs) for modern engine systems control.
Jay Yantchev, Atanas Parashkevov
Editors

Thomas Gibson-Robinson
Philippa Hopcroft
Ranko Lazić
