
2024 | Book

The French School of Programming


About this book

The French School of Programming is a collection of insightful discussions of programming and software engineering topics by some of the most prestigious names in French computer science. The authors include several of the originators of such widely acclaimed inventions as abstract interpretation, the Caml, OCaml and Eiffel programming languages, the Coq proof assistant, agents, and modern testing techniques.

The book is divided into four parts: Software Engineering (A), Programming Language Mechanisms and Type Systems (B), Theory (C), and Language Design and Programming Methodology (D). They are preceded by a Foreword by Bertrand Meyer, the editor of the volume, a Preface by Jim Woodcock providing an outsider’s appraisal of the French school’s contribution, and an overview chapter by Gérard Berry, recalling his own intellectual journey.

In part A, Chapter 2, by Marie-Claude Gaudel, presents a 30-year perspective on the evolution of testing, starting with her own seminal work. In Chapter 3, Michel Raynal covers distributed computing with an emphasis on simplicity. Chapter 4, by Jean-Marc Jézéquel, former director of IRISA, presents the evolution of modeling, from CASE tools to SLE and Machine Learning. Chapter 5, by Joëlle Coutaz, is a comprehensive review of the evolution of Human-Computer Interaction.

In part B, Chapter 6, by Jean-Pierre Briot, describes the sequence of abstractions that led to the concept of agent. Chapter 7, by Pierre-Louis Curien, is a personal account of a journey through fundamental concepts of semantics, syntax and types. In Chapter 8, Thierry Coquand presents “some remarks on dependent type theory”.

Part C begins with Patrick Cousot’s personal historical perspective on his well-known creation, abstract interpretation, in Chapter 9. Chapter 10, by Jean-Jacques Lévy, is devoted to tracking redexes in the Lambda Calculus. The final chapter of that part, Chapter 11, by Jean-Pierre Jouannaud, presents advances in rewriting systems, specifically the confluence of terminating rewriting computations.

Part D contains two longer contributions. Chapter 12 is a review by Giuseppe Castagna of a broad range of programming topics relying on union, intersection and negation types. In the final chapter, Bertrand Meyer covers “ten choices in language design” for object-oriented programming, distinguishing between “right” and “wrong” resolutions of these issues and explaining the rationale behind Eiffel’s decisions.

This book will be of special interest to anyone who cares about modern views of programming — on such topics as programming language design, the relationship between programming and type theory, object-oriented principles, distributed systems, testing techniques, rewriting systems, human-computer interaction and software verification, among others — and about the insights of a brilliant group of innovators in the field.

Table of Contents

Frontmatter
Chapter 1. The French School of Programming: A Personal View
Abstract
Although France has never been a world leader in the software industry, computer science research there has traditionally been of worldwide standing, thanks to the universities, the CNRS and Inria, and to some researchers abroad. This book is devoted to important parts of the French work on programming languages, essential since program texts are the only way to drive computers. That research almost always tried to link mathematical rigor with practical concerns—an old French tradition. This has been particularly true for the development of new and clean programming languages and formal verification systems, often created and linked together on the solid base of their mathematical semantics and theorems. Such formal semantics served, and still serve, as a consistency guide throughout design and implementation, instead of being mere add-ons produced by other people after the fact, as is too often done with much less effectiveness. This introductory chapter surveys the 12 subsequent chapters, each dedicated to a particular technical approach or language and written by its team leaders. It ends with my telling, in a nontechnical way, how my own 50-year career dealt with the creation of a few original and mathematically well-studied theoretical frameworks and practical languages. The last of them, Esterel, led in 2000 to the creation of a successful company that has become a world leader in the field of certified software for safety-critical reactive systems (it also led to some success in industrial hardware design and verification, a successful application domain unfortunately killed by the 2008 financial crisis). Of course, this is far from a single-man story, and I also try to tell the associated social story with some humor, because it was a lot of fun for me and, I hope, for my groups.
Gérard Berry

Software Engineering

Frontmatter
Chapter 2. “Testing Can Be Formal Too”: 30 Years Later
Abstract
This chapter relates a series of works that contributed to the recognition of the use of formal specifications when testing software.
After presenting a first set of results on test methods based on algebraic data types, generalisations to other formal approaches are described.
The outcome of several case studies is reported and, as a conclusion, the place of formal testing within software validation and verification activities is discussed. A concrete example is developed in an appendix.
Marie-Claude Gaudel
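To make the approach concrete, here is a minimal sketch of my own (not taken from the chapter; the STACK signature and the chosen ground instances are illustrative assumptions), in which axioms of an algebraic specification, instantiated with ground terms, become executable test cases against a module under test:

  (* Testing from an algebraic specification: each axiom, instantiated
     with ground terms, yields a test case for the implementation S. *)
  module type STACK = sig
    type t
    val empty : t
    val push : int -> t -> t
    val top : t -> int   (* assumed defined on non-empty stacks *)
    val pop : t -> t
  end

  module Axiom_tests (S : STACK) = struct
    (* Axiom: top (push x s) = x, checked on a few ground instances. *)
    let top_push_ok =
      List.for_all (fun (x, s) -> S.top (S.push x s) = x)
        [ (0, S.empty); (3, S.push 1 S.empty) ]

    (* Axiom: pop (push x s) = s, observed through top since t is abstract. *)
    let pop_push_ok =
      S.top (S.pop (S.push 2 (S.push 1 S.empty))) = 1
  end

Deciding which ground instances suffice is exactly the kind of question such a formal framework is designed to address.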
Chapter 3. A Short Visit to Distributed Computing Where Simplicity Is Considered a First Class Property
Abstract
Similarly to the injunction “Know thyself” engraved on the frontispiece of Delphi’s temple more than two millennia ago, the sentence “Make it as simple as possible, but not simpler” (attributed to Einstein) should be engraved on the entrance door of all research laboratories. In the long run, what remains of our work? Mathematicians and physicists have formulas. We have algorithms! The main issue with simplicity is that it is often confused with triviality, but as stated by A. J. Perlis, the recipient of the first Turing Award, “Simplicity does not precede complexity, but follows it”.
Considering my research domain, namely distributed computing, this chapter surveys topics I was interested in during my career and presents a few results, approaches, and algorithms I designed (with colleagues). The design of these algorithms strove (maybe unsuccessfully) to treat concision, generality, simplicity, and elegance as first-class properties. In other words, this chapter does not claim objectivity.
Michel Raynal
Chapter 4. Modeling: From CASE Tools to SLE and Machine Learning
Abstract
Finding better ways to handle software complexity (both inherent and accidental) is the holy grail for a significant part of the software engineering community, and especially for the Model Driven Engineering (MDE) one. To that end, plenty of techniques have been proposed, leading to a succession of trends in model-based software development paradigms over the last decades. While these trends seem to pop up from nowhere, we claim in this chapter that most of them actually stem from trying to get a better grasp on the variability of software. We revisit the history of MDE, trying to identify, for each trend, the main aspect of variability it sought to address when it was introduced. We conclude with the variability challenges of our time, including the variability of data, which leads to machine learning of models.
Jean-Marc Jézéquel
Chapter 5. At the Confluence of Software Engineering and Human-Computer Interaction: A Personal Account
Abstract
In this chapter, I review how the fields of Human-Computer Interaction (HCI) and Computer Science have come together over the last 60 years to create and support novel forms of interaction. We see how interaction models have progressively incorporated human skills and abilities, as well as the physical and social properties taken from the real world. I organize this evolution into three periods—pre-HCI, seminal HCI, ubiquitous HCI. Each of these periods is illustrated with key reference works that have influenced my own research, as well as with contributions of French researchers to the field.
Joëlle Coutaz

Programming Language Mechanisms and Type Systems

Frontmatter
Chapter 6. From Procedures, Objects, Actors, Components, Services, to Agents
A Comparative Analysis of the History and Evolution of Programming Abstractions
Abstract
The objective of this chapter is to offer a retrospective analysis of the evolution of programming abstractions, from procedures, objects, actors, components, and services up to agents, by placing them within a general historical perspective. A common frame of reference with three axes/dimensions is chosen: action selection at the level of one entity, coupling flexibility between entities, and abstraction level. We may indeed observe a continuous quest for higher flexibility (through notions such as late binding, or reification of connections) and for higher levels of abstraction. The concepts of components, services and agents share some common objectives (notably, software modularity and reconfigurability), with multi-agent systems raising further concepts of autonomy and coordination, notably through the notion of self-organization and the use of knowledge. We hope that this analysis helps to highlight some of the basic forces motivating the progress of programming abstractions, and that it may thereby provide some seeds for reflection about future programming abstractions.
Jean-Pierre Briot
Chapter 7. Semantics and Syntax, Between Computer Science and Mathematics
Abstract
This text recounts my scientific itinerary from the late 1970s up to now, as I view it today, as well as the context in which it took place. The views expressed here are of course quite personal, and extremely partial with regard to the global landscape of research on programming languages in France and in the world. My research sits mostly at the theoretical end of the spectrum of computer science. As a matter of fact, my scientific journey now takes place mainly in homotopical algebra and higher category theory, with an eye on their recently unveiled links with type theory.
Pierre-Louis Curien
Chapter 8. Some Remarks About Dependent Type Theory
Abstract
The goal of this chapter is to describe a calculus designed in 1984/1985. This calculus was obtained by applying the ideas introduced by N.G. de Bruijn for AUTOMATH to some functional systems created by J.-Y. Girard. There were also strong connections with the work of P. Martin-Löf. This calculus provided quite simple and uniform notations for proofs and (functional) programs. Because of this simplicity and uniformity, it was possible to use it for analysing logical problems such as impredicativity and paradoxes, but also notions of computer science such as parametricity. It could also be used as the basis of implementations of proof and functional systems [29], arguably simpler than, or at least competitive with, the ones that were available at the time. One main theme of this work is the importance of notations in mathematics and computer science: new questions were asked and solved only because of the use of the AUTOMATH notation, itself a variation of the \(\lambda \)-notation introduced by A. Church.
Thierry Coquand
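To give a flavour of that uniformity (an elementary textbook example, not taken from the chapter): in such a calculus the polymorphic identity is a single term,

\[ \lambda A{:}\ast.\ \lambda x{:}A.\ x \;:\; \Pi A{:}\ast.\ A \rightarrow A , \]

which can be read either as a program (a function that takes a type \(A\) and returns the identity function on \(A\)) or as a proof of the proposition \(\forall A.\ A \Rightarrow A\); the same \(\lambda\)- and \(\Pi\)-notations serve for both readings.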

Theory

Frontmatter
Chapter 9. A Personal Historical Perspective on Abstract Interpretation
Abstract
Following a historical perspective, and restricting attention to my work with Radhia Cousot, I discuss the origin and evolution of concepts in abstract interpretation applied to semantics, verification, static and dynamic analysis, and algorithm design.
Abstract interpretation is a unifying theory of formal methods that proposes a general methodology for proving the correctness of computing systems by sound (and sometimes complete) approximation of their semantics.
Patrick Cousot
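As a toy illustration of sound approximation (a sketch of my own, not taken from the chapter; the sign domain and the OCaml rendering are assumptions), the following abstract interpreter evaluates arithmetic expressions over signs instead of integers, soundly over-approximating the sign of the concrete result:

  (* Abstract interpretation over the classic sign domain. *)
  type sign = Bot | Neg | Zero | Pos | Top

  type expr = Const of int | Add of expr * expr | Mul of expr * expr

  (* Abstraction of a concrete integer. *)
  let alpha n = if n < 0 then Neg else if n = 0 then Zero else Pos

  (* Sound abstract addition: over-approximates every concrete sum. *)
  let add_sign a b = match a, b with
    | Bot, _ | _, Bot -> Bot
    | Zero, x | x, Zero -> x
    | Pos, Pos -> Pos
    | Neg, Neg -> Neg
    | _ -> Top                       (* e.g. Pos + Neg: sign unknown *)

  (* Sound abstract multiplication. *)
  let mul_sign a b = match a, b with
    | Bot, _ | _, Bot -> Bot
    | Zero, _ | _, Zero -> Zero
    | Pos, Pos | Neg, Neg -> Pos
    | Pos, Neg | Neg, Pos -> Neg
    | _ -> Top

  (* The abstract semantics: a sound approximation of evaluation. *)
  let rec abstract_eval = function
    | Const n -> alpha n
    | Add (e1, e2) -> add_sign (abstract_eval e1) (abstract_eval e2)
    | Mul (e1, e2) -> mul_sign (abstract_eval e1) (abstract_eval e2)

For instance, abstract_eval (Mul (Const (-2), Const 3)) returns Neg without ever computing -6, while Top records a sound loss of precision when the operand signs conflict.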
Chapter 10. Tracking Redexes in the Lambda Calculus
Abstract
Residuals of redexes keep track of redexes along reductions in the lambda calculus. Families of redexes keep track of redexes created along these reductions. In this chapter, we review these notions and their relation to a labeled \(\lambda \)-calculus introduced here in a systematic way. These properties may be extended to combinatory logic, term rewriting systems, process calculi and proofnets of linear logic.
Jean-Jacques Lévy
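A classic example (not taken from the chapter) of what residuals track: the term \((\lambda x.\, x\, x)\, ((\lambda y.\, y)\, z)\) contains two redexes. Contracting the outer one duplicates the argument,

\[ (\lambda x.\, x\, x)\, ((\lambda y.\, y)\, z) \;\rightarrow\; ((\lambda y.\, y)\, z)\, ((\lambda y.\, y)\, z) , \]

so the inner redex has two residuals in the result, whereas contracting the inner redex first leaves a single residual, \((\lambda x.\, x\, x)\, z\), of the outer one. The labeled \(\lambda \)-calculus makes this bookkeeping systematic and, in particular, places the two copies in the same redex family.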
Chapter 11. Confluence of Terminating Rewriting Computations
Abstract
Rewriting is an intensional model of computation which is inherently non-deterministic. Defining functions by rewriting requires proving that the result of a given computation is unique for every input, a property called confluence. This chapter describes confluence criteria for first-order as well as higher-order terminating rewriting computations described by means of a set R of rewrite rules and a set E of equations such as associativity and commutativity. These criteria aim at reducing the confluence property of \((R,E)\) to the joinability of its critical pairs, obtained by overlapping left-hand sides of rules at subterms, which is the only source of non-trivial non-determinism. Joinability is the property that the normal forms of a pair of expressions resulting from an overlap are equal modulo E. Two properties are therefore needed: termination of rewriting modulo E, and existence of finitely many most general unifiers modulo E for computing the overlaps. We show that the first-order and higher-order cases yield the same results, obtained by following a uniform approach. In the higher-order case, an abstract axiomatization of types is provided so as to capture, with a single result, typed computations based on different type systems.
Jean-Pierre Jouannaud
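A standard two-rule example (not taken from the chapter) shows the role critical pairs play. Take \(R = \{\, f(g(x)) \rightarrow x,\;\; g(a) \rightarrow b \,\}\) with an empty set of equations \(E\); the system terminates, since every step shrinks the term. The two left-hand sides overlap in \(f(g(a))\): the first rule rewrites it to \(a\), while the second, applied at the subterm \(g(a)\), rewrites it to \(f(b)\). Both results are normal forms and differ, so the critical pair \(\langle a, f(b)\rangle\) is not joinable and \(R\) is not confluent. Adding the rule \(f(b) \rightarrow a\) joins the pair and creates no new critical pairs, so the extended, still terminating system is confluent.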

Language Design and Programming Methodology

Frontmatter
Chapter 12. Programming with Union, Intersection, and Negation Types
Abstract
In this essay I present the advantages and, I dare say, the beauty of programming in a language with set-theoretic types, that is, types that include union, intersection, and negation type connectives. I show by several examples how set-theoretic types are necessary to type some common programming patterns, but also how they play a key role in typing several language constructs—from branching and pattern matching to function overloading and type-cases—very precisely.
I start by presenting the theory of types known as semantic subtyping and extend it to include polymorphic types. Next, I discuss the design of languages that use these types. I start by defining a theoretical framework that covers all the examples given in the first part of the presentation. Since the system of the framework cannot be effectively implemented, I then describe three effective restrictions of this system: \((i)\) a polymorphic language with explicitly-typed functions, \((ii)\) an implicitly typed polymorphic language à la Hindley-Milner, and \((iii)\) a monomorphic language that, by implementing classic union-elimination, precisely reconstructs intersection types for functions and implements a very general form of occurrence typing.
I conclude the presentation with a short overview of other aspects of these languages, such as pattern matching, gradual typing, and denotational semantics.
Giuseppe Castagna
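Two small illustrations, in the spirit of the chapter rather than taken from it: a function that increments integers and negates Booleans can be given the intersection type \((\mathrm{Int} \rightarrow \mathrm{Int}) \wedge (\mathrm{Bool} \rightarrow \mathrm{Bool})\), which is strictly more informative than \((\mathrm{Int} \vee \mathrm{Bool}) \rightarrow (\mathrm{Int} \vee \mathrm{Bool})\), since applying it to an integer is then known to return an integer. Negation types, in turn, account for occurrence typing: if \(x\) has type \(\mathrm{Int} \vee \mathrm{String}\) and a type-case establishes that \(x\) is not an integer, then in that branch \(x\) can be given the type \((\mathrm{Int} \vee \mathrm{String}) \wedge \neg\mathrm{Int}\), which is equivalent to \(\mathrm{String}\).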
Chapter 13. Right and Wrong: Ten Choices in Language Design
Abstract
A description of language design choices that have a profound effect on software quality, a criticism of how ordinary OO languages have addressed them, and an explanation of the thinking behind Eiffel’s corresponding mechanisms.
Bertrand Meyer
Metadata
Title
The French School of Programming
Editor
Bertrand Meyer
Copyright Year
2024
Electronic ISBN
978-3-031-34518-0
Print ISBN
978-3-031-34517-3
DOI
https://doi.org/10.1007/978-3-031-34518-0
