
## About this Book

A basic issue in computer science is the complexity of problems. Computational complexity measures how much time or memory is needed as a function of the size of the input problem. Descriptive complexity is concerned with problems that may be described in first-order logic. By virtue of the close relationship between logic and relational databases, this subject has important applications to databases, such as analysing the queries computable in polynomial time, the parallel time needed to compute a query, and nondeterministic complexity classes. This book is written as a graduate text and so aims to provide a reasonably self-contained introduction to the subject. The author has provided numerous examples and exercises to further illustrate the ideas presented.

## Table of Contents

### Introduction

Abstract
In the beginning, there were two measures of computational complexity: time and space. From an engineering standpoint, these were very natural measures, quantifying the amount of physical resources needed to perform a computation. From a mathematical viewpoint, time and space were somewhat less satisfying, since neither appeared to be tied to the inherent mathematical complexity of the computational problem.
Neil Immerman

### 1. Background in Logic

Abstract
Mathematics enables us to model many things abstractly. Group theory, for example, abstracts features of activities as diverse as English change-ringing and quantum mechanics. Mathematical logic carries the abstraction one level higher: it is a mathematical model of mathematics. This book shows that the computational complexity of all problems in computer science can be understood via the complexity of their logical descriptions. We begin with a high-level introduction to logic. Although much of the material is well known, we urge readers to at least skim this background chapter, as the concentration on finite and ordered structures, i.e., relational databases, is not standard in most treatments of logic.
Neil Immerman

### 2. Background in Complexity

Abstract
Computational Complexity measures the amount of computational resources, such as time and space, that are needed, as a function of the size of the input, to compute a query. This chapter introduces the reader to complexity theory. We define the complexity measures and complexity classes that we study in the rest of the book. We also explain some of their basic properties, complete problems, and interrelationships.
Neil Immerman

### 3. First-Order Reductions

Abstract
First-order reductions are simple translations that have very little computational power of their own. Thus it is surprising that the vast majority of natural complete problems remain complete via first-order reductions. We provide examples of such complete problems for many complexity classes. All the complexity classes and descriptive languages studied in this book are closed under first-order reductions. Once we express a complete problem for some complexity class C in a language L, it will follow that L contains all queries computable in C.
Neil Immerman
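
The closure argument sketched in this abstract can be written out in one line; the following uses standard, generic notation (the symbols are illustrative, not taken from the chapter):

```latex
% If \Pi is complete for the class C under first-order reductions (\leq_{fo}),
% \Pi belongs to the language L, and L is closed under \leq_{fo}, then:
\[
A \in C
\;\Rightarrow\;
A \leq_{fo} \Pi
\;\Rightarrow\;
A \in L
\qquad\text{hence } C \subseteq L .
\]
```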

### 4. Inductive Definitions

Abstract
First-order logic is not rich enough to express most interesting computations. A useful and natural way to increase the expressive power of first-order logic is to add the power to define new relations by induction. In this chapter we formalize the notion of inductive definitions via the least fixed point operator (LFP). We prove that first-order logic extended by the least fixed point operator captures exactly polynomial-time.
Neil Immerman
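
As an illustration of inductive definitions, reachability (transitive closure) is the least fixed point of the monotone operator $R \mapsto E \cup (R \circ E)$. The following is a minimal Python sketch of the fixed-point iteration on a small hypothetical edge set; the function names are our own, not from the book:

```python
def least_fixed_point(step, bottom=frozenset()):
    """Iterate a monotone operator from the empty set until it stabilizes."""
    current = bottom
    while True:
        nxt = step(current)
        if nxt == current:
            return current
        current = nxt

def transitive_closure(edges):
    """Reachability as the least fixed point of R -> E u (R o E)."""
    def step(r):
        # Keep all edges, and add (a, c) whenever a reaches b and (b, c) is an edge.
        return frozenset(edges) | frozenset(
            (a, c) for (a, b) in r for (b2, c) in edges if b == b2
        )
    return least_fixed_point(step)

# Hypothetical example graph: the path 1 -> 2 -> 3.
closure = transitive_closure({(1, 2), (2, 3)})
print(sorted(closure))  # [(1, 2), (1, 3), (2, 3)]
```

On a structure with $n$ elements the iteration stabilizes after at most polynomially many rounds, which is the intuition behind the theorem that FO(LFP) captures polynomial time on ordered structures.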

### 5. Parallelism

Abstract
Descriptive complexity is inherently parallel in nature. This is a particularly delightful dividend of applying this form of logic to computer science. The time to compute a query on a certain parallel computer corresponds exactly to the depth of a first-order induction needed to describe the query. There is also a close relationship between the amount of hardware used — memory and processors — and the number of variables in the inductive definition.
Neil Immerman
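
The correspondence described here is usually stated as an equality between parallel time on a concurrent random access machine and iterated first-order formulas; schematically, in the book's style of notation:

```latex
% Parallel time t(n) on a CRAM corresponds to t(n) iterations
% of a first-order (quantifier) block:
\[
\mathrm{CRAM}[t(n)] \;=\; \mathrm{FO}[t(n)] ,
\]
```

with the number of distinct variables in the iterated formula corresponding to the amount of hardware (processors and memory) used.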

### 6. Ehrenfeucht-Fraïssé Games

Abstract
We introduce combinatorial games that are useful for determining what can be expressed in various logics. Ehrenfeucht-Fraïssé games offer a semantics for first-order logic that is equivalent to, but more directly applicable than, the standard definitions.
Neil Immerman
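
The central equivalence behind these games can be stated compactly, in standard notation:

```latex
% A, B are structures; k is the number of rounds.
\[
\mathcal{A} \equiv_k \mathcal{B}
\iff \text{Duplicator wins the } k\text{-round game on } \mathcal{A}, \mathcal{B}
\iff \mathcal{A}, \mathcal{B} \text{ agree on all first-order sentences of quantifier rank} \le k .
\]
```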

### 7. Second-Order Logic and Fagin’s Theorem

Abstract
Second-order logic consists of first-order logic plus the power to quantify over relations on the universe. We prove Fagin’s theorem which says that the queries computable in NP are exactly the second-order existential queries. A corollary due to Stockmeyer says that the second-order queries are exactly those computable in the polynomial-time hierarchy.
Neil Immerman
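
Fagin's theorem (NP equals existential second-order logic) is typically illustrated by graph 3-colorability, which is in NP and is defined by a single second-order existential sentence:

```latex
\[
\exists R\,\exists G\,\exists B\;\Bigl[
\forall x\,\bigl(R(x) \lor G(x) \lor B(x)\bigr)
\;\land\;
\forall x\,\forall y\,\bigl(E(x,y) \rightarrow
\neg\bigl((R(x)\!\land\!R(y)) \lor (G(x)\!\land\!G(y)) \lor (B(x)\!\land\!B(y))\bigr)\bigr)
\Bigr]
\]
```

The quantified relations $R$, $G$, $B$ guess a coloring, and the first-order part checks it in polynomial time.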

### 8. Second-Order Lower Bounds

Abstract
Although second-order logic is very powerful, when we restrict second-order quantifiers to monadic relations, we can prove some strong lower bounds using Ehrenfeucht-Fraïssé games.
Neil Immerman

### 9. Complementation and Transitive Closure

Abstract
Over infinite structures, there is a strict hierarchy of languages obtained by alternating uses of the least fixed point operator and negation. For finite structures, we show that this hierarchy collapses to its first level.
Neil Immerman
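
The collapse is often summarized as a normal-form statement: over finite structures, negations can be pushed inside, so a single application of the fixed point operator suffices. Schematically:

```latex
% Over finite structures, every FO(LFP) formula is equivalent to one of the form
\[
\exists \bar{z}\;\bigl(\mathrm{LFP}_{\bar{x},R}\,\psi\bigr)(\bar{z}),
\qquad \psi \text{ first-order},
\]
% so alternating LFP with negation adds no expressive power.
```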

### 10. Polynomial Space

Abstract
Polynomial Space is the largest complexity class studied in this book. It consists of what we could compute with a feasible amount of hardware, but with no time limit. PSPACE has several apparently quite different descriptive characterizations.
Neil Immerman
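
Two of these characterizations can be stated schematically; over ordered finite structures,

```latex
\[
\mathrm{PSPACE} \;=\; \mathrm{FO(PFP)} \;=\; \mathrm{FO}\bigl[2^{n^{O(1)}}\bigr],
\]
```

where PFP is the partial fixed point operator, which iterates a (not necessarily monotone) first-order formula until it converges, if ever.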

### 11. Uniformity and Precomputation

Abstract
The first two dimensions of complexity are parallel time and hardware. These can be measured by the descriptive resources of quantifier depth and number of variables, respectively. The third dimension is precomputation: the work that initially goes into designing the program, formula, or circuit. Precomputation is less well understood than time and hardware.
Neil Immerman

### 12. The Role of Ordering

Abstract
To characterize complexity classes, first-order descriptive complexity requires that each structure has an ordering on its universe. Such an ordering, however, is irrelevant to the properties of graphs or databases that we want to compute. It is easy to use pebble games to prove lower bounds on languages without ordering; on ordered structures these arguments do not work. Furthermore, for databases, the ordering is a low-level implementation issue — how entries are stored in memory or disk — which should be invisible to the person writing queries. For all these reasons, the descriptive characterization of the order-independent queries computable in a given complexity class is a fundamental open problem.
Neil Immerman

### 13. Lower Bounds

Abstract
The very simple problem PARITY is too hard for first-order logic, no matter what numeric predicates we add. When we add counting, but remove ordering, PARITY is expressible. However, a different sort of parity problem becomes inexpressible. A related lower bound suggests that complete problems for P are inherently sequential.
Neil Immerman
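
The first lower bound mentioned here is the classic AC⁰ result: no matter which numeric predicates are added, first-order logic cannot count modulo 2. In standard notation:

```latex
\[
\mathrm{PARITY} \;=\; \{\, w \in \{0,1\}^{*} : w \text{ contains an odd number of 1s} \,\}
\;\notin\; \mathrm{FO}[\mathrm{Arb}],
\]
```

where FO[Arb] denotes first-order logic with arbitrary numeric predicates, which corresponds to the circuit class AC⁰.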

### 14. Applications

Abstract
The largest application of descriptive complexity is to the theory of databases. A relational database is a finite logical structure. This chapter begins with a discussion of the expressibility of relational queries. Other applications covered are Dynamic Complexity and Computer Aided Verification.
Neil Immerman

### 15. Conclusions and Future Directions

Abstract
In these last few pages we give our personal view of what descriptive complexity has achieved to date. We also underline some of the current lines of research that seem especially promising. Finally, we outline some possible applications and future directions that, although challenging, we consider especially worthwhile.
Neil Immerman
