## About this Book

This comprehensive and self-contained textbook presents an accessible overview of the state of the art of multivariate algorithmics and complexity. Increasingly, multivariate algorithmics is having significant practical impact in many application domains, with even more developments on the horizon. The text describes how the multivariate framework allows an extended dialog with a problem, enabling the reader who masters the complexity issues under discussion to use the positive and negative toolkits in their own research. Features: describes many of the standard algorithmic techniques available for establishing parametric tractability; reviews the classical hardness classes; explores the various limitations and relaxations of the methods; showcases the powerful new lower bound techniques; examines various different algorithmic solutions to the same problems, highlighting the insights to be gained from each approach; demonstrates how complexity methods and ideas have evolved over the past 25 years.

## Table of Contents

### 1. Preliminaries

We give basic results and definitions in classical complexity theory. These include NP- and PSpace-completeness, concrete Karp and Cook reductions, the Cook–Levin Theorem, and the Stockmeyer–Meyer Theorem. We introduce some ideas, such as advice classes and randomized reductions, that will be of use later in the book.

Rodney G. Downey, Michael R. Fellows

### 2. The Basic Definitions

The fundamental notions of parameterized complexity are introduced.

Rodney G. Downey, Michael R. Fellows

### 3. Bounded Search Trees

The fundamental method of bounded search trees is introduced. This is essential for all investigations in parameterized complexity.
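The flavor of the method can be previewed in a few lines. The following sketch (ours, not taken from the book) applies the classic branching rule for k-Vertex Cover: pick any uncovered edge (u, v); every cover must contain u or v, so the search tree has depth at most k and at most 2^k leaves.

```python
def vertex_cover(edges, k):
    """Bounded search tree for k-Vertex Cover (illustrative sketch).

    edges: list of (u, v) pairs. Returns a cover of size <= k as a
    set, or None if no such cover exists.
    """
    if not edges:
        return set()          # nothing left to cover
    if k == 0:
        return None           # edges remain but the budget is spent
    u, v = edges[0]
    # Branch: any vertex cover must contain u or v.
    for w in (u, v):
        rest = [e for e in edges if w not in e]   # edges w covers vanish
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {w}
    return None
```

On a triangle, for example, the routine finds a cover of size 2 and correctly reports that no cover of size 1 exists.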

Rodney G. Downey, Michael R. Fellows

### 4. Kernelization

Kernelization is a pre-processing technique, which takes a large problem and shrinks it to a smaller one that has size depending only on the parameter. This method is another fundamental technique in parameterized complexity and we introduce it in this chapter.
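As a concrete illustration (ours, not the book's), the classical Buss kernelization for k-Vertex Cover repeatedly takes any vertex of degree greater than k into the cover; once no such vertex remains, a yes-instance can retain at most k² edges, so any larger residual instance is rejected outright.

```python
def buss_kernel(edges, k):
    """One exhaustive pass of the Buss high-degree rule (sketch).

    Returns (reduced_edges, remaining_k, forced_vertices), or None if
    the instance is provably a no-instance.
    """
    edges = set(map(frozenset, edges))
    forced = set()
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                # v must be in every cover of size <= k.
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return None   # kernel exceeds the k^2 bound: a no-instance
    return edges, k, forced
```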

Rodney G. Downey, Michael R. Fellows

### 5. More on Kernelization

We look at further kernelization techniques and in particular Turing kernels and kernelizations based upon Crown Structures.

Rodney G. Downey, Michael R. Fellows

### 6. Iterative Compression, and Measure and Conquer, for Minimization Problems

We introduce the technique of iterative compression. We illustrate how this can be combined with analytical techniques such as measure and conquer.
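A minimal sketch of the iteration, using k-Vertex Cover as the standard example (our illustration, not the book's text): grow the graph one vertex at a time, maintain a cover of size at most k, and whenever adding a vertex pushes the cover to size k + 1, run a compression step that guesses which part of the old cover to keep.

```python
from itertools import combinations

def compress(edges, cover, k):
    """Shrink a cover of size k+1 to size <= k, or return None.

    Guess the kept part of the old cover; the discarded part must be
    independent, and every edge leaving it is covered from outside.
    """
    cover = set(cover)
    for r in range(len(cover) + 1):
        for keep in combinations(sorted(cover), r):
            keep, new = set(keep), set(keep)
            out = cover - keep
            ok = True
            for u, v in edges:
                if u in keep or v in keep:
                    continue
                if u in out and v in out:
                    ok = False        # edge inside the discarded part
                    break
                new.add(v if u in out else u)   # take the other endpoint
            if ok and len(new) <= k:
                return new
    return None

def vc_iterative_compression(vertices, edges, k):
    """Iterative-compression driver for k-Vertex Cover (sketch)."""
    cur, cover = [], set()
    for v in vertices:
        cur.append(v)
        sub = [e for e in edges if e[0] in cur and e[1] in cur]
        cover = cover | {v}           # size <= k+1 cover of the new graph
        if len(cover) > k:
            cover = compress(sub, cover, k)
            if cover is None:
                return None
    return cover
```

The compression step dominates the running time; measure-and-conquer-style analyses sharpen exactly such branching/guessing bounds.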

Rodney G. Downey, Michael R. Fellows

### 7. Further Elementary Techniques

We explore a number of elementary techniques including the extremal method, greedy localization and bounded variable integer programming.

Rodney G. Downey, Michael R. Fellows

### 8. Color Coding, Multilinear Detection, and Randomized Divide and Conquer

We introduce the important technique of color coding. We also show how this relates to randomized divide and conquer. Finally, we will show how to tighten the running times for randomized algorithms with Koutis’ (Proceedings of 35th International Colloquium on Automata, Languages and Programming, Part I, pp. 575–586, 2008) idea of using general algebraic methods to speed up algorithms obtained by color coding.
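A hedged sketch of the basic (pre-algebraic) color-coding routine for detecting a path on k vertices; the function name is ours. Each trial colors the vertices uniformly with k colors and then looks for a "colorful" path (all colors distinct) by dynamic programming over color subsets; a colorful path is automatically simple, and a fixed k-path becomes colorful with probability k!/kᵏ, so repeated trials find one with high probability.

```python
import random

def has_k_path(adj, k, trials=300):
    """Randomized color coding for k-Path; one-sided error only.

    adj: dict mapping each vertex to a list of neighbors.
    Returns True if a colorful k-vertex path is found in some trial.
    """
    vertices = list(adj)
    for _ in range(trials):
        color = {v: random.randrange(k) for v in vertices}
        # reach[v]: color sets of colorful paths of the current length
        # ending at v; all sets in round i have exactly i+1 colors.
        reach = {v: {frozenset([color[v]])} for v in vertices}
        for _ in range(k - 1):
            new = {}
            for v in vertices:
                sets_v = set()
                for u in adj[v]:
                    for s in reach[u]:
                        if color[v] not in s:   # extend only if colorful
                            sets_v.add(s | {color[v]})
                new[v] = sets_v
            reach = new
        if any(reach[v] for v in vertices):
            return True
    return False
```

The 2ᵏ·poly(n) subset DP is what the algebraic (multilinear detection) methods subsequently accelerate.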

Rodney G. Downey, Michael R. Fellows

### 9. Optimization Problems, Approximation Schemes, and Their Relation to FPT

We begin to analyze the relationship between classical optimization problems and parameterized complexity. We explore syntactically defined classes of optimization problems that capture some of the main issues.

Rodney G. Downey, Michael R. Fellows

### 10. Treewidth and Dynamic Programming

A central concept in modern algorithm design uses the metaphor of treewidth, both in its original form for graphs, and its extensions to other areas. This chapter is devoted to developing the basic theory of treewidth, and fundamental aspects of producing treewidth algorithms by running dynamic programming on graphs.
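The dynamic-programming pattern can be previewed in its simplest setting, trees (treewidth 1); this sketch is ours, not the book's. Maximum Independent Set on a tree keeps two states per vertex, mirroring how the general algorithm keeps one state per subset of each bag of a tree decomposition.

```python
def max_independent_set(tree, root):
    """Bottom-up DP for Maximum Independent Set on a tree (sketch).

    tree: dict mapping each vertex to its neighbor list; root: any vertex.
    incl[v] = best IS size in v's subtree containing v;
    excl[v] = best IS size in v's subtree avoiding v.
    """
    incl, excl, order, parent = {}, {}, [], {root: None}
    stack = [root]
    while stack:                       # iterative DFS: parents precede children
        v = stack.pop()
        order.append(v)
        for u in tree[v]:
            if u != parent[v]:
                parent[u] = v
                stack.append(u)
    for v in reversed(order):          # children are processed first
        incl[v], excl[v] = 1, 0
        for u in tree[v]:
            if u != parent[v]:
                incl[v] += excl[u]                 # v taken: children out
                excl[v] += max(incl[u], excl[u])   # v free: best of both
    return max(incl[root], excl[root])
```

For general treewidth w, the same recurrence runs over the bags of a tree decomposition with up to 2^(w+1) states per bag instead of two.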

Rodney G. Downey, Michael R. Fellows

### 11. Heuristics for Treewidth

We look at heuristics for computing optimal or near-optimal tree decompositions.

Rodney G. Downey, Michael R. Fellows

### 12. Methods via Automata and Bounded Treewidth

This chapter explores the basics of finite-state automata theory, including the generalization of the main theorems of linear automata theory to tree automata, and how these tools lead to FPT algorithmic methods, meta-theorems, and important concrete applications.

Rodney G. Downey, Michael R. Fellows

### 13. Courcelle’s Theorem

We examine monadic second-order logic MS₂ for graphs. We give a “Myhill–Nerode” proof of Courcelle’s Theorem: that MS₂ model checking, parameterized by the treewidth of the input graph and the size of the MS₂ formula, is linear-time fixed-parameter tractable. Seese’s Theorem and MS₁ logic are also surveyed.

Rodney G. Downey, Michael R. Fellows

### 14. More on Width-Metrics: Applications and Local Treewidth

Applications of the ideas of width metrics, and associated FPT algorithmic strategies, are discussed in the context of logic and databases, matroids, knot polynomials, and other settings. The notion of local treewidth is introduced and the analogue of Courcelle’s Theorem for first order formulas and local treewidth is proved.

Rodney G. Downey, Michael R. Fellows

### 15. Depth-First Search and the Plehn–Voigt Theorem

In this chapter, we explore two methods for proving fixed-parameter tractability results based on the basic graph-theoretic algorithmic technique of depth-first search. These methods also build upon and extend the bounded-treewidth ideas of Chap. 12.

Rodney G. Downey, Michael R. Fellows

### 16. Other Width Metrics

Treewidth is the best studied and perhaps the most important of a large number of width metrics in graphs. In this chapter we will explore some other metrics which have proven useful.

Rodney G. Downey, Michael R. Fellows

### 17. Well-Quasi-Orderings and the Robertson–Seymour Theorems

As we will see, well-quasi-orderings (WQOs) provide a powerful engine for demonstrating that classes of problems are FPT. In the next section, we will look at the rudiments of the theory of WQOs, and in subsequent sections, we will examine applications to combinatorial problems.

Rodney G. Downey, Michael R. Fellows

### 18. The Graph Minor Theorem

We give a broad sketch of the ideas of the Graph Minor Theorem. We discuss excluding a forest and more general consequences of excluding a graph as a minor. We prove the Thomas Lemma on the existence of lean and linked decompositions. We discuss the related immersion and topological orderings.

Rodney G. Downey, Michael R. Fellows

### 19. Applications of the Obstruction Principle and WQOs

We introduce methods to apply the Graph Minor Theorem and demonstrate theoretical tractability.

Rodney G. Downey, Michael R. Fellows

### 20. Reductions

We introduce the basic premises of parameterized reductions. They are fundamental to all hardness theory in this area, and essential for all of the rest of the book.

Rodney G. Downey, Michael R. Fellows

### 21. The Basic Class W[1] and an Analog of Cook’s Theorem

We introduce the basic class W[1] and prove the fundamental analogue of the Cook–Levin Theorem. This is central to much of the hardness theory for parameterized complexity.

Rodney G. Downey, Michael R. Fellows

### 22. Some Other W[1] Hardness Results

In this chapter we will analyze a couple of further W[1]-hardness and completeness results. These examples are taken from logic and formal language theory. We also look at further applications of the theory to arenas such as algebra and number theory in the exercises, where a number of problems are proven to be W[1]-hard.

Rodney G. Downey, Michael R. Fellows

### 23. The W-Hierarchy

We introduce the W-hierarchy of parameterized complexity classes in the tower

$$\mathrm{FPT}=W[0] \subseteq W[1] \subseteq W[2] \subseteq\cdots \subseteq W[t]\subseteq \cdots\subseteq W[\mathrm{SAT}]\subseteq \cdots\subseteq W[\mathrm{P}].$$

Presumably, all the inclusions are proper. The W-hierarchy gives us means of comparing parameterized problems that are presumably not in FPT, with respect to the “strength” of their parameterized intractability. The key idea is to use a form of circuit depth in defining the classes. We discuss the remarkable fact that “almost all” (or at least a very large percentage) of the natural parameterized problems investigated to date are complete for one of the first three classes in this hierarchy, an empirical finding that supports the definitional framework. We also introduce a natural generalization called the W*[t]-hierarchy, about which only a little (but for exact parameterized complexity analysis, a useful little) is so far known.

Rodney G. Downey, Michael R. Fellows

### 24. The Monotone and Antimonotone Collapse Theorems: Monotone W[2t+1]=W[2t] and Antimonotone W[2t+2]=W[2t+1]

In this chapter, we will give proofs of the two structural collapses noted in Sect. 23.2.1 after Theorem 23.3.1. The techniques are similar to the proof of the Normalization Theorem.

Rodney G. Downey, Michael R. Fellows

### 25. Beyond W[t]-Hardness

We introduce the classes W[P] and variations, proving a number of concrete hardness and membership results for this level of the W-hierarchy.

Rodney G. Downey, Michael R. Fellows

### 26. Fixed Parameter Analogues of PSpace and k-Move Games

We define parameterized complexity classes that allow us to investigate the complexity of k-move games. We establish a number of concrete hardness and completeness results for games such as the k-move versions of Geography and Chess. We relate these classes to bounded space computations.

Rodney G. Downey, Michael R. Fellows

### 27. Provable Intractability: The Class XP

We look at the X-classes, and give analogues of exponential time, namely the class XP. We report on how FPT is provably (unconditionally) a proper subset of XP. We also prove an XP-completeness result.

Rodney G. Downey, Michael R. Fellows

### 28. Another Basis for the W-Hierarchy and the Tradeoff Theorem

We give a uniform circuit basis for the W-hierarchy. We introduce the syntactic classes G[t] and prove the Replacement and Tradeoff Theorems. These lead to evidence that the W-hierarchy is proper and give a different perspective on its naturality.

Rodney G. Downey, Michael R. Fellows

### 29. The M-Hierarchy, and XP-Optimality

We relate NP to parameterized complexity via the ETH and SETH, and prove results on XP-optimality. Namely, we give methods to show that, under reasonable hypotheses, current algorithms for various computational tasks are optimal up to an O-factor.

Rodney G. Downey, Michael R. Fellows

### 30. Kernelization Lower Bounds

We introduce powerful new techniques to show that FPT parameterized problems do not have polynomial-sized many:1 kernels, under standard assumptions of classical complexity theory. A new completeness program for exploring the issue for Turing kernelization is also described.

Rodney G. Downey, Michael R. Fellows

### 31. Parameterized Approximation

This chapter introduces the idea of parameterized approximation. This is a method that either gives a “no” guarantee or an approximate solution. That is, we either see a certificate that there is no solution of size k, or we find an approximate solution of size g(k). This chapter provides applications of parameterized approximation, as well as methods of demonstrating complete inapproximability, meaning that there is no approximate solution for any computable function g(k).

Rodney G. Downey, Michael R. Fellows

### 32. Parameterized Counting and Randomization

In this chapter, we will first look at parameterized counting problems as an analog of classical counting. We establish the classes #W[t], prove completeness results, and discuss related issues. We present the Flum–Grohe results on the hardness of counting k-cycles. Later we introduce a formal model for parameterized randomization. We prove an analog of the Valiant–Vazirani Theorem.

Rodney G. Downey, Michael R. Fellows

### 33. Research Horizons

In this final chapter, we describe what we think are some of the most important open problems of the field. In the first book, Downey and Fellows (Parameterized Complexity. Monographs in Computer Science, Springer, Berlin, 1999), we offered two lists of problems, 18 altogether, that we then thought significant and especially challenging. As this book goes to press, 12 of these have been resolved! Many of the solutions to these problems involved significant new ideas and advances in the field.

Rodney G. Downey, Michael R. Fellows

### 34. Appendix 1: Network Flows and Matchings

In this appendix, we develop some basic machinery which allows us to begin work in topological graph theory, an area where the emphasis is on connectivity and “shape”.

Rodney G. Downey, Michael R. Fellows

### 35. Appendix 2: Menger’s Theorems

In this appendix, we supply a basic introduction to important min/max theorems concerning graph separators.

Rodney G. Downey, Michael R. Fellows

### Backmatter
