
About this Book

The study of random graphs was begun by Paul Erdős and Alfréd Rényi in the 1960s and now has a comprehensive literature. A compelling element has been the threshold function, a short range in which events rapidly move from almost certainly false to almost certainly true. This book now joins the study of random graphs (and other random discrete objects) with mathematical logic. The possible threshold phenomena are studied for all statements expressible in a given language. Often there is a zero-one law: every statement holds with probability near zero or near one. The methodologies involve probability, discrete structures and logic, with an emphasis on discrete structures.
The book will be of interest to graduate students and researchers in discrete mathematics.

Table of Contents

Frontmatter

Beginnings

Frontmatter

0. Two Starting Examples

Abstract
The core of our work is the study of Zero-One Laws for sparse random graphs. We may think of this study as having two sources. The first is the Zero-One Law for random graphs with constant edge probability, as given in Section 0.1. The second is the notion of the evolution of the random graph as discussed in Section 1.1.1. In that evolution it is central that the edge probability p be taken not just as a constant but as a function p = p(n) of the total number of vertices. In Section 0.2 we examine such an evolution in the much easier case of a random unary predicate. To allow an easy introduction we avoid a plethora of notation in this chapter; the technical preliminaries — including many key definitions — are left for Chapter 1 and beyond.
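As a concrete illustration of taking the edge probability as a function p = p(n) (a sketch of our own, not an example from the text; the names used are hypothetical), one can sample G(n, p) near p ∼ 1/n and watch a simple first order event, the existence of a triangle, move from rare to common:

# Illustrative sketch: estimate Pr[G(n, p) contains a triangle]
# for a few choices of p = p(n) around p ~ 1/n.
import itertools
import random

def random_graph(n, p):
    """Edge set of G(n, p) on vertices 0, ..., n-1; each pair joined independently."""
    return {(i, j) for i, j in itertools.combinations(range(n), 2)
            if random.random() < p}

def has_triangle(n, edges):
    """The first order event: some three vertices are mutually adjacent."""
    return any({(a, b), (a, c), (b, c)} <= edges
               for a, b, c in itertools.combinations(range(n), 3))

n = 60
for c in (0.5, 1.0, 2.0):
    hits = sum(has_triangle(n, random_graph(n, c / n)) for _ in range(100))
    print(f"p = {c}/n: triangle found in {hits}/100 samples")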
Joel Spencer

1. Preliminaries

Abstract
First, what is a graph? We’ll be looking at graphs on vertex set V = {1, ..., n}. For a combinatorialist, our graphs will be undirected, with neither loops nor multiple edges. For a logician, our graph is a set V together with a symmetric irreflexive binary relation. We express the relation — the notion that i, j are joined by an edge — by writing i ∼ j. The terms “i, j are adjacent”, “i, j are joined by an edge” and i ∼ j all have the same meaning.
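A minimal sketch of this object as a data structure (our own illustration, not the book's): vertices 1, ..., n carrying a symmetric, irreflexive adjacency relation, so that i ∼ j and j ∼ i coincide and i ∼ i never holds.

# Sketch: a graph on V = {1, ..., n} as a symmetric irreflexive binary relation.
class Graph:
    def __init__(self, n):
        self.vertices = range(1, n + 1)
        self.adj = {v: set() for v in self.vertices}

    def join(self, i, j):
        """Declare i ~ j (and hence j ~ i); loops are not allowed."""
        if i == j:
            raise ValueError("irreflexive: no loops")
        self.adj[i].add(j)
        self.adj[j].add(i)

    def adjacent(self, i, j):
        """i ~ j, i.e. i and j are joined by an edge."""
        return j in self.adj[i]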
Joel Spencer

2. The Ehrenfeucht Game

Abstract
The Ehrenfeucht Game is a powerful tool for logicians, but it can be an even more powerful tool for those with little knowledge of logic. The Game itself is defined without reference to logic, and the analysis of the game is combinatorial. Some basic bridging theorems often allow one to go from an analysis of the Game to results about first order statements. Ehrenfeucht’s original paper [6] is a true classic.
Joel Spencer

Random Graphs

Frontmatter

3. Very Sparse Graphs

Abstract
We hold to the view proposed in the original papers of Erdős and Rényi that the random graph G(n, p) evolves as p increases from empty to full. In its early stages — much like natural evolution — the behaviors are relatively simple to describe. For the random graph, the early stages mean up to p ∼ 1/n. As we are viewing the random graph through only a first order lens we shall actually go a bit further in this section. We summarize the results of Sections 3.1–3.5 with Theorem 3.0.8.
Joel Spencer

4. The Combinatorics of Rooted Graphs

Abstract
Formally, a rooted graph is simply a graph H with a designated subset R of vertices, called the roots. We allow the degenerate case R = Ø but not R = V(H). The rooted graph is denoted by the pair (R, H). For our purposes rooted graphs are a convenient notation for discussing the extension statements Ext(R, H) defined in Section 1.3, and our definitions are meant to reflect the almost-always character of random graphs. But we also feel that rooted graphs form a fascinating combinatorial structure of interest in its own right.
Joel Spencer

5. The Janson Inequality

Abstract
The Janson Inequality is a relatively recent result that has proven to be very useful in the study of random graphs. We begin with a general (though hardly the most general) formulation. An important and instructive example of its application is given after the statement of the Inequality.
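For orientation, a commonly quoted form of the inequality (which may differ in detail from the formulation given as Inequality 5.0.4): if X counts how many of the events A_1, ..., A_m hold, μ = Σ_i Pr[A_i], and Δ = Σ_{i∼j} Pr[A_i ∧ A_j] with the sum over ordered pairs of distinct, mutually dependent events, then
$$\Pr[X = 0] \le e^{-\mu + \Delta/2}$$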
Joel Spencer

6. The Main Theorem

Abstract
We fix an irrational α ∈ (0, 1) throughout this chapter. All probabilities are with respect to the random graph G(n, p) with p = n^{-α}. We recall the statement of our goal, the Main Theorem 1.4.1: For any first order A
$$\lim_{n \to \infty} \Pr\left[ G(n, n^{-\alpha}) \models A \right] = 0 \text{ or } 1$$
Joel Spencer

7. Countable Models

Abstract
Again we fix an irrational α between zero and one. Since the random graphs G(n, p) with p = n^{-α} satisfy the Zero-One Law of our Main Theorem 1.4.1, the set T_α of sentences that hold almost always forms a complete theory. Here we examine an axiomatization of T_α and its countable models.
Joel Spencer

8. Near Rational Powers of n

Abstract
Those with experience in random graphs should not be surprised that the Zero-One Law does not hold when p = n^{-1/3}. Take a strictly balanced H with e edges, v vertices and v/e = 1/3 — more specifically, take H = K_7. The number X of copies of K_7 has E[X] = (n choose 7) p^21 → 1/7!. From Janson’s Inequality 5.0.4, or even the original methods of Erdős and Rényi, Pr[X = 0] → e^{-1/7!}. Here X = 0 is the first order ¬∃K_7 and e^{-1/7!} is certainly neither zero nor one. One might expect (as, indeed, this author once conjectured) that for any first order A, lim_{n→∞} Pr[G(n, n^{-1/3}) ⊨ A] would exist, and one might even hope for a description of the limiting probabilities as in Section 3.6. We shall see in this chapter that very much the opposite is true: there are first order A for which Pr[A] behaves very strangely at and near p = n^{-1/3}.
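The arithmetic behind the limit, spelled out for convenience: with p = n^{-1/3} each potential copy of K_7 (21 edges on 7 vertices) appears with probability p^21 = n^{-7}, and there are (n choose 7) ~ n^7/7! potential copies, so
$$\mathrm{E}[X] = \binom{n}{7} p^{21} = \binom{n}{7} n^{-7} \to \frac{1}{7!}$$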
Joel Spencer

Extras

Frontmatter

9. A Dynamic View

Abstract
Thus far we have employed what might be considered a static view of random graphs. We fix an edge probability p = p(n) and look at the behavior of all first order sentences A in G ∼ G(n, p). Erdős and Rényi, as discussed in Section 1.1.1, always thought of the random graph as evolving from empty to full. Now we shall fix an arbitrary first order sentence A and consider what happens to Pr[A] as the random graph evolves.
Joel Spencer

10. Strings

Abstract
Here we look at strings over a finite alphabet, such as abbacab or bbaacccb. Our sentences can use words like before, after, next, first, last and would include such possibilities as “Every a is immediately followed by a b,” or “Between every two bs lies a c.”
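To make the two sample sentences concrete (a sketch of ours, not from the text), here is how one might evaluate them on particular strings:

# Sketch: evaluating the two example sentences on concrete strings.
def every_a_followed_by_b(s):
    """Every a is immediately followed by a b."""
    return all(s[i + 1:i + 2] == "b" for i in range(len(s)) if s[i] == "a")

def c_between_every_two_bs(s):
    """Between every two bs lies a c (checking consecutive bs suffices)."""
    bs = [i for i, ch in enumerate(s) if ch == "b"]
    return all("c" in s[i + 1:j] for i, j in zip(bs, bs[1:]))

print(every_a_followed_by_b("abbacab"))    # False: the a before the c is followed by c
print(c_between_every_two_bs("bbaacccb"))  # False: the first two bs are adjacent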
Joel Spencer

11. Stronger Logics

Abstract
What happens if we increase the power of the first order language of graphs? The examples of this chapter show that, in many cases, the Zero-One Laws no longer hold. We shall only consider the basic case p = ½. Even there, relatively simple expansions of the language allow for sentences with no limiting probability.
Joel Spencer

12. Three Final Examples

Abstract
We close with three quite different random structures. While the structures are new, the techniques we have already developed allow a rather complete analysis of their first order worlds. We wish our readers equal success.
Joel Spencer

Backmatter

Further Information