
2001 | Book

Extremal Combinatorics

With Applications in Computer Science

Written by: Dr. Stasys Jukna

Publisher: Springer Berlin Heidelberg

Book series: Texts in Theoretical Computer Science. An EATCS Series


About this book

Combinatorial mathematics has been pursued since time immemorial, and at a reasonable scientific level at least since Leonhard Euler (1707-1783). It rendered many services to both pure and applied mathematics. Then along came the prince of computer science with its many mathematical problems and needs - and it was combinatorics that best fitted the glass slipper held out. Moreover, it has been gradually more and more realized that combinatorics has all sorts of deep connections with "mainstream areas" of mathematics, such as algebra, geometry and probability. This is why combinatorics is now a part of the standard mathematics and computer science curriculum. This book is an introduction to extremal combinatorics - a field of combinatorial mathematics which has undergone a period of spectacular growth in recent decades. The word "extremal" comes from the nature of problems this field deals with: if a collection of finite objects (numbers, graphs, vectors, sets, etc.) satisfies certain restrictions, how large or how small can it be? For example, how many people can we invite to a party where among each three people there are two who know each other and two who don't know each other? An easy Ramsey-type argument shows that at most five persons can attend such a party. Or, suppose we are given a finite set of nonzero integers, and are asked to mark as large a subset of them as possible under the restriction that the sum of any two marked integers cannot be marked.
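The party problem above can be verified by brute force for small guest counts. The following sketch (Python; the language and function names are our own, not the book's) 2-colors all pairs of guests as acquainted or strangers and searches for a coloring in which every trio contains both kinds of pairs:

```python
from itertools import combinations, product

def has_mono_triangle(n, color):
    # color maps each pair (a, b), a < b, to 0 (strangers) or 1 (acquainted)
    return any(color[(a, b)] == color[(a, c)] == color[(b, c)]
               for a, b, c in combinations(range(n), 3))

def party_possible(n):
    # Can n guests be invited so that every trio contains both an
    # acquainted pair and a pair of strangers?
    pairs = list(combinations(range(n), 2))
    return any(not has_mono_triangle(n, dict(zip(pairs, bits)))
               for bits in product([0, 1], repeat=len(pairs)))

print(party_possible(5))  # True: five guests can work (acquaintance 5-cycle)
print(party_possible(6))  # False: with six guests some trio always fails
```

This exhaustive check confirms the Ramsey-type bound mentioned in the text: five guests are possible, six are not.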

Table of contents

Frontmatter

Prolog: What This Book Is About

Many combinatorial problems have the following “extremal” formulation. Given a finite n-element set of points, the goal is to find the maximum (or minimum) possible cardinality of a system of its subsets satisfying certain assumptions. To get a feeling about what kind of problems this book deals with, we list several typical examples. (Although long, the list is far from being exhaustive.) The number(s) in brackets indicate the section(s) where the corresponding problem is discussed.

Stasys Jukna

Notation

In this section we give the notation that shall be standard throughout the book.

Stasys Jukna

The Classics

Frontmatter
1. Counting

We start with the oldest combinatorial tool — counting.

Stasys Jukna
2. Advanced Counting

When properly applied, the (double) counting argument can lead to more subtle results than those discussed in the previous chapter.

Stasys Jukna
3. The Principle of Inclusion and Exclusion

The principle of inclusion and exclusion (sieve of Eratosthenes) is a powerful tool in the theory of enumeration as well as in number theory. This principle relates the cardinality of the union of certain sets to the cardinalities of intersections of some of them, these latter cardinalities often being easier to handle.
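As an illustration (our own Python sketch, not from the book), the principle can be checked directly on small sets: the size of a union is computed from the sizes of intersections alone.

```python
from itertools import combinations
from functools import reduce

def union_size_by_ie(sets):
    # |A1 ∪ ... ∪ An| = Σ_k (-1)^(k+1) Σ_{|I| = k} |A_i1 ∩ ... ∩ A_ik|
    total = 0
    for k in range(1, len(sets) + 1):
        for group in combinations(sets, k):
            total += (-1) ** (k + 1) * len(reduce(set.intersection, group))
    return total

A = set(range(0, 30, 2))  # multiples of 2 below 30
B = set(range(0, 30, 3))  # multiples of 3 below 30
C = set(range(0, 30, 5))  # multiples of 5 below 30
print(union_size_by_ie([A, B, C]), len(A | B | C))  # 22 22
```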

Stasys Jukna
4. The Pigeonhole Principle

The pigeonhole principle (also known as Dirichlet’s principle) states the “obvious” fact that n + 1 pigeons cannot sit in n holes so that every pigeon is alone in its hole. More generally, the pigeonhole principle states the following: if a set consisting of more than kn objects is partitioned into n classes, then some class receives more than k objects.
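A standard consequence, checked in code (our own example, not from the book): among more than n integers, two must fall into the same residue class mod n, so their difference is divisible by n.

```python
def same_residue_pair(nums, n):
    # With more than n integers (pigeons) and n residue classes (holes),
    # two numbers must share a residue mod n; return such a pair.
    seen = {}
    for x in nums:
        r = x % n
        if r in seen:
            return seen[r], x
        seen[r] = x
    return None

a, b = same_residue_pair([3, 17, 25, 9, 12, 40], 5)
print(a, b, (a - b) % 5)  # the difference of the pair is divisible by 5
```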

Stasys Jukna
5. Systems of Distinct Representatives

A system of distinct representatives for a sequence of (not necessarily distinct) sets S1, S2, ..., Sm is a sequence of distinct elements x1, x2, ..., xm such that xi ∈ Si for all i = 1, 2, ..., m.
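For small families a simple backtracking search suffices to find a system of distinct representatives; the sketch below is our own illustration and makes no claim about efficiency (Hall's theorem and bipartite matching give the real theory).

```python
def find_sdr(sets):
    # Backtracking: pick a distinct representative x_i ∈ S_i for each i.
    used, choice = set(), []

    def extend(i):
        if i == len(sets):
            return True
        for x in sets[i]:
            if x not in used:
                used.add(x)
                choice.append(x)
                if extend(i + 1):
                    return True
                used.remove(x)
                choice.pop()
        return False

    return choice if extend(0) else None

print(find_sdr([{1, 2}, {2, 3}, {1, 3}]))  # a valid SDR such as [1, 2, 3]
print(find_sdr([{1}, {1}]))                # None: two sets, one element
```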

Stasys Jukna
6. Colorings

A town has several clubs, with at least two members in each. One day, two famous lecturers visit the town and give two lectures. The problem is that the lectures are held in parallel, in two different places at the same time. Every club would like each of the lectures to be attended by at least one of its members. Is it possible to arrange the attendance of the inhabitants so that every club will know the contents of both lectures? This is a typical coloring problem: we want to distribute red and blue cards among the inhabitants so that each club will have cards of both colors.
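For small towns the question can be settled by exhaustive search; the following sketch (our own, not the book's) tries every red/blue assignment and reports whether every club gets both colors. A triangle of three two-member clubs shows the answer can be negative:

```python
from itertools import product

def two_colorable(n, clubs):
    # Try every red/blue assignment to the n inhabitants; succeed if
    # some assignment gives every club members of both colors.
    return any(all(len({c[i] for i in club}) == 2 for club in clubs)
               for c in product("RB", repeat=n))

# Three inhabitants, clubs {0,1}, {1,2}, {0,2}: an odd cycle of pairs,
# so no assignment splits every club.
print(two_colorable(3, [{0, 1}, {1, 2}, {0, 2}]))  # False
print(two_colorable(3, [{0, 1}, {1, 2}]))          # True
```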

Stasys Jukna

Extremal Set Theory

Frontmatter
7. Sunflowers

One of the most beautiful results in extremal set theory is the so-called Sunflower Lemma, discovered by Erdős and Rado (1960), asserting that in a sufficiently large uniform family, some highly regular configurations, called “sunflowers,” must occur, regardless of the size of the universe. In this chapter we will consider this result as well as some of its modifications and applications.

Stasys Jukna
8. Intersecting Families

A basic interrelation between sets is their intersection. The size (or other characteristics) of mutual intersections between the members of a given family reflects some kind of “dependence” between them. In this chapter we will study the weakest kind of this dependence — the members are required to be non-disjoint. A family is intersecting if any two of its sets have a non-empty intersection.

Stasys Jukna
9. Chains and Antichains

Partially ordered sets provide a common frame for many combinatorial configurations. Formally, a partially ordered set (or poset, for short) is a set P together with a binary relation < between its elements which is transitive and irreflexive: if x < y and y < z then x < z, but x < y and y < x cannot both hold. We write x ⩽ y if x < y or x = y. Elements x and y are comparable if either x ⩽ y or y ⩽ x (or both) hold.

Stasys Jukna
10. Blocking Sets and the Duality

In this chapter we will consider one of the most basic properties of set systems — their duality.

Stasys Jukna
11. Density and Universality
Stasys Jukna
12. Witness Sets and Isolation

Given a set A of distinct 0–1 vectors and a vector u in A, how many bits of u must we know in order to distinguish it from the other vectors in A? Such a set of bits is a witness for the fact that u ∉ A - {u}. In this chapter we will give some basic estimates on the size of these witnesses. We will also consider a related problem of how to isolate an object within a given universe according to its weight. Finally, we will describe the so-called “dictator paradox,” saying that if the society fulfills some simple “democracy axioms,” then there will always be an individual (a dictator?) whose options prevail against those of all the others.

Stasys Jukna
13. Designs

The use of combinatorial objects, called designs, originates from statistical applications. Let us assume that we wish to compare v varieties of wines. In order to make the testing procedure as fair as possible it is natural to require that: (a) each participating person tastes the same number (say k) of varieties, so that each person’s opinion has the same weight; (b) each pair of varieties is compared by the same number (say λ) of persons, so that each variety gets the same treatment.

Stasys Jukna

The Linear Algebra Method

Frontmatter
14. The Basic Method

The general frame for the linear algebra method in combinatorics is the following: if we want to come up with an upper bound on the size of a set of objects, associate them with elements in a vector space V of relatively low dimension, and show that these elements are linearly independent; hence, we cannot have more objects in our set than the dimension of V.
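To make the frame concrete, here is a small self-contained sketch (our own; the book works over various fields) that bounds a family of sets by a dimension argument: sets are encoded as 0-1 vectors over GF(2), and linear independence is tested by Gaussian elimination on bitmasks.

```python
def gf2_rank(vectors):
    # Gaussian elimination over GF(2); each vector is an int bitmask.
    basis = {}  # pivot bit position -> reduced vector
    for v in vectors:
        while v:
            p = v.bit_length() - 1
            if p in basis:
                v ^= basis[p]   # eliminate the leading bit
            else:
                basis[p] = v    # new pivot found
                break
    return len(basis)

# Characteristic vectors of {0}, {1}, {2} over a 3-element universe:
# these three vectors are independent over GF(2), so any family whose
# vectors are independent can have at most dim = 3 members.
odd_sets = [0b001, 0b010, 0b100]
print(gf2_rank(odd_sets))            # 3: independent
print(gf2_rank(odd_sets + [0b111]))  # 3: the fourth vector is dependent
```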

Stasys Jukna
15. Orthogonality and Rank Arguments

Linear independence is one of the most basic concepts in linear algebra. Not less important are also the concepts of orthogonality and rank. In this chapter we consider some combinatorial applications of these two concepts.

Stasys Jukna
16. Span Programs

In 1993 Karchmer and Wigderson introduced an interesting linear algebraic model for computing boolean functions — the span program. A span program for a function f(x1, ..., xn) is presented as a matrix over some field, with rows labeled by variables xi or their negations x̄i (one variable can label many rows). The span program accepts an input assignment if and only if the all-1 vector can be obtained as a linear combination of the rows whose labels are satisfied by the input. The size of the span program is the number of rows in the matrix. A span program is monotone if only positive literals are used as labels of the rows, i.e., negated variables are not allowed.
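A toy evaluator makes the definition concrete. The sketch below is our own encoding (GF(2) as the field, rows as bitmasks, literal labels as signed integers); it tests whether the all-1 vector lies in the span of the rows whose labels the input satisfies, using the fact that a vector lies in a span iff adding it does not raise the rank.

```python
def gf2_rank(vectors):
    # XOR-basis Gaussian elimination over GF(2) (bitmask vectors).
    basis = {}
    for v in vectors:
        while v:
            p = v.bit_length() - 1
            if p in basis:
                v ^= basis[p]
            else:
                basis[p] = v
                break
    return len(basis)

def accepts(rows, labels, x, ncols):
    # A row is live if its literal is satisfied: label i > 0 stands for
    # x_i, label i < 0 for the negation of x_i (our own convention).
    live = [r for r, lit in zip(rows, labels) if x[abs(lit) - 1] == (lit > 0)]
    target = (1 << ncols) - 1  # the all-1 vector
    # target lies in the span of the live rows iff it adds no rank
    return gf2_rank(live) == gf2_rank(live + [target])

# A monotone span program for AND(x1, x2): the all-1 vector (1, 1)
# is obtained only when both rows are live.
rows, labels = [0b01, 0b10], [1, 2]
print(accepts(rows, labels, [True, True], 2))   # True
print(accepts(rows, labels, [True, False], 2))  # False
```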

Stasys Jukna

The Probabilistic Method

Frontmatter
17. Basic Tools

The probabilistic method is a powerful tool for tackling many problems in discrete mathematics. This is one of the strongest tools in combinatorics and in graph theory. It is also useful in number theory and in combinatorial geometry. More recently, it has been applied in the development of efficient algorithms and in the study of various computational problems.

Stasys Jukna
18. Counting Sieve
Stasys Jukna
19. The Lovász Sieve

Let A1, ..., An be events in a probability space. In combinatorial applications the Ai are usually “bad” events. We wish to show Prob(∩ Āi) > 0, so that there is a point (coloring, tournament, etc.) which is “good.” By the counting sieve, this holds if Σ Prob(Ai) < 1. In one sense this is best possible: when the events Ai are pairwise disjoint, the condition cannot be weakened. At the opposite extreme, if the events Ai are mutually independent, then the only thing we need to ensure ∩ Āi ≠ ∅ is that all Prob(Ai) < 1. The Lovász sieve is employed when there is “much independence” among the Ai.

Stasys Jukna
20. Linearity of Expectation

Let X1, ..., Xn be random variables, and X = c1X1 + ... + cnXn.
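Linearity of expectation holds with no independence assumption, which a tiny exact computation can illustrate (our own example, not the book's: two coin flips where X2 depends on X1):

```python
from itertools import product
from fractions import Fraction

# Two fair coin flips; X1 is the first flip and X2 is the product of
# the two flips (so X2 depends on X1). Linearity still gives
# E[2 X1 + 3 X2] = 2 E[X1] + 3 E[X2].
space = list(product([0, 1], repeat=2))  # uniform over 4 outcomes
E = lambda f: sum(Fraction(f(w), len(space)) for w in space)

x1 = lambda w: w[0]
x2 = lambda w: w[0] * w[1]  # dependent on x1
lhs = E(lambda w: 2 * x1(w) + 3 * x2(w))
rhs = 2 * E(x1) + 3 * E(x2)
print(lhs, rhs)  # 7/4 7/4
```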

Stasys Jukna
21. The Deletion Method

As described in previous sections, the basic probabilistic method works as follows: trying to prove that an object with certain properties exists, one defines an appropriate probability space of objects and then shows that the desired properties hold in this space with positive probability. In this section, we consider situations where the “random” object does not have all the desired properties but may have a few “blemishes.” With a small alteration, we remove the blemishes, giving the desired structure.

Stasys Jukna
22. The Second Moment Method

The pigeonhole property of expectation says that a random variable X cannot always be smaller (or always greater) than its expectation E[X]. The second moment property tells us more: if the variance of X is much smaller than E[X]², then X “almost always” equals E[X], i.e., the values of X are concentrated around its expectation.

Stasys Jukna
23. The Entropy Function

Entropy is a basic concept of information theory. In this chapter we will consider some applications of this concept in combinatorics. Most of these applications were collected in the survey by Alon (1994a).

Stasys Jukna
24. Random Walks

There are n rocks forming a circle on a river, and there is a frog on one of those rocks. At each step the frog stays on its rock with probability 1/3, jumps to the rock on its right with probability 1/3, and jumps to the rock on its left with probability 1/3. The frog problem asks: where will the frog be after t steps?
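The distribution of the frog's position after t steps can be computed exactly by dynamic programming on the cycle; the sketch below is our own illustration, not the book's analysis:

```python
from fractions import Fraction

def frog_distribution(n, t):
    # Exact position distribution after t steps on a cycle of n rocks:
    # stay, step right, or step left, each with probability 1/3.
    dist = [Fraction(0)] * n
    dist[0] = Fraction(1)  # the frog starts on rock 0
    for _ in range(t):
        nxt = [Fraction(0)] * n
        for rock, p in enumerate(dist):
            for move in (-1, 0, 1):
                nxt[(rock + move) % n] += p / 3
        dist = nxt
    return dist

d = frog_distribution(5, 10)
assert sum(d) == 1
print([float(p) for p in d])  # flattens toward the uniform 1/5 as t grows
```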

Stasys Jukna
25. Randomized Algorithms

We have seen how coin flipping can help to design proofs. Randomness can also help to design quite efficient algorithms. In this chapter we will demonstrate the idea with several examples. Comprehensive guides to the state of the art of randomized algorithms are the books by Alon and Spencer (1992), and by Motwani and Raghavan (1995).

Stasys Jukna
26. Derandomization

Given a randomized algorithm A, a natural approach towards derandomizing it is to find a method for searching the associated sample space Ω for a good point ω with respect to a given input instance x; a point is good for x if A(x, ω) = f(x). Given such a point ω, the algorithm A(x,ω) becomes a deterministic algorithm and it is guaranteed to find the correct solution. The problem faced in searching the sample space is that it is usually exponential in size.
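A classic instance of such a search is the method of conditional expectations; the sketch below (our own example on MAX-CUT, not the book's) derandomizes the random-assignment bound E[cut] = |E|/2 by deciding vertices one at a time, never letting the conditional expectation drop.

```python
def derandomized_maxcut(n, edges):
    # Method of conditional expectations: place vertices one by one on
    # the side that cuts at least half of the already-decided incident
    # edges, preserving the random-assignment guarantee E[cut] = |E|/2.
    side = {}
    for v in range(n):
        gain = {0: 0, 1: 0}  # gain[s] = decided edges cut if v gets side s
        for a, b in edges:
            other = b if a == v else a if b == v else None
            if other is not None and other in side:
                gain[1 - side[other]] += 1
        side[v] = 0 if gain[0] >= gain[1] else 1
    cut = sum(side[a] != side[b] for a, b in edges)
    return side, cut

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
side, cut = derandomized_maxcut(4, edges)
print(cut, len(edges) / 2)  # the cut meets the |E|/2 bound
```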

Stasys Jukna

Fragments of Ramsey Theory

Frontmatter
27. Ramsey’s Theorem

In 1930 Frank Plumpton Ramsey wrote a paper, On a problem in formal logic, which initiated a part of discrete mathematics nowadays known as Ramsey Theory. At about the same time B.L. van der Waerden (1927) proved his famous Ramsey-type result on arithmetical progressions. A few years later Ramsey’s theorem was rediscovered by P. Erdős and G. Szekeres (1935) while working on a problem in geometry. In 1963 A.W. Hales and R.I. Jewett revealed the combinatorial core of van der Waerden’s theorem and proved a general result which turned this collection of separate ingenious results into Ramsey Theory. In the next three chapters we present some of the most basic facts of this theory.

Stasys Jukna
28. Ramseyan Theorems for Numbers

In this chapter we will discuss several Ramsey-type problems in additive number theory.

Stasys Jukna
29. The Hales-Jewett Theorem

In 1963 A. W. Hales and R. I. Jewett proved a very general result from which most Ramsey-like theorems may be easily derived. The Hales–Jewett theorem is presently one of the most useful techniques in Ramsey theory. Without this result, Ramsey theory would more properly be called Ramseyan theorems.

Stasys Jukna
Epilog: What Next?

In this book we have touched on some of the basic ideas and methods of extremal combinatorics. For further in-depth study of the subject I would recommend the following texts.

Stasys Jukna
Backmatter
Metadata
Title
Extremal Combinatorics
Written by
Dr. Stasys Jukna
Copyright year
2001
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-662-04650-0
Print ISBN
978-3-642-08559-8
DOI
https://doi.org/10.1007/978-3-662-04650-0