2022 | Book

Markov Chains on Metric Spaces

A Short Course

About this book

This book gives an introduction to discrete-time Markov chains which evolve on a separable metric space.

The focus is on the ergodic properties of such chains, i.e., on their long-term statistical behaviour. Among the main topics are existence and uniqueness of invariant probability measures, irreducibility, recurrence, regularizing properties for Markov kernels, and convergence to equilibrium. These concepts are investigated with tools such as Lyapunov functions, petite and small sets, Doeblin points and accessible points, coupling, as well as key notions from classical ergodic theory. The theory is illustrated through several recurring classes of examples, e.g., random contractions, randomly switched vector fields, and stochastic differential equations, the latter providing a bridge to continuous-time Markov processes.
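As a hedged illustration of the first of these recurring example classes (an example of ours, not one taken from the book; the contraction factor and noise law are illustrative choices), the following sketch simulates a random contraction on the real line and compares its empirical moments with those of the unique invariant probability measure.

```python
# Illustrative sketch (not from the book): a random contraction on R,
#   X_{n+1} = a * X_n + xi_{n+1},  |a| < 1,  xi_n i.i.d. N(0, 1).
# Its unique invariant probability measure is N(0, 1 / (1 - a**2)).
import numpy as np

rng = np.random.default_rng(0)
a = 0.5                      # contraction factor, assumed |a| < 1
n_steps = 100_000
x = 0.0
trajectory = np.empty(n_steps)
for n in range(n_steps):
    x = a * x + rng.standard_normal()   # one step of the chain
    trajectory[n] = x

# The empirical mean and variance should be close to the stationary values
# 0 and 1 / (1 - a**2).
print(trajectory.mean(), trajectory.var(), 1.0 / (1.0 - a**2))
```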
The book can serve as the core of a semester- or year-long graduate course in probability theory with an emphasis on Markov chains or random dynamics. Some of the material is also well suited for an ergodic theory course. Readers should have taken an introductory, measure-theoretic course on probability theory. While there is a chapter devoted to chains on a countable state space, some familiarity with Markov chains on a finite state space is also recommended.

Table of Contents

Frontmatter
Chapter 1. Markov Chains
Abstract
We define Markov chains and Markov kernels and establish their first fundamental properties, the Markov and strong Markov properties. At the end of the chapter, an introduction to Markov chains in continuous time, usually called Markov processes, is given.
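For orientation (standard definitions, not quoted from the chapter): a Markov kernel P on a metric space E assigns to every point x a probability measure P(x, ·) on E, measurably in x, and the Markov property states that the conditional law of the next state depends on the past only through the present,

\[
  \mathbb{P}\bigl(X_{n+1} \in A \mid X_0, \dots, X_n\bigr) \;=\; P(X_n, A)
  \qquad \text{almost surely, for all Borel sets } A \subseteq E .
\]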
Michel Benaïm, Tobias Hurth
Chapter 2. Countable Markov Chains
Abstract
This chapter is a self-contained mini course on Markov chains with a countable state space. The notions of recurrence and transience are introduced and it is shown how these properties can be verified with the help of Lyapunov functions. We then explain how Lyapunov functions can be used to obtain estimates on the moments of hitting times for a point or a finite set. In addition, we state and prove the ergodic theorem for countable chains, which asserts that the law of the chain converges to a unique stationary distribution provided that the chain is positive recurrent and aperiodic. Towards the end of the chapter, renewal processes are briefly investigated.
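As a hedged illustration of positive recurrence and the ergodic theorem on a countable state space (an example of ours, not one from the chapter; the transition probabilities are illustrative), the sketch below simulates a reflected birth-death chain on the non-negative integers and compares its empirical frequencies with the stationary distribution.

```python
# Illustrative sketch (not from the book): a positive recurrent, aperiodic
# birth-death chain on {0, 1, 2, ...}. From k >= 1 it moves to k+1 with
# probability p and to k-1 otherwise; from 0 it moves to 1 with probability p
# and stays put otherwise. For p < 1/2 the stationary distribution is
# pi(k) = (1 - r) * r**k with r = p / (1 - p), and the empirical frequencies
# of the chain converge to pi.
import numpy as np

rng = np.random.default_rng(1)
p = 0.3
r = p / (1 - p)
n_steps = 200_000
x = 0
counts = {}
for _ in range(n_steps):
    if rng.random() < p:
        x += 1                      # birth
    elif x > 0:
        x -= 1                      # death (reflected at 0)
    counts[x] = counts.get(x, 0) + 1

for k in range(5):
    empirical = counts.get(k, 0) / n_steps
    print(k, round(empirical, 4), round((1 - r) * r**k, 4))
```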
Michel Benaïm, Tobias Hurth
Chapter 3. Random Dynamical Systems
Abstract
The chapter is devoted to random dynamical systems. These are Markov chains for which the state variable at time n+1 is a deterministic function of the state variable at time n and a random input sampled from an i.i.d. sequence of random variables. The main result is that any given Markov chain can be represented by such a random dynamical system. In the exercises, Bernoulli convolutions and the Propp–Wilson algorithm are explored.
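In symbols (standard notation, not quoted from the chapter), such a system takes the form

\[
  X_{n+1} \;=\; F\bigl(X_n, \theta_{n+1}\bigr),
  \qquad (\theta_n)_{n \ge 1} \ \text{i.i.d.,}
\]

where F is a measurable map from the product of the state space and the noise space to the state space.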
Michel Benaïm, Tobias Hurth
Chapter 4. Invariant and Ergodic Probability Measures
Abstract
The chapter starts with a detailed exposition on weak convergence and tightness. Then the class of invariant probability measures is defined and it is shown that the weak limit points for the empirical occupation measures of a Feller chain almost surely belong to this class. For the empirical occupation measures, we discuss tightness criteria based on Lyapunov functions. Next, random contractions are presented as a class of random dynamical systems for which one has uniqueness of the invariant probability measure. Then several classical results from ergodic theory are stated and proved, most importantly the Poincaré recurrence theorem, the Birkhoff ergodic theorem, and the ergodic decomposition theorem, the latter both for deterministic dynamical systems and for Markov chains. Finally, invariant probability measures for Markov processes in continuous time are discussed and it is explained how their properties can be deduced from the discrete-time theory.
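For orientation (standard definitions, not quoted from the chapter): a probability measure \mu is invariant for the kernel P when \mu P = \mu, and the empirical occupation measures of the chain are the random probability measures \nu_n,

\[
  (\mu P)(A) \;=\; \int_E P(x, A)\,\mu(dx) \;=\; \mu(A)
  \quad \text{for all Borel sets } A,
  \qquad
  \nu_n \;=\; \frac{1}{n} \sum_{k=0}^{n-1} \delta_{X_k} .
\]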
Michel Benaïm, Tobias Hurth
Chapter 5. Irreducibility
Abstract
The chapter is devoted to various notions of irreducibility that ensure the uniqueness of the invariant probability measure. We start with a measure-theoretic notion and then move on to more topological ones. For a Feller chain, the set of accessible points is introduced and its relation with the supports of invariant probability measures is investigated. For strong Feller and asymptotic strong Feller chains, we prove that ergodic measures have disjoint supports. These results have the useful consequence that, on a connected set, the existence of an invariant probability measure with full support implies unique ergodicity.
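In the usual terminology (paraphrased, not quoted from the chapter), a point x is accessible if every neighbourhood of x can be reached with positive probability from every starting point:

\[
  \sum_{n \ge 1} P^n(y, U) \;>\; 0
  \qquad \text{for every } y \in E \text{ and every open neighbourhood } U \ni x .
\]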
Michel Benaïm, Tobias Hurth
Chapter 6. Petite Sets and Doeblin Points
Abstract
The notions of petite set, small set, and (weak) Doeblin point are discussed. For Feller chains, it is shown that the existence of an accessible weak Doeblin point implies irreducibility and thus unique ergodicity. This observation is applied to several examples in discrete and continuous time. In particular, for piecewise deterministic Markov processes and stochastic differential equations, it is shown how the accessibility condition can be phrased as a control problem and how Doeblin points are closely related to the classical Hörmander conditions for hypoelliptic diffusions.
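For orientation (a standard formulation, not quoted from the chapter): a set C is small if some n-step transition probability admits a uniform minorization over C,

\[
  P^{n}(x, \cdot) \;\ge\; c\,\nu(\cdot)
  \qquad \text{for all } x \in C,
\]

for some integer n \ge 1, constant c > 0, and probability measure \nu; a Doeblin point is, roughly speaking, a point that admits such a small neighbourhood.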
Michel Benaïm, Tobias Hurth
Chapter 7. Harris and Positive Recurrence
Abstract
Harris recurrence is introduced and it is proved that the existence of a recurrent petite set is a sufficient condition for this type of recurrence. We also discuss criteria for the existence of a recurrent set, based on Lyapunov functions. Finally, moment estimates for the return times to such a set are derived.
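A typical Lyapunov (drift) criterion of this kind, in standard notation (paraphrased, not quoted from the chapter), asks for a measurable function V \ge 0, a constant b < \infty, and a petite set C such that

\[
  PV(x) \;\le\; V(x) - 1 + b\,\mathbf{1}_C(x)
  \qquad \text{for all } x \in E ,
\]

which forces the chain to return to C and yields finite expected return times.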
Michel Benaïm, Tobias Hurth
Chapter 8. Harris Ergodic Theorem
Abstract
The chapter starts with a simple version of the Harris ergodic theorem where the entire state space is a petite set. Under this hypothesis, the law of the chain converges at an exponential rate in total variation to the unique invariant probability measure. The same conclusion holds under the weaker assumption that the chain is forced to enter a certain small set by a suitable Lyapunov function. We give two proofs for this result: the recent one by Hairer and Mattingly where a semi-norm is constructed for which the Markov operator is a contraction; and the more classical argument based on a coupling and ideas from renewal theory. Finally, we present a condition, also due to Hairer and Mattingly, that yields exponential convergence to the unique invariant probability measure in a certain Wasserstein distance.
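In one common formulation (paraphrased from the standard literature, not quoted from the chapter), the hypotheses combine a geometric drift condition with a minorization on a sublevel set of the Lyapunov function, and the conclusion is exponential convergence in total variation:

\[
  PV \;\le\; \gamma V + K, \qquad
  \inf_{x \,:\, V(x) \le R} P(x, \cdot) \;\ge\; c\,\nu(\cdot),
\]
\[
  \bigl\| P^n(x, \cdot) - \pi \bigr\|_{\mathrm{TV}}
  \;\le\; C\,\bigl(1 + V(x)\bigr)\,\rho^n ,
\]

for some \gamma \in (0, 1), K < \infty, sufficiently large R, constants C < \infty and \rho \in (0, 1), and the unique invariant probability measure \pi.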
Michel Benaïm, Tobias Hurth
Backmatter
Metadata
Title
Markov Chains on Metric Spaces
Authors
Michel Benaïm
Tobias Hurth
Copyright Year
2022
Electronic ISBN
978-3-031-11822-7
Print ISBN
978-3-031-11821-0
DOI
https://doi.org/10.1007/978-3-031-11822-7