
About this book

This book presents a selection of topics from probability theory. Essentially, the topics chosen are those that are likely to be the most useful to someone planning to pursue research in the modern theory of stochastic processes. The prospective reader is assumed to have good mathematical maturity. In particular, he should have prior exposure to basic probability theory at the level of, say, K.L. Chung's 'Elementary probability theory with stochastic processes' (Springer-Verlag, 1974) and real and functional analysis at the level of Royden's 'Real analysis' (Macmillan, 1968). The first chapter is a rapid overview of the basics. Each subsequent chapter deals with a separate topic in detail. There is clearly some selection involved and therefore many omissions, but that cannot be helped in a book of this size. The style is deliberately terse to enforce active learning. Thus several tidbits of deduction are left to the reader as labelled exercises in the main text of each chapter. In addition, there are supplementary exercises at the end. In the preface to his classic text on probability ('Probability', Addison-Wesley, 1968), Leo Breiman speaks of the right and left hands of probability.

Table of Contents

Frontmatter

1. Introduction

Abstract
Let (Ω, F, P) be a probability space. To recapitulate: Ω is a set called the “sample space”. Its elements are called sample points. F is a σ-field of subsets of Ω containing Ω itself. Elements of F are called events. P is a probability measure (i.e., a countably additive nonnegative measure with total mass 1) on the measurable space (Ω, F). If an event A is of the type A = {ω ∈ Ω : R(ω) holds} for some property R(.), we may write P(R) for P(A). An event A is called a sure event if P(A) = 1 and a null event if P(A) = 0. Alternatively, R(.) is said to hold almost surely (a.s. for short) if P(R) = 1. Many statements in probability theory are made with the qualification “almost surely”, though this may not always be stated explicitly.
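As a concrete illustration of these definitions (my sketch, not from the book), the following builds a finite probability space for four tosses of a fair coin, with F the full power set, and checks additivity and the P(R) shorthand:

```python
from fractions import Fraction
from itertools import product

# A minimal finite probability space (Omega, F, P): four fair coin
# tosses, F = power set of Omega, P uniform.  Every subset is an event.
Omega = ["".join(w) for w in product("HT", repeat=4)]   # sample points
P = {omega: Fraction(1, len(Omega)) for omega in Omega}

def prob(R):
    """P(R): probability of the event {omega in Omega : R(omega) holds}."""
    return sum(P[omega] for omega in Omega if R(omega))

A = lambda w: w[0] == "H"                     # first toss is heads
B = lambda w: w[0] == "T" and w[1] == "T"     # first two tosses are tails

# Additivity on disjoint events, and the sure event has probability 1.
assert prob(lambda w: A(w) or B(w)) == prob(A) + prob(B)
print(prob(A))                # 1/2
print(prob(lambda w: True))   # 1
```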
Vivek S. Borkar

2. Spaces of Probability Measures

Abstract
Let S be a Polish space with a complete metric d taking values in [0,1] and P(S) the space of probability measures on S. Recall the map h : S → [0,1]<sup>∞</sup> of Theorem 1.1.1. Since \( \overline {h(S)} \) is compact, \( C(\overline {h(S)}) \) is separable. Let {f′<sub>i</sub>} be countable dense in the unit ball of \( C(\overline {h(S)}) \) and {f″<sub>i</sub>} their restrictions to h(S). Define {f<sub>i</sub>} ⊂ C<sub>b</sub>(S) (= the space of bounded continuous functions S → R) by f<sub>i</sub> = f″<sub>i</sub> ∘ h, i ≥ 1.
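The role of such a countable family {f<sub>i</sub>} is that weak convergence μ<sub>n</sub> → μ in P(S) can be tested through countably many integrals ∫ f<sub>i</sub> dμ<sub>n</sub> → ∫ f<sub>i</sub> dμ. A numerical sketch of this idea (mine, not the book's; S = [0,1], μ<sub>n</sub> the empirical measure of n uniform samples):

```python
import math
import random

# Empirical measures mu_n of i.i.d. Uniform[0,1] samples converge weakly
# to mu = Uniform[0,1]; we check ∫ f d mu_n ≈ ∫ f d mu for a few bounded
# continuous test functions f with known integrals against mu.
random.seed(0)
n = 200_000
samples = [random.random() for _ in range(n)]

tests = [
    (lambda x: x * x,                 1.0 / 3.0),          # ∫ x^2 dx
    (lambda x: math.sin(x),           1.0 - math.cos(1)),  # ∫ sin x dx
    (lambda x: math.cos(math.pi * x), 0.0),                # ∫ cos(pi x) dx
]
for f, exact in tests:
    empirical = sum(f(x) for x in samples) / n   # ∫ f d mu_n
    assert abs(empirical - exact) < 0.01
    print(round(empirical, 3), "≈", round(exact, 3))
```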
Vivek S. Borkar

3. Conditioning and Martingales

Abstract
Let (Ω, F, P) be a probability space and A, B ∈ F events such that P(B) > 0. In this section we seek to formalize the intuitive content of the phrase “probability of A given (or, equivalently, “conditioned on”) B”. The words “given B” imply prior knowledge of the fact that the sample point ω lies in B. It is then natural to consider the reduced probability space (Ω<sub>B</sub>, F<sub>B</sub>, P<sub>B</sub>) with Ω<sub>B</sub> = B, F<sub>B</sub> = {A ∩ B : A ∈ F} (the “trace σ-field”), P<sub>B</sub>(C) = P(C)/P(B) for C ∈ F<sub>B</sub>, and define the probability of A given B, denoted by P(A/B), as P<sub>B</sub>(A ∩ B) = P(A ∩ B)/P(B). The reader may convince himself of the reasonableness of this procedure by evaluating, say, the probability of two consecutive heads in four tosses of a fair coin given the knowledge that the number of heads was even, using the principle of insufficient reason.
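The exercise just mentioned can be checked by brute-force enumeration (my elaboration, not the book's solution; I take “two consecutive heads” to mean the pattern HH appears somewhere in the four tosses):

```python
from fractions import Fraction
from itertools import product

# Enumerate the 16 equally likely outcomes of four fair coin tosses and
# compute P(A | B) = P(A ∩ B) / P(B) directly from the definition.
Omega = ["".join(w) for w in product("HT", repeat=4)]

B = [w for w in Omega if w.count("H") % 2 == 0]   # even number of heads
A_and_B = [w for w in B if "HH" in w]             # ...with two consecutive heads

p = Fraction(len(A_and_B), len(B))
print(p)   # 1/2
```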
Vivek S. Borkar

4. Basic Limit Theorems

Abstract
Limit theorems form a cornerstone of probability theory. These are results that describe the asymptotic behaviour of sequences of random variables, usually suitably normalized partial sums of another sequence of random variables. Even today a lot of research activity in the field is directed towards refining and extending them.
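The two classical normalizations of partial sums can be seen in a quick simulation (my illustration, not from the book): S<sub>n</sub>/n concentrates at the mean (law of large numbers), while S<sub>n</sub>/√n remains spread out on the scale described by the central limit theorem.

```python
import math
import random

# Partial sums S_n of i.i.d. fair ±1 coin flips under two normalizations.
random.seed(1)
n = 100_000
flips = [random.choice((-1, 1)) for _ in range(n)]
S_n = sum(flips)

print(S_n / n)              # LLN scale: close to the mean 0
print(S_n / math.sqrt(n))   # CLT scale: an order-one fluctuation
```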
Vivek S. Borkar

5. Markov Chains

Abstract
A stochastic process or a random process is a family of random variables X<sub>t</sub>, t ∈ I, taking values in some measurable space (E, ξ), with I = an interval of integers or reals. I has the interpretation of a (discrete or continuous) time interval, t thereby being a time index. The simplest random process in discrete time is a sequence of independent random variables. This can be considered as a process without memory: the value of the process at a given time instant is independent of the values it took in the past. The next level of complication one can conceive of is that of a one-step memory: the value taken by the process at time n + 1 depends on its past values only through its dependence, if any, on the value at time n, not otherwise. To be precise, it is conditionally independent of its values up to time n conditioned on its value at time n. This is called the Markov property and the process a Markov chain.
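A minimal sketch of this one-step memory (my example, not the book's): a chain on the finite state space E = {0, 1, 2}, where the next state is drawn from the row of a transition matrix indexed by the current state alone.

```python
import random

# Transition matrix: row i gives the law of X_{n+1} given X_n = i.
# The sampling step below consults only the current state, never the
# earlier history -- this is the Markov property in action.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(state, rng):
    """Sample X_{n+1} given X_n = state, using row `state` of P."""
    return rng.choices(range(3), weights=P[state])[0]

rng = random.Random(0)
X = [0]                        # X_0 = 0
for _ in range(10):
    X.append(step(X[-1], rng))
print(X)                       # one sample path of the chain
```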
Vivek S. Borkar

6. Foundations of Continuous-Time Processes

Abstract
This chapter introduces some basics of continuous-time stochastic processes, mainly those concerning their construction and sample path properties. A detailed study of these processes is a vast enterprise and is well beyond the scope of this book. For simplicity, we shall restrict attention to real-valued processes on the time interval [0,1], that is, a family X<sub>t</sub> of real random variables on some probability space, indexed by the time variable t ∈ [0,1]. Much of what we do below extends to arbitrary time intervals and more general (e.g. Polish space-valued) processes with minor modifications.
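One standard construction behind such processes can be sketched numerically (my illustration, assuming the random-walk scaling of Donsker's invariance principle, which the abstract itself does not name): a ±1 random walk with steps of size 1/√n, read at the grid points t = k/n, approximates a Brownian sample path on [0,1].

```python
import math
import random

# Scaled random walk on the grid t = k/n, k = 0..n: each increment is
# ±1/sqrt(n), so X_1 = path[-1] is approximately N(0, 1) in distribution
# and increments over disjoint intervals are independent.
random.seed(2)
n = 1000
increments = [random.choice((-1.0, 1.0)) / math.sqrt(n) for _ in range(n)]

path = [0.0]                      # X_0 = 0
for dx in increments:
    path.append(path[-1] + dx)    # X_{k/n} = X_{(k-1)/n} + dx

print(path[-1])                   # terminal value X_1
```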
Vivek S. Borkar

Backmatter
