
1996 | Book

Monte Carlo

Concepts, Algorithms, and Applications

Author: George S. Fishman

Publisher: Springer New York

Book series: Springer Series in Operations Research and Financial Engineering


About this book

This book provides an introduction to the Monte Carlo method suitable for a one- or two-semester course for graduate and advanced undergraduate students in the mathematical and engineering sciences. It also can serve as a reference for the professional analyst. In the past, my inability to provide students with a single-source book on this topic for class and for later professional reference had left me repeatedly frustrated, and eventually motivated me to write this book. In addition to focused accounts of major topics, the book has two unifying themes: one concerns the effective use of information and the other concerns error control and reduction. The book describes how to incorporate information about a problem into a sampling plan in a way that reduces the cost of estimating its solution to within a specified error bound. Although exploiting special structures to reduce cost has long been a hallmark of the Monte Carlo method, the propensity of users of the method to discard useful information because it does not fit traditional textbook models has repeatedly impressed me. The present account aims at reducing the impediments to integrating this information. Errors, both statistical and computational, abound in every Monte Carlo sampling experiment, and a considerable methodology exists for controlling them.

Table of contents

Frontmatter
1. Introduction
Abstract
The Monte Carlo method provides approximate solutions to a variety of mathematical problems by performing statistical sampling experiments on a computer. Remarkably, the method applies to problems with absolutely no probabilistic content as well as to those with inherent probabilistic structure. This alone does not give the Monte Carlo method an advantage over other methods of approximation. However, among all numerical methods that rely on n-point evaluations in m-dimensional space to produce an approximate solution, the Monte Carlo method has absolute error of estimate that decreases as \(n^{-1/2}\), whereas, in the absence of exploitable special structure, all others have errors that decrease as \(n^{-1/m}\) at best. This property gives the Monte Carlo method a considerable edge in computational efficiency as m, the size of the problem, increases. Combinatorial settings illustrate this property especially well. Whereas the exact solution to a combinatorial problem with m elements often has computational cost that increases exponentially or superexponentially with m, the Monte Carlo method frequently provides an estimated solution with tolerable error at a cost that increases no faster than as a polynomial in m.
George S. Fishman
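To make the rate concrete, here is a minimal Python sketch (my illustration, not an example from the book) that estimates a one-dimensional integral with a known value and prints the absolute error as n grows; the error falls roughly as \(n^{-1/2}\).

```python
# A minimal sketch (not from the book) illustrating the n^{-1/2} error decay:
# estimate zeta = integral of exp(-x) over [0,1] (exact value 1 - 1/e) by
# averaging the integrand at n uniform random points, for growing n.
import math
import random

random.seed(1)
exact = 1.0 - math.exp(-1.0)

for n in (10**2, 10**4, 10**6):
    estimate = sum(math.exp(-random.random()) for _ in range(n)) / n
    # The absolute error shrinks roughly by a factor of 10 for each
    # 100-fold increase in n, consistent with O(n^{-1/2}) convergence.
    print(n, estimate, abs(estimate - exact))
```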
2. Estimating Volume and Count
Abstract
This chapter introduces the reader to fundamental issues that arise when applying the Monte Carlo method to a commonly encountered problem in numerical computation. In its most basic form, the problem is to evaluate the volume of a bounded region in multidimensional Euclidean space. The more general problem is to evaluate the integral of a function over such a region. The Monte Carlo method often offers a competitive, and sometimes the only useful, solution to the problem. The appeal of the Monte Carlo method arises when the shape of the region of interest makes solution by analytical methods impossible and, in the case of the more general function integration, when little is known about the smoothness and variational properties of the integrand, or what is known precludes the application of alternative numerical evaluation techniques.
George S. Fishman
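As a concrete instance of the basic volume problem, the following sketch (my own, with the unit ball chosen purely because its volume has a known closed form) applies hit-or-miss sampling: draw uniform points in an enclosing cube and count the fraction that lands inside the region.

```python
# A minimal sketch (not from the book) of hit-or-miss volume estimation:
# the volume of the m-dimensional unit ball equals the volume of the
# enclosing cube [-1, 1]^m times the fraction of uniform points inside.
import math
import random

random.seed(1)
m, n = 5, 10**6

hits = 0
for _ in range(n):
    x = [random.uniform(-1.0, 1.0) for _ in range(m)]
    if sum(xi * xi for xi in x) <= 1.0:
        hits += 1

estimate = (2.0 ** m) * hits / n
exact = math.pi ** (m / 2) / math.gamma(m / 2 + 1)  # closed form for the ball
print(estimate, exact)
```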
3. Generating Samples
Abstract
Generating a random sample constitutes an essential feature of every Monte Carlo experiment. If
$$\zeta = \int_R {\varphi (x)} \, dx,$$
(1)
as in expression (2.67), is to be estimated, then a procedure that randomly generates points from the uniform distribution on \({\mathcal{I}^m}\) would suffice. When the cost of evaluating φ(x) in Algorithm \({\overline \zeta _n}\) in Sec. 2.7 is prohibitively high, the equivalence of expression (1) and
$$\zeta = \int_{{\mathbb{R}^m}} {\kappa (z)\,dF(z)}$$
(2)
in expression (2.72c) becomes of interest. Recall that, for \(\mathcal{A} \subseteq \mathcal{B} \subseteq \mathbb{R}^m\), \(0 \le F(\mathcal{A}) \le F(\mathcal{B}) \le F(\mathbb{R}^m) = 1\). Since F is a distribution function, an alternative procedure randomly samples \(Z^{(1)}, \ldots, Z^{(n)}\) independently from F and computes
$${\ddot \zeta _n} = {n^{ - 1}}\sum\limits_{i = 1}^n {\kappa (Z^{(i)})}$$
as an unbiased estimate of ζ. Although generating \(Z^{(i)}\) from F always costs more than generating \(X^{(i)}\) from the uniform distribution on \(\mathcal{I}^m\), evaluating \(\kappa(Z^{(i)})\) may be considerably less costly than evaluating \(\varphi(X^{(i)})\).
George S. Fishman
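The estimator \({\ddot \zeta _n}\) translates directly into code. The sketch below is a hypothetical instance (the target integral and the choice of F are mine, not the book's): with F the Exp(1) distribution, the integrand \(z e^{-z}\,dz\) factors as \(\kappa(z)\,dF(z)\) with \(\kappa(z) = z\).

```python
# A minimal sketch (assumptions mine, not the book's example) of the
# estimator zeta_n = n^{-1} * sum kappa(Z^(i)) with Z^(i) drawn from F.
# Target: zeta = integral over [0, inf) of z * exp(-z) dz = 1.  Writing the
# integrand as kappa(z) dF(z) with F = Exp(1) gives kappa(z) = z.
import random

random.seed(1)
n = 10**6

# random.expovariate(1.0) samples Z from F; kappa is the identity here.
estimate = sum(random.expovariate(1.0) for _ in range(n)) / n
print(estimate)  # close to the exact value zeta = 1
```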
4. Increasing Efficiency
Abstract
Section 2.7 broadened the concept of a Monte Carlo experiment to the general problem of evaluating the Lebesgue-Stieltjes integral
$$\zeta = \int_{\mathcal{Z}} {k\left( z \right)} \, dF\left( z \right),$$
(1)
where \(z = (z_1, \ldots, z_m)\), {F(z)} is a joint d.f. on the m-dimensional region \(\mathcal{Z} \subseteq \mathbb{R}^m\), and {k(z)} denotes a weighting kernel defined on \(\mathcal{Z}\). Also, the introduction to Ch. 3 indicates that alternative d.f.s and kernels may exist which satisfy expression (1). We call each \(\{F(z), k(z); z \in \mathcal{Z}\}\) satisfying expression (1) a sampling plan, since each provides a basis for generating data which can be used to estimate ζ unbiasedly. Whereas Ch. 3 describes how to generate data efficiently from a particular \(\{F(z), z \in \mathcal{Z}\}\) once a sampling plan is chosen, the present chapter studies the relative desirability of alternative sampling plans from the viewpoint of computational efficiency.
George S. Fishman
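To illustrate why one sampling plan can dominate another, the following sketch (an illustration of mine; the densities are not taken from the book) estimates the same ζ under two plans and compares their per-sample variances.

```python
# A minimal sketch (my own illustration, not the book's) comparing two
# sampling plans {F, k} for the same zeta = integral_0^1 exp(-z) dz.
# Plan A: Z ~ Uniform(0,1), k(z) = exp(-z).
# Plan B: Z has density f(z) = (4/3)(1 - z/2), which roughly tracks exp(-z),
#         sampled by inversion; k(z) = exp(-z) / f(z).
import math
import random
import statistics

random.seed(1)
n = 10**5

plan_a = [math.exp(-random.random()) for _ in range(n)]

def sample_b():
    z = 2.0 - math.sqrt(4.0 - 3.0 * random.random())   # inverse CDF of f
    return math.exp(-z) / ((4.0 / 3.0) * (1.0 - z / 2.0))

plan_b = [sample_b() for _ in range(n)]

# Both means estimate zeta = 1 - 1/e; plan B's per-sample variance is
# smaller, so it needs fewer samples for the same error bound.
for name, data in (("A", plan_a), ("B", plan_b)):
    print(name, statistics.fmean(data), statistics.variance(data))
```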
5. Random Tours
Abstract
This chapter considerably broadens the range of application of the Monte Carlo method by introducing the concept of a random tour on discrete, continuous, and general state spaces. This development serves several purposes, of which the ability to sample from a multivariable distribution remains preeminent. Let {F(x), x ∈ ℋ} denote an m-dimensional d.f. defined on a region ℋ ⊆ ℝ^m, and let {g(x), x ∈ ℋ} denote a known function satisfying \(\int_{\mathcal{H}} g^2(x)\,dF(x) < \infty\). Suppose that the objective is to evaluate
$$\mu \left( g \right) = \int_{\mathcal{H}} g\left( x \right)\,dF\left( x \right).$$
(1)
George S. Fishman
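A random walk Metropolis chain is one familiar example of such a random tour. The sketch below (a generic illustration, not an algorithm quoted from the book) estimates μ(g) for a standard bivariate normal F and g(x) = x₁², whose exact value is 1.

```python
# A minimal sketch (assumptions mine) of a random tour used to estimate
# mu(g) = integral of g(x) dF(x) when direct sampling from F is impractical.
# F is the standard bivariate normal, density proportional to
# exp(-(x1^2 + x2^2)/2), and g(x) = x1^2, so mu = 1.
import math
import random

random.seed(1)

def log_density(x):
    return -0.5 * (x[0] ** 2 + x[1] ** 2)

x = [0.0, 0.0]
total, steps = 0.0, 10**5
for _ in range(steps):
    proposal = [xi + random.uniform(-1.0, 1.0) for xi in x]
    # Metropolis rule: accept with probability min(1, density ratio);
    # otherwise the tour stays put for this step.
    if math.log(random.random()) < log_density(proposal) - log_density(x):
        x = proposal
    total += x[0] ** 2

print(total / steps)  # close to mu = 1 after many steps of the tour
```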
6. Designing and Analyzing Sample Paths
Abstract
As in Ch. 5, we take as our objective the estimation of
$$\mu = \mu \left( g \right) = \int_{\mathcal{X}} {g\left( x \right)\,dF\left( x \right)},$$
(1)
where F denotes an m-dimensional d.f. on \(\mathcal{X} \subseteq {\mathbb{R}^m}\). Consider a Monte Carlo Markov sampling experiment composed of n independent replications, each of which begins in a state drawn from an initializing nonequilibrium distribution π0. After a warm-up interval of k − 1 steps on each replication, sampling continues for t additional steps, and one uses the n independent truncated sample paths, or realizations, each of length t, to estimate µ. Whereas Ch. 5 concentrates on sample path generating algorithms and a conceptual understanding of convergence to an equilibrium state, this chapter focuses on sampling plan design and statistical inference. With regard to design, the chapter shows how the choices of k, n, π0, and t affect computational and statistical efficiency. With regard to statistical inference, it describes methods for estimating the warm-up interval k that significantly mitigate the influence of the initial states drawn from the nonequilibrium distribution π0. Also, it describes methods for computing asymptotically valid confidence intervals for µ in expression (1) as n → ∞ for fixed t, as t → ∞ for fixed n, and as both n → ∞ and t → ∞. Because confidence intervals inevitably depend on variance estimates, we need to impose a moderately stronger restriction on g. Whereas the assumption \(\int_{\mathcal{X}} {{g^2}\left( x \right)\,dF\left( x \right)} < \infty \) in Ch. 5 guarantees a finite variance for a single observation on a sample path, the assumption \(\int_{\mathcal{X}} {{g^4}\left( x \right)\,dF\left( x \right)} < \infty \) is necessary for us to obtain consistent estimators of that variance and of other variances that play essential roles in the derivation of asymptotically valid confidence intervals for µ.
George S. Fishman
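The replication design reads almost directly as code. The sketch below (parameter values, target distribution, and starting state are my own choices, not the book's) runs n independent chains from a deliberately nonequilibrium start, discards the k − 1 warm-up steps, and forms a confidence interval for μ from the n replication means.

```python
# A minimal sketch (assumptions mine) of the replication design: n
# independent chains, a warm-up of k - 1 steps, then t retained steps per
# replication.  The chain is a Metropolis random walk targeting the standard
# normal, with g(x) = x^2, so mu = 1.
import math
import random
import statistics

random.seed(1)
n, k, t = 25, 200, 2000           # replications, warm-up, retained steps

def replication_mean():
    x = 5.0                       # deliberately nonequilibrium start (pi_0)
    total = 0.0
    for step in range(k - 1 + t):
        proposal = x + random.uniform(-1.0, 1.0)
        if math.log(random.random()) < 0.5 * (x * x - proposal * proposal):
            x = proposal
        if step >= k - 1:         # discard the warm-up interval
            total += x * x
    return total / t

means = [replication_mean() for _ in range(n)]
mu_hat = statistics.fmean(means)
half = 1.96 * statistics.stdev(means) / math.sqrt(n)   # ~95% normal CI
print(mu_hat, (mu_hat - half, mu_hat + half))
```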
7. Generating Pseudorandom Numbers
Abstract
Every Monte Carlo experiment relies on the availability of a procedure that supplies sequences of numbers from which arbitrarily selected nonoverlapping subsequences appear to behave like statistically independent sequences, and where the variation in an arbitrarily chosen subsequence of length k (≥ 1) resembles that of a sample drawn from the uniform distribution on the k-dimensional unit hypercube \({\mathcal{I}^k}\). The words “appear to behave” and “resemble” alert the reader to yet another potential source of error that arises in Monte Carlo sampling. In practice, many procedures exist for generating these sequences. In addition to this error of approximation, the relative desirability of each procedure depends on its computing time, on its ease of use, and on its portability. By portability, we mean the ease of implementing a procedure or algorithm on a variety of computers, each with its own hardware peculiarities.
George S. Fishman
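One widely studied family of such procedures is the multiplicative congruential generator. The sketch below implements the classic multiplier 16807 with prime modulus 2³¹ − 1, a generator analyzed extensively in the literature (including by Fishman); any serious use would call for the statistical scrutiny this chapter describes.

```python
# A minimal sketch of a multiplicative congruential generator:
# x_{i+1} = 16807 * x_i mod (2^31 - 1), with outputs scaled to (0, 1)
# to serve as approximate uniform deviates.
M = 2**31 - 1        # a Mersenne prime modulus
A = 16807            # a full-period multiplier for this modulus

def lcg(seed, count):
    x = seed
    for _ in range(count):
        x = (A * x) % M
        yield x / M  # pseudorandom number in (0, 1)

print(list(lcg(seed=1, count=5)))
```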
Backmatter
Metadata
Title
Monte Carlo
Author
George S. Fishman
Copyright year
1996
Publisher
Springer New York
Electronic ISBN
978-1-4757-2553-7
Print ISBN
978-1-4419-2847-4
DOI
https://doi.org/10.1007/978-1-4757-2553-7