
The role of Monte Carlo methods and simulation in all of the sciences has increased in importance during the past several years. These methods are at the heart of the rapidly developing subdisciplines of computational physics, computational chemistry, and the other computational sciences. The growing power of computers and the evolving simulation methodology have led to the recognition of computation as a third approach for advancing the natural sciences, together with theory and traditional experimentation. Monte Carlo is also a fundamental tool of computational statistics.

At the kernel of a Monte Carlo or simulation method is random number generation. Generation of random numbers is also at the heart of many standard statistical methods. The random sampling required in most analyses is usually done by the computer. The computations required in Bayesian analysis have become viable because of Monte Carlo methods. This has led to much wider applications of Bayesian statistics, which, in turn, has led to development of new Monte Carlo methods and to refinement of existing procedures for random number generation.

Chapter 1. Simulating Random Numbers from a Uniform Distribution

Abstract
Because many statistical methods rely on random samples, applied statisticians often need a source of “random numbers”. Older reference books for use in statistical applications contained tables of random numbers, which were intended to be used in selecting samples or in laying out a design for an experiment. Statisticians now rarely use printed tables of random numbers, but occasionally computer-accessed versions of such tables are used. Far more often, however, the computer is used to generate “random” numbers directly.
James E. Gentle
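As an illustrative sketch (not code from the book), the "minimal standard" multiplicative congruential generator of Park and Miller (1988), cited in later chapters, shows how a computer produces uniform deviates directly. The function name and interface here are my own choices.

```python
def lehmer(seed, n):
    """Generate n uniform deviates in (0, 1) with the Park-Miller
    'minimal standard' multiplicative congruential generator:
    x_{i+1} = 16807 * x_i mod (2^31 - 1), scaled to (0, 1)."""
    m = 2**31 - 1        # Mersenne prime modulus
    a = 16807            # multiplier recommended by Park and Miller (1988)
    x = seed
    out = []
    for _ in range(n):
        x = (a * x) % m  # integer recurrence; never produces 0 for valid seeds
        out.append(x / m)
    return out
```

Given the same seed, the stream is fully reproducible, which is exactly why such deterministic "pseudorandom" sequences replaced printed tables.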

Chapter 2. Transformations of Uniform Deviates: General Methods

Abstract
Sampling of random variates from a nonuniform distribution is usually done by applying a transformation to uniform variates. Each realization of the nonuniform random variable might be obtained from a single uniform variate or from a sequence of uniforms. Some methods that use a sequence of uniforms require that the sequence be independent; other methods use a random walk sequence, a Markov chain.
James E. Gentle
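A minimal sketch of the simplest such transformation, the inverse-CDF method, applied to the exponential distribution (the code and its interface are illustrative, not taken from the book):

```python
import math
import random

def exponential_deviates(lam, n, seed=42):
    """Inverse-CDF method: the Exponential(lam) CDF
    F(x) = 1 - exp(-lam * x) inverts to F^{-1}(u) = -log(1 - u) / lam,
    so applying F^{-1} to uniform deviates yields exponential deviates.
    Each nonuniform variate here consumes exactly one uniform."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]
```

This is a case where a single uniform yields one target variate; other transformations, such as acceptance/rejection, consume a random number of uniforms per variate.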

Chapter 3. Simulating Random Numbers from Specific Distributions

Abstract
For the important distributions, specialized algorithms based on the general methods discussed in the previous chapter are available. The main difference among the algorithms is their speed. A secondary difference is the size and complexity of the program that implements the algorithm. Because all of the algorithms for generating from nonuniform distributions rely on programs to generate from uniform distributions, an algorithm that uses only a small number of uniforms to yield a variate of the target distribution may be faster on a computer system on which the generation of the uniform is very fast. As we have mentioned, on a given computer system there may be more than one program available to generate uniform deviates. Often a portable generator is slower than a nonportable one, so among portable generators for nonuniform distributions, those that require a small number of uniform deviates may be preferable. If evaluation of elementary functions is part of the algorithm for generating random deviates, then the speed of the overall algorithm depends on the speed of the function evaluation, and the relative speed of elementary function evaluation differs from one computer system to another.
James E. Gentle
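A classical example of an algorithm that uses a fixed, small number of uniforms per variate is the Box-Muller transformation for the normal distribution; it also illustrates the dependence on elementary-function evaluation (log, sqrt, cos, sin) noted above. This sketch is illustrative; the function name is my own.

```python
import math
import random

def box_muller_pair(rng):
    """Box-Muller transformation: exactly two independent uniforms
    produce two independent standard normal deviates, at the cost of
    one log, one sqrt, and two trigonometric evaluations."""
    u1 = 1.0 - rng.random()          # shift to (0, 1] so log(u1) is defined
    u2 = rng.random()
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)
```

On a system where uniform generation is cheap but trigonometric functions are slow, a variant such as the polar method, which avoids the cos and sin calls, may be faster even though it consumes a random number of uniforms.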

Chapter 4. Generation of Random Samples and Permutations

Abstract
In applications of statistical techniques as well as in Monte Carlo studies it is often necessary to take a random sample from a given finite set. A common form of random sampling is simple random sampling without replacement, in which a sample of n items is selected from a population of N items in such a way that every subset of size n from the universe of N items has an equal chance of being the sample chosen. This is equivalent to a selection mechanism in which n distinct items are selected, each with equal probability, n/N, and without regard to which other items are selected.
James E. Gentle
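One standard way to achieve this in a single pass is sequential selection sampling (Knuth's Algorithm S): item i is taken with probability equal to the number of items still needed divided by the number of items not yet examined, which makes every size-n subset equally likely. The sketch below is illustrative and not code from the book.

```python
import random

def srs_without_replacement(population, n, seed=0):
    """Simple random sample of size n without replacement, drawn by
    sequential selection: take the next item with probability
    (still needed) / (still unexamined)."""
    rng = random.Random(seed)
    N = len(population)
    sample = []
    chosen = 0
    for i, item in enumerate(population):
        # n - chosen items still needed; N - i items not yet examined
        if rng.random() < (n - chosen) / (N - i):
            sample.append(item)
            chosen += 1
            if chosen == n:
                break
    return sample
```

The method requires one scan of the population and at most N uniform deviates, and it preserves the original ordering of the selected items.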

Chapter 5. Monte Carlo Methods

Abstract
The most common applications of Monte Carlo methods in numerical computations are for evaluating integrals. Monte Carlo methods can also be used in solving systems of equations (see Chapter 7 of Hammersley and Handscomb, 1964, for example), but other methods are generally better, especially for matrices that are not sparse.
James E. Gentle
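The crude Monte Carlo estimator of an integral over [0, 1] is just the sample mean of the integrand at uniform points; a minimal illustrative sketch (names are my own):

```python
import random

def mc_integral(f, n, seed=1):
    """Crude Monte Carlo estimate of the integral of f over [0, 1]:
    the mean of f at n independent uniform points converges to the
    integral, with error of order 1/sqrt(n)."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n
```

For example, applying it to f(x) = x^2 estimates the integral 1/3; variance-reduction techniques such as importance or stratified sampling improve on this crude estimator without changing its basic form.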

Chapter 6. Quality of Random Number Generators

Abstract
Ziff (1992) describes a simulation requiring a total of 6 × 10^12 random numbers and using a few months of computing time on about a dozen workstations running simultaneously. Research like this that depends so heavily on random numbers emphasizes the need for high-quality random number generators. Yet, as we have seen, not all random number generators are good ones (Park and Miller, 1988; Ripley, 1988).
James E. Gentle

Chapter 7. Software for Random Number Generation

Abstract
Random number generators are widely available in a variety of software packages. As Park and Miller (1988) state, however, “good ones are hard to find”.
James E. Gentle

Chapter 8. Monte Carlo Studies in Statistics

Abstract
In statistical inference, certain properties of the test statistic or estimator must be assumed to be known. In simple cases, under rigorous assumptions we have complete knowledge of the statistic. In testing a mean of a normal distribution, we use a t statistic, and we know its exact distribution. In other cases, however, we may have a perfectly reasonable test statistic but know very little about its distribution. For example, suppose a statistic T, computed from a differenced time series, could be used to test the hypothesis that the order of differencing is sufficient to yield a series with a zero mean. If the standard deviation of T is known under the null hypothesis, that value may be used to construct a test that the differencing is adequate. This, in fact, was what Erastus Lyman de Forest did in the 1870s, in one of the earliest documented Monte Carlo studies of a statistical procedure. De Forest studied ways of smoothing a time series by simulating the data using cards drawn from a box. A description of De Forest's Monte Carlo study is given in Stigler (1978). Stigler (1991) also describes other Monte Carlo simulations by nineteenth-century scientists, and suggests that "Simulation, in the modern sense of that term, may be the oldest of the stochastic arts".
James E. Gentle
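De Forest's card-drawing study is, in modern terms, a Monte Carlo estimate of the standard deviation of a statistic under an assumed model. A minimal illustrative sketch of that idea (the function names and the normal-mean example are my own, not from the book):

```python
import random
import statistics

def mc_sd_of_statistic(statistic, draw_sample, n_rep, seed=7):
    """Monte Carlo estimate of the standard deviation of a statistic:
    repeatedly simulate data under the assumed model, compute the
    statistic on each simulated dataset, and take the standard
    deviation of the results."""
    rng = random.Random(seed)
    values = [statistic(draw_sample(rng)) for _ in range(n_rep)]
    return statistics.stdev(values)

# Hypothetical check case: the mean of 25 standard normal deviates
# has true standard deviation 1/sqrt(25) = 0.2.
est = mc_sd_of_statistic(
    statistic=statistics.mean,
    draw_sample=lambda rng: [rng.gauss(0.0, 1.0) for _ in range(25)],
    n_rep=20000,
)
```

The estimated standard deviation can then stand in for the unknown one when constructing an approximate test, just as described above for the statistic T.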