
1995 | Book

Probability, Stochastic Processes, and Queueing Theory

The Mathematics of Computer Performance Modeling

Author: Randolph Nelson

Publisher: Springer New York


About this book

We will occasionally footnote a portion of text with a "**" to indicate that this portion can be initially bypassed. The reasons for bypassing a portion of the text include: the subject is a special topic that will not be referenced later, the material can be skipped on first reading, or the level of mathematics is higher than the rest of the text. In cases where a topic is self-contained, we opt to collect the material into an appendix that can be read by students at their leisure. The material in the text cannot be fully assimilated until one makes it "their own" by applying the material to specific problems. Self-discovery is the best teacher and, although problems are no substitute for an inquiring mind, problems that explore the subject from different viewpoints can often help the student to think about the material in a uniquely personal way. With this in mind, we have made problems an integral part of this work and have attempted to make them interesting as well as informative.

Table of Contents

Frontmatter

Introduction

1. Introduction
Abstract
Probability is an exquisitely beautiful field of mathematics that is rich in its depth of deductive reasoning and its diversity of applications. The theory started in the early 17th century with simple counting arguments that were used to answer questions concerning possible outcomes of games of chance. Over the ensuing years, probability has been established as a key tool in a wide range of fields that are as diverse as biology, chemistry, computer science, finance, medicine, physics, political science and sociology. This breadth of application, also found in calculus which dates from the same era, is a consequence of the fact that the theory has its roots in a branch of mathematics that is blessed with a compelling intuitive component. The full beauty of the subject can be best appreciated when its formal and intuitive aspects are melded together with applications. This motivates our goal: to derive probability in a manner that highlights the complementary nature of its formal, intuitive and applicative aspects.
Randolph Nelson

Probability

Frontmatter
2. Randomness and Probability
Abstract
What do we mean by saying that an event occurs randomly with a certain probability? Answering this simple question is the objective of this chapter and our answer forms the foundation for all of the material presented in the text. The existence of randomness is often taken for granted, much like the existence of straight lines in geometry, and the assignment of probabilities to events is often assumed axiomatically. The pervasive use of the word “probability” seems to imply that randomness is the rule rather than the exception. One hears of probabilities being ascribed to such diverse things as the weather, the electability of a presidential candidate, the genetic makeup of a child, the outcome of a sporting event, the chance of winning at blackjack, the behavior of elementary particles, the future marriage prospects of two friends, quantum physics, and the wonderfully incessant gyrations of the stock market.
Randolph Nelson
3. Combinatorics
Abstract
Probabilistic arguments can often be reduced to techniques where one enumerates all possibilities. These techniques are based on the Law of Total Probability and are most useful when there is a set of equally probable basic events and when events of interest consist of combinations of these basic events. In this chapter we focus on such enumerative techniques and derive formal counting techniques collectively called combinatorics. We define four different counting paradigms in Section 3.2, which are expressed in terms of selecting distinguishable balls from an urn or, equivalently, in terms of allocating indistinguishable balls to different urns. To derive expressions for the number of different possibilities for each counting paradigm, we establish recurrence relationships relating the number of possibilities in the initial problem to that of a similar, but smaller, problem. This type of argument corresponds to partitioning the counting space into disjoint sets, which are smaller and easier to count.
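The recurrence argument described above can be sketched concretely. The following minimal sketch (the function name is illustrative, not the book's notation) counts the ways to choose k balls from n distinguishable balls by partitioning on whether one fixed ball is selected:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def count(n, k):
    # Partition on a fixed ball: either it is selected (choose the
    # remaining k-1 balls from the other n-1) or it is not selected
    # (choose all k balls from the other n-1).
    if k < 0 or k > n:
        return 0
    if k == 0 or k == n:
        return 1
    return count(n - 1, k - 1) + count(n - 1, k)
```

This is exactly the partition-into-disjoint-sets argument: the two recursive calls count two disjoint, smaller problems whose totals sum to the original count.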
Randolph Nelson
4. Random Variables and Distributions
Abstract
This chapter continues developing the theory of probability by defining random variables, which are functions defined on the sample space of a probability space. Most stochastic models are expressed in terms of random variables: the number of customers in a queue, the fraction of time a processor is busy, and the amount of time there are fewer than k customers in the network are all examples of random variables. It is typically more convenient to work with random variables than with the underlying basic events of a sample space. Unlike a variable in algebra, which represents a deterministic value, a random variable represents the outcome of a random experiment and can thus only be characterized by its probabilistic outcome. To characterize these possibilities, we specify a distribution function that expresses the probability that a random variable has a value within a certain range. For example, if we let X_k be the number of heads obtained in k tosses of a fair coin, then the value of X_k is random and can range from 0 to k. The probability that there are at most ℓ heads, denoted by P[X_k ≤ ℓ], can be calculated by enumerating all possible outcomes and is given by
$$P[X_k \leqslant \ell ] = 2^{-k}\sum\limits_{i = 0}^{\ell} \binom{k}{i}$$
(4.1)
(notice that this is the enigmatic summation of (3.36)). The set of values of P[X_k ≤ ℓ] for all ℓ is called the distribution function of the random variable X_k. In this chapter we will show that a distribution function can be used in place of a probability space when specifying the outcome of a random experiment. This is the basis for most probabilistic models, which specify the distributions of random variables of interest but do not specify the underlying probability space. In fact, it is very rare in models to find a specification of a probability space, since this is usually understood from the context of the model.
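Equation (4.1) can be checked by direct enumeration; the minimal sketch below (the function name is illustrative) evaluates P[X_k ≤ ℓ]:

```python
from math import comb

def coin_cdf(k, ell):
    # P[X_k <= ell]: of the 2^k equally likely outcomes of k fair
    # tosses, comb(k, i) of them have exactly i heads, for i <= ell.
    return sum(comb(k, i) for i in range(ell + 1)) / 2**k
```

For instance, coin_cdf(2, 1) returns 0.75, matching the three of the four equally likely outcomes {TT, TH, HT} that have at most one head.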
Randolph Nelson
5. Expectation and Fundamental Theorems
Abstract
In the previous chapter we defined random variables and established some of their properties. Like variables in algebra, random variables have values but these values can only be determined from random experiments. To characterize all possible outcomes from such an experiment we use distribution functions, which provide a complete specification of a random variable without the necessity of defining an underlying probability space. Specifying distribution functions poses no inherent mathematical difficulties. However, in applications it is frequently not possible to determine the complete distribution function. Often a simpler representation is possible; instead of specifying the entire function we specify a set of statistical values. A statistical value is the average of a function of a random variable. This can be thought of as the value that is obtained from averaging the outcomes of a large number of random experiments as discussed in the frequency-based approach to probability of Section 2.1. In the axiomatic framework of Section 2.2 such an average corresponds to the operation of expectation applied to a function of a random variable.
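The frequency-based reading of expectation described above can be sketched in a few lines: repeat the random experiment many times and average g(X). All names here are illustrative assumptions, not the book's notation:

```python
import random

def empirical_mean(experiment, g, n=100_000, seed=0):
    # Average g(X) over n independent repetitions of the experiment --
    # the frequency-based estimate of the expectation E[g(X)].
    rng = random.Random(seed)
    return sum(g(experiment(rng)) for _ in range(n)) / n

# Example experiment: X = number of heads in 10 fair coin tosses,
# for which E[X] = 5.
def toss10(rng):
    return sum(rng.random() < 0.5 for _ in range(10))
```

Averaging a large number of trials of toss10 yields a value close to 5, the expectation obtained from the axiomatic framework.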
Randolph Nelson

Stochastic Processes

Frontmatter
6. The Poisson Process and Renewal Theory
Abstract
In this chapter we analyze a stochastic process termed a renewal process. A stochastic process is a set of random variables {X(t) : t ∈ T} defined on a common probability space. It is typical to think of t as time, T as a set of points in time, and X(t) as the value or state of the stochastic process at time t. We classify processes according to time and say that they are discrete time or continuous time depending on whether T is discrete (finite or countably infinite) or continuous. If T is discrete then we typically index the stochastic process with integers, that is, {X_1, X_2, ...}. In Part I of the book all random variables that we considered either were independent or had a very simple form of dependency (e.g., they were correlated). Stochastic processes are introduced as a way to capture more complex forms of dependency between sets of random variables.
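A renewal process is straightforward to simulate from its definition: event times are partial sums of i.i.d. interarrival times. The sketch below (function names are mine) recovers the Poisson process as the special case of exponentially distributed interarrivals:

```python
import random

def renewal_times(rng, draw_interarrival, horizon):
    # Event times in [0, horizon]: partial sums of i.i.d. interarrivals.
    times, t = [], 0.0
    while True:
        t += draw_interarrival(rng)
        if t > horizon:
            return times
        times.append(t)

def poisson_times(rng, lam, horizon):
    # A rate-lam Poisson process has exponential(lam) interarrivals.
    return renewal_times(rng, lambda r: r.expovariate(lam), horizon)
```

Over a long horizon the number of events divided by the horizon length approaches the rate lam, the renewal-theoretic long-run rate.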
Randolph Nelson
7. The M/G/1 Queue
Abstract
In this chapter we analyze a simple single server queue that is frequently used to model components of computer systems. This queue is termed the M/G/1 queue. This is standard “queueing notation,” first introduced by Kendall. Typically a queue is described by four variables
$$ A/S/k/c, $$
which have the following interpretation:
A, S — The arrival (A) or service (S) process, where M means Poisson arrivals and exponential service times, G means the process is generally distributed, E_ℓ denotes an ℓ-stage Erlang distribution, and D denotes a deterministic distribution.
k — The number of servers.
c — The buffer size of the system. This is the total number of customers the system can hold. Arrivals to the system already containing c customers are assumed to be lost. If not specified, c is assumed to be infinite.
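As a minimal sketch of where this notation leads, the mean waiting time of the M/G/1 queue is given by the Pollaczek-Khinchine formula, which depends on the service distribution only through its first two moments (the function name below is mine):

```python
def mg1_mean_wait(lam, es, es2):
    # Mean waiting time in queue for M/G/1 (Pollaczek-Khinchine):
    #   W = lam * E[S^2] / (2 * (1 - rho)),  where rho = lam * E[S].
    # The queue is stable only when rho < 1.
    rho = lam * es
    if rho >= 1.0:
        raise ValueError("unstable queue: rho >= 1")
    return lam * es2 / (2.0 * (1.0 - rho))
```

With exponential service at rate mu = 2 and lam = 1 (so E[S] = 0.5 and E[S^2] = 0.5), this gives W = 0.5, agreeing with the M/M/1 value rho/(mu − lam).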
Randolph Nelson
8. Markov Processes
Abstract
In this chapter we consider an important type of stochastic process called the Markov process. A Markov process is a stochastic process that has a limited form of “historical” dependency. To precisely define this dependency, let {X(t) : t ∈ T} be a stochastic process defined on the parameter set T. We will think of T in terms of time, and the values that X(t) can assume are called states, which are elements of a state space S.
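The limited historical dependency is simple to express in code: the next state is drawn from a distribution that depends on the past only through the current state. A minimal discrete-time sketch (all names are mine):

```python
import random

def simulate_chain(P, states, x0, n, seed=0):
    # P maps each state to transition weights aligned with `states`.
    # The Markov property: the next state depends on the history only
    # through the current state x.
    rng = random.Random(seed)
    path, x = [x0], x0
    for _ in range(n):
        x = rng.choices(states, weights=P[x])[0]
        path.append(x)
    return path
```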
Randolph Nelson
9. Matrix Geometric Solutions
Abstract
In Example 8.12 we analyzed a scalar state process that was a modification of the M/M/1 queue. In the example, we classified two sets of states: boundary states and repeating states. Transitions between the repeating states had the property that rates between states 2, 3, ... were congruent to rates between states j, j + 1, ... for all j ≥ 2. We noted in the example that this implied that the stationary distribution for the repeating portion of the process satisfied a geometric form. In this chapter we generalize this result to vector state processes that also have a repetitive structure. The technique we develop in this chapter to solve for the stationary state probabilities for such vector state Markov processes is called the matrix geometric method. (The theory of matrix geometric solutions was pioneered by Marcel Neuts; see [86] for a full development of the theory.) In much the same way that the repetition of the state transitions for this variation of the M/M/1 queue considered in Example 8.12 implied a geometric solution (with modifications made to account for boundary states), the repetition of the state transitions for vector processes implies a geometric form where scalars are replaced by matrices. We term such Markov processes matrix geometric processes.
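The scalar analogue of the matrix geometric method is visible already in the M/M/1 queue: the repeating balance equations λπ_{n-1} − (λ+μ)π_n + μπ_{n+1} = 0 admit a geometric solution π_n ∝ r^n, with r the minimal nonnegative root of λ − (λ+μ)r + μr² = 0. The sketch below (a scalar stand-in for the fixed-point iteration used to compute the matrix R; the function name is mine) finds that root by successive substitution:

```python
def minimal_root(lam, mu, tol=1e-12):
    # Iterate r <- (lam + mu * r^2) / (lam + mu) starting from r = 0;
    # the iterates increase monotonically to the minimal nonnegative
    # root (cf. the analogous iteration for the matrix R).
    r = 0.0
    while True:
        r_next = (lam + mu * r * r) / (lam + mu)
        if abs(r_next - r) < tol:
            return r_next
        r = r_next
```

For lam = 1, mu = 2 the iteration converges to rho = 0.5, recovering the familiar geometric stationary distribution pi_n = (1 − rho) rho^n.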
Randolph Nelson
10. Queueing Networks
Abstract
In this chapter we derive the mathematics of product form queueing networks. In such networks the stationary distribution of the network is the product of the distributions of each queue analyzed in isolation from the network (for closed networks this is subject to a normalization constant). When first encountered, such a solution is enigmatic since for open networks it implies independence of the stationary distributions of the individual queues, and for closed networks it implies that the dependence between the queues is captured by normalizing the independent solution over a truncated state space. The derivations presented in this chapter provide insight into why simple solutions of this type exist for such complex networks.
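Product form is concrete in the simplest open network: two exponential queues in tandem fed by Poisson arrivals, where the joint stationary distribution factors into two M/M/1 marginals. A minimal sketch (the function name is mine):

```python
def jackson_tandem_pi(lam, mu1, mu2, n1, n2):
    # Product form: each queue i behaves as an M/M/1 queue with
    # utilization rho_i = lam / mu_i, and the joint stationary
    # probability of (n1, n2) is the product of the two marginals.
    r1, r2 = lam / mu1, lam / mu2
    return (1 - r1) * r1**n1 * (1 - r2) * r2**n2
```

The factored form sums to 1 over all (n1, n2), which is the sense in which the open-network solution implies independence of the individual queue distributions.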
Randolph Nelson
11. Epilogue and Special Topics
Abstract
The last ten chapters and the five appendices have encompassed a wide range of mathematics that is derivable from the axioms of probability. Lest the reader think that we have come to the end of the subject of probability, we allow ourselves in this final parting the indulgence of mentioning some extensions that naturally follow from the development in the text. Each of the three results that are derived in this epilogue can be considered to be the start of a field of mathematics; it is hoped that the brief descriptions that follow will whet the students’ appetite to continue their study. It is more appropriate to view this closing chapter as a beginning rather than an end. This highlights the most intriguing aspect of mathematics — that it appears to be unending.
Randolph Nelson
Backmatter
Metadata
Title
Probability, Stochastic Processes, and Queueing Theory
Author
Randolph Nelson
Copyright Year
1995
Publisher
Springer New York
Electronic ISBN
978-1-4757-2426-4
Print ISBN
978-1-4419-2846-7
DOI
https://doi.org/10.1007/978-1-4757-2426-4