2022 | Book

Handbook of Dynamics and Probability

About this book

Our time is characterized by an explosive growth in the use of ever more complicated and sophisticated (computer) models. These models rely on dynamical systems theory for the interpretation of their results and on probability theory for the quantification of their uncertainties. A conscientious and intelligent use of these models requires that both these theories are properly understood.

This book aims to provide such an understanding. It gives a unifying treatment of dynamical systems theory and probability theory. It covers the basic concepts and statements of these theories, their interrelations, and their applications to scientific reasoning and physics. The book stresses the underlying concepts and mathematical structures but is written in a simple and illuminating manner without sacrificing too much mathematical rigor.

The book is aimed at students, post-docs, and researchers in the applied sciences who aspire to better understand the conceptual and mathematical underpinnings of the models that they use. Despite the peculiarities of any applied science, dynamics and probability are the common and indispensable tools in any modeling effort. The book is self-contained, with many technical aspects covered in appendices, but does require some basic knowledge in analysis, linear algebra, and physics.

Peter Müller, now a professor emeritus at the University of Hawaii, has worked extensively on ocean and climate models and the foundations of complex system theories.

Table of Contents

Frontmatter
Chapter 1. Introduction
Abstract
Dynamical systems theory and probability theory are basic tools of science. They are mathematical theories. They are used to describe, understand, model, and predict systems in quantitative sciences such as physics, biology, and economics. The time evolution of many of these systems is governed by dynamical laws that can be cast into a common mathematical form that makes them a dynamical system. Dynamical systems theory studies the properties of these systems independent of the context in which they arise. Science also has to deal with uncertainty. We do not know with certainty whether an event will occur under certain circumstances or as the outcome of a certain experiment. Mathematicians have developed the concept of probability to quantify uncertainty. Uncertainty may be intrinsic, quantum mechanics being the prime example; or it may be epistemic, due to our lack of complete knowledge or control. Though probability theory grew out of the analysis of games of chance, it is also an axiomatic theory independent of any specific context. Part I and Part II of this book review dynamical systems theory and probability theory.
Peter Müller

Dynamics

Frontmatter
Chapter 2. Deterministic Dynamical Systems
Abstract
In this chapter, we first give the general and abstract definition of a deterministic dynamical system. This definition has as its basic ingredients a state space whose points characterize the state of the system and a family of maps that determines the time evolution of states according to a dynamical law. The state space comprises the minimal set of variables that determines all properties of interest. To qualify as a dynamical system, the family of maps must satisfy a composition rule to conform with the additive structure of time.
Peter Müller
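A minimal sketch (not taken from the book) of the structure described in this abstract: a state space, a dynamical law, and the family of time-t maps it generates, together with a check of the composition rule. The logistic map and its parameter r are purely hypothetical choices for illustration.
```python
# Minimal sketch of a discrete-time deterministic dynamical system:
# a state space (here the interval [0, 1]) and a family of maps phi_n
# generated by iterating a single dynamical law.

def law(x, r=3.7):
    """One time step of a hypothetical dynamical law (logistic map)."""
    return r * x * (1.0 - x)

def phi(n, x):
    """Time-n map: apply the dynamical law n times to the state x."""
    for _ in range(n):
        x = law(x)
    return x

x0 = 0.2
n, m = 3, 5
# Composition rule: evolving for n+m steps equals evolving m steps, then n steps.
assert abs(phi(n + m, x0) - phi(n, phi(m, x0))) < 1e-12
print(phi(n + m, x0))
```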
Chapter 3. Linear and Integrable Systems
Abstract
This chapter considers linear dynamical systems. The concept of linearity requires the state space to be a vector space. The elements of the state space can thus be added and multiplied by real (complex) numbers. The time evolution of linear systems is represented by a linear operator on this vector space. The prime example of a linear dynamical system is the time evolution of a quantum mechanical system according to the Schrödinger equation. Linear algebra provides powerful tools for the analysis of linear operators and hence of linear systems.
Peter Müller
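As an illustration of linearity (my own sketch, not an example from the book), the snippet below evolves states with a matrix, a linear operator on a two-dimensional vector space, and verifies the superposition property. The matrix A is a hypothetical choice.
```python
# Sketch: a linear discrete-time system x_{n+1} = A x_n.
# Linearity means the time evolution respects superposition of states.
import numpy as np

A = np.array([[0.9, 0.2],    # hypothetical linear evolution operator
              [-0.1, 0.8]])

def evolve(x, n):
    """Apply the linear operator n times, i.e. x -> A^n x."""
    return np.linalg.matrix_power(A, n) @ x

x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
a, b = 2.0, -3.0
n = 10
# Superposition: evolving a linear combination equals the combination of evolutions.
lhs = evolve(a * x + b * y, n)
rhs = a * evolve(x, n) + b * evolve(y, n)
print(np.allclose(lhs, rhs))  # True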
Chapter 4. Stability and Long-Time Behavior
Abstract
Dynamical systems theory aims at an understanding of the complete dynamical system, not just at documenting the properties of a particular solution. Essential elements of such an understanding are the stability of orbits and limiting sets. The stability of orbits addresses the question of whether or not two orbits that start out close together stay close together or eventually diverge from each other. Limiting sets address the question of the long-term behavior of non-periodic orbits.
Peter Müller
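The question posed in this abstract, whether nearby orbits stay close, can be probed numerically. The sketch below (an illustration of mine, not the book's example) tracks the separation of two nearby orbits of the logistic map, a hypothetical choice of system.
```python
# Sketch: do two orbits that start close together stay close together?
# Track the separation of two nearby orbits of the logistic map (hypothetical example).

def step(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9   # two initial conditions a distance 1e-9 apart
for n in range(1, 41):
    x, y = step(x), step(y)
    if n % 10 == 0:
        print(f"step {n:2d}: separation = {abs(x - y):.3e}")
# For r = 4 the separation grows roughly exponentially, a signature of instability
# (sensitive dependence on initial conditions).
```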
Chapter 5. Ergodic Theory
Abstract
The evolution of volumes provides useful information about dynamical systems. Conservative systems that conserve volume cannot have attractors; only dissipative systems that do not conserve volume can. Volume is but a special kind of measure, and studying the behavior of more general measures provides profound insights into dynamical systems, allowing one to address questions such as whether points return to the set from which they originated, whether they visit other sets, what the averaged behavior of observables along trajectories is, and what may be regarded as typical behavior. Measures are studied by measure theory, a well-developed mathematical theory. The basis of measure theory is a measure space: a triple that consists of a basic set, a \(\sigma \)-algebra, which is the family of its measurable subsets, and a measure that assigns to each measurable subset a non-negative real number, the measure of that subset. We thus assume the state space of the dynamical system to be a measure space. To make the dynamical system compatible with the measure space, one assumes that the dynamical map is a measurable function, similarly to assuming that the dynamical map is a continuous function when studying the topological properties of systems. In this chapter we thus consider the quadruple consisting of a state space, a \(\sigma \)-algebra, a measure, and a family of measurable dynamical maps. Such systems are studied by ergodic theory.
Peter Müller
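One of the questions listed above, the averaged behavior of observables along trajectories, is easy to illustrate numerically. The sketch below (my own illustration, assuming the irrational rotation as an example of an ergodic map preserving Lebesgue measure) compares a time average along one orbit with the corresponding space average.
```python
# Sketch: Birkhoff time average of an observable along an orbit, compared with the
# space average over the invariant measure. The irrational rotation
# x -> (x + alpha) mod 1 preserves Lebesgue measure on [0, 1) and is ergodic,
# so for a typical orbit the two averages should agree.
import math

alpha = math.sqrt(2) - 1          # an irrational rotation angle
f = lambda x: x                   # observable; its space average is 1/2

x, total, N = 0.1, 0.0, 100_000
for _ in range(N):
    total += f(x)
    x = (x + alpha) % 1.0

print("time average :", total / N)   # close to 0.5
print("space average:", 0.5)
```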
Chapter 6. Numerical Algorithms
Abstract
An important class of discrete dynamical systems arises when continuous dynamical systems are discretized to be solved numerically on a computer. The main concern is to what extent the numerical discrete solution mimics or resembles the solution of the original continuous system. This chapter describes some basic properties of the algorithms that are used to obtain numerical solutions for ODEs and PDEs. It is an abridgment of a chapter in Müller and von Storch (2004). Numerical algorithms executed on a digital computer have to introduce a finite time step, and a finite grid size in the case of PDEs. Numerical algorithms introduce two kinds of basic errors: round-off errors and truncation errors. Round-off errors occur since digital computers have only a finite word length, a finite number of bits to represent real numbers. Round-off errors are a property of the machine on which the numerical code is executed. Truncation errors occur when the derivative of a function is replaced by a finite difference approximation. Truncation errors are under the control of the modeler. They can be reduced by decreasing the time step or grid spacing.
Peter Müller
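The interplay of the two error types mentioned above can be seen in a one-line finite-difference experiment. The sketch below (an illustration of mine, using sin as a hypothetical test function) shows the error of a forward-difference derivative shrinking with the step size until round-off error takes over.
```python
# Sketch: the two basic error types of a finite-difference derivative.
# The forward difference (f(x+h) - f(x)) / h has a truncation error O(h); making h
# smaller reduces it, until round-off error (finite word length) takes over.
import math

f, dfdx, x = math.sin, math.cos, 1.0
for h in [10.0**(-k) for k in range(1, 13, 2)]:
    approx = (f(x + h) - f(x)) / h
    print(f"h = {h:7.0e}   error = {abs(approx - dfdx(x)):.2e}")
```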

Probability

Frontmatter
Chapter 7. Probability Theory
Abstract
This chapter gives the axiomatic approach to probability theory, due to Kolmogorov. It defines the probability space and then introduces the important notions of conditional probabilities and statistical independence. The central concepts of a random variable, its probability distribution, and its expectation are introduced next. Moments, characteristic functions, generating functions, and escort distributions are various ways to describe the probability distributions of random variables. Then families of two or a finite number of random variables are considered, so-called bi- and multivariate random variables. Finally, we consider two important theorems which describe the limiting behavior of mean values, the weak and the strong law of large numbers. Limits of sequences of random variables can also be used to define probability distributions. One example is the central limit theorem, which establishes the importance of the Gaussian distribution. Another example is provided by the generalized extreme value distributions.
Peter Müller
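The two limit theorems named in this abstract can be demonstrated empirically. The sketch below (my own illustration, using fair coin flips as a hypothetical example) shows sample means settling toward the expectation and standardized means having unit spread, as the central limit theorem predicts.
```python
# Sketch: law of large numbers and central limit theorem for coin flips.
import random, statistics

random.seed(0)

def sample_mean(n):
    """Mean of n independent fair coin flips (0 or 1); expectation is 0.5."""
    return sum(random.randint(0, 1) for _ in range(n)) / n

# Law of large numbers: the sample mean approaches the expectation as n grows.
for n in (10, 1_000, 100_000):
    print(f"n = {n:6d}   sample mean = {sample_mean(n):.4f}")

# Central limit theorem: standardized sample means are approximately Gaussian,
# so their standard deviation over many repetitions is close to 1.
n, reps = 1_000, 2_000
z = [(sample_mean(n) - 0.5) / (0.5 / n**0.5) for _ in range(reps)]
print("std of standardized means:", round(statistics.pstdev(z), 3))  # close to 1
```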
Chapter 8. Discrete-Time Stochastic Processes
Abstract
Here we start to extend the concept of random variables to families of random variables \((f_t)_{t\in T}\) where the index set is either \(T=\mathbb{Z}\) (\(\mathbb{N}\)) or \(T=\mathbb{R}\) (\(\mathbb{R}_+\)). The first case represents discrete-time stochastic processes, the second case represents continuous-time stochastic processes. We thus progress up through the general classification of families of random variables.
Peter Müller
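A concrete instance of a discrete-time stochastic process (my own sketch, not the book's example) is the simple symmetric random walk; each run of the code below produces one realization of the family \((f_t)_{t\in\mathbb{N}}\).
```python
# Sketch: a discrete-time stochastic process (f_t), t = 0, 1, 2, ...
# Here f_t is a simple symmetric random walk: each realization is one sample path.
import random

def random_walk(n_steps, seed):
    random.seed(seed)
    path, position = [0], 0
    for _ in range(n_steps):
        position += random.choice((-1, +1))
        path.append(position)
    return path

# Two different realizations of the same process.
print(random_walk(10, seed=1))
print(random_walk(10, seed=2))
```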
Chapter 9. Continuous-Time Stochastic Processes
Abstract
In this chapter we consider families of random variables \((f_t)_{t\in T}\) where the time index set is continuous, either \(T=\mathbb{R}\) or \(T=\mathbb{R}_+\). We call these families continuous-time stochastic processes.
Peter Müller
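On a computer a continuous-time process can only be sampled on a finite time grid. The sketch below (an illustration of mine, assuming the Wiener process with independent Gaussian increments as the example) generates such a sampled path.
```python
# Sketch: a continuous-time stochastic process sampled on a time grid.
# A Wiener process W_t has independent Gaussian increments with variance dt.
import random

random.seed(0)
T, n = 1.0, 10                 # total time and number of grid points
dt = T / n
t, W = 0.0, 0.0
for _ in range(n):
    W += random.gauss(0.0, dt**0.5)   # increment ~ N(0, dt)
    t += dt
    print(f"t = {t:.1f}   W_t = {W:+.4f}")
```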
Chapter 10. Information Entropy
Abstract
The concepts of information content and entropy grew out of attempts to quantify randomness. Early ideas go back to Boltzmann, but the definitive step was taken by Claude Shannon (1948) in his treatise “A Mathematical Theory of Communication”, which laid the groundwork for what is known today as information theory. More recent references are Mackey (2003) and Cover and Thomas (2005). Information theory studies sequences of letters or symbols from a finite alphabet. It presumes that a source produces these letters with a given probability distribution and then studies how information is degraded when storing, processing, or transmitting data.
Peter Müller
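For a source over a finite alphabet, the Shannon entropy is simply \(H = -\sum_i p_i \log_2 p_i\). The sketch below (my own illustration with hypothetical letter distributions) computes it and shows that the uniform distribution maximizes the entropy.
```python
# Sketch: Shannon entropy (in bits) of a source over a finite alphabet.
import math

def entropy(probabilities):
    """H = -sum p * log2(p), ignoring zero-probability letters."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]     # maximally uncertain 4-letter source
skewed  = [0.70, 0.15, 0.10, 0.05]     # hypothetical biased source

print(entropy(uniform))   # 2.0 bits, the maximum for 4 letters
print(entropy(skewed))    # less than 2 bits
```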

Probability in Dynamical Evolution

Frontmatter
Chapter 11. Time Evolution of Broadened Initial States
Abstract
In this chapter, we study the time evolution of broadened initial conditions under the action of a dynamical map. The broadened initial conditions are represented by probability density functions, in contrast to pure initial conditions which are represented by points. The evolution of densities is a special case of the evolution of measures which is studied in the chapter on ergodic theory.
Peter Müller
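A broadened initial condition can be mimicked numerically by an ensemble of points. The sketch below (a crude Monte Carlo stand-in of mine, not the book's density formalism) pushes such an ensemble forward under a hypothetical map and histograms the result as an approximation to the evolved density.
```python
# Sketch: evolving a broadened initial state, represented here by an ensemble of
# points, under a dynamical map, and histogramming the result as a crude
# approximation to the evolved probability density.
import random

def step(x, r=4.0):                 # hypothetical dynamical map (logistic map)
    return r * x * (1.0 - x)

random.seed(0)
ensemble = [random.uniform(0.45, 0.55) for _ in range(50_000)]  # broadened initial state

for _ in range(5):                  # evolve every ensemble member 5 time steps
    ensemble = [step(x) for x in ensemble]

bins = [0] * 10                     # coarse histogram of the evolved density on [0, 1]
for x in ensemble:
    bins[min(int(x * 10), 9)] += 1
print([round(b / len(ensemble), 3) for b in bins])
```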
Chapter 12. Stochastic Processes Generated by Observables
Abstract
Broadened or random initial conditions inject randomness into dynamical systems. When only the time evolution of the dynamical system is considered, this randomness is of a very limited kind. The time evolution of random initial conditions remains deterministic and is given by the linear Frobenius–Perron operator. This situation changes when the evolution of observables is considered, since observables may be many-to-one functions. An initial value of the observable might then not uniquely determine the future values of this observable.
Peter Müller
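The point about many-to-one observables can be made with a toy example (my own, with the logistic map and a coarse-graining observable as hypothetical choices): two states with the same observable value generate different observable sequences.
```python
# Sketch: a many-to-one observable turns deterministic dynamics into an apparently
# random process. The observable only records in which half of the interval the
# state lies, so states with the same observable value can have different futures.

def step(x, r=4.0):                    # hypothetical deterministic map (logistic map)
    return r * x * (1.0 - x)

def observe(x):                        # many-to-one observable: left half or right half
    return 0 if x < 0.5 else 1

for x in (0.30, 0.10):                 # two states with the same initial observation
    symbols = []
    for _ in range(8):
        symbols.append(observe(x))
        x = step(x)
    print(symbols)
# Both sequences start with 0, yet they soon differ: the initial value of the
# observable does not determine its future values.
```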
Chapter 13. Stochastic Dynamical Systems
Abstract
Randomness is introduced into dynamical systems for many reasons. A dynamical system might depend on a parameter that is not well determined. Assuming this parameter to be a random variable results in a system where many trajectories emanate from a point in state space, one for each realization of the random variable, and one can analyze the spread of these trajectories to study how sensitive the dynamical system is to the exact value of the parameter.
Peter Müller
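The sketch below (my own illustration; the logistic map and the parameter range are hypothetical) launches many trajectories from one initial state, one per realization of a random parameter, and quantifies their spread.
```python
# Sketch: a dynamical system with an uncertain parameter. One initial state, many
# trajectories -- one per realization of the random parameter -- and the spread of
# the end states measures the sensitivity to the exact parameter value.
import random, statistics

def trajectory_end(x0, r, n_steps=20):
    x = x0
    for _ in range(n_steps):
        x = r * x * (1.0 - x)          # hypothetical dynamical law (logistic map)
    return x

random.seed(0)
x0 = 0.2
ends = [trajectory_end(x0, r=random.uniform(3.6, 3.8)) for _ in range(1_000)]
print("mean of end states:", round(statistics.mean(ends), 3))
print("spread (std dev)  :", round(statistics.stdev(ends), 3))
```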

Probability in Scientific Reasoning

Frontmatter
Chapter 14. Interpretations of Probability
Abstract
Probability is a mathematical construct that satisfies Kolmogorov’s axioms. Probability theory tells us how to manipulate given probabilities according to the logical implications of these axioms. What things in the real world might correspond to this mathematical construct is a contentious issue. Applying probability theory to the real world requires one first to identify ‘events’ to which probabilities can be assigned. We consider four main classes of ‘events’ or four interpretations of probability (Guttman 1999).
Peter Müller
Chapter 15. Parameter Estimation and Hypothesis Testing
Abstract
The task of determining probabilities is generally referred to as statistics. Statistics analyzes evidence: observations, data, or samples. These samples are regarded as realizations of random variables. Regarding a sample as a realization of a random variable is the basic assumption of statistics. A major sub-discipline of statistics is statistical inference. It provides methods for inferring properties of the assumed underlying probability distribution from the sample. Other sub-disciplines deal with problems such as what constitutes a good sampling strategy and an appropriate probability model. Given a probability model that is characterized by a set of parameters, statistical inference addresses two main problems: the estimation of parameters and the testing of hypotheses about these parameters.
Peter Müller
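The two problems named at the end of this abstract can be illustrated with a textbook example (mine, not the book's): estimating the mean of a Gaussian sample and testing a hypothesis about it with a z-statistic; the assumed Gaussian model with known standard deviation is a simplifying assumption made here.
```python
# Sketch: frequentist parameter estimation and a simple test. The sample is regarded
# as a realization of n independent Gaussian random variables with unknown mean mu
# and known standard deviation sigma (an assumption made for this illustration).
import random, statistics

random.seed(0)
mu_true, sigma, n = 1.3, 2.0, 100
sample = [random.gauss(mu_true, sigma) for _ in range(n)]

mu_hat = statistics.mean(sample)                 # point estimate of mu
print("estimate of mu:", round(mu_hat, 3))

# Test the null hypothesis H0: mu = 1.0 with a z-statistic.
z = (mu_hat - 1.0) / (sigma / n**0.5)
print("z-statistic   :", round(z, 2), "(reject H0 at the 5% level if |z| > 1.96)")
```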
Chapter 16. Bayesian Inferences
Abstract
Bayesian inference also addresses the problem of parameter estimation and hypothesis testing. It differs from the frequentist view of the previous chapter in various respects.
Peter Müller
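A minimal Bayesian counterpart to the previous chapter's estimation problem (my own sketch, with a conjugate beta-binomial model and hypothetical data): the prior on a coin's success probability is updated by the observed counts.
```python
# Sketch: a Bayesian update with a conjugate prior. The success probability p of a
# coin gets a Beta(a, b) prior; after observing k heads in n tosses the posterior
# is Beta(a + k, b + n - k).
a, b = 1.0, 1.0              # flat prior on p
k, n = 7, 10                 # hypothetical data: 7 heads in 10 tosses

a_post, b_post = a + k, b + (n - k)
posterior_mean = a_post / (a_post + b_post)

print("posterior:", f"Beta({a_post:.0f}, {b_post:.0f})")
print("posterior mean of p:", round(posterior_mean, 3))   # 8/12 = 0.667
```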

Probability in Physics

Frontmatter
Chapter 17. Equilibrium Statistical Mechanics
Abstract
Statistical mechanics considers macroscopic systems that are governed by phenomenological laws such as the laws of thermodynamics or continuum mechanics. It attempts to derive these laws from the underlying microscopic system. Our main example of such a microscopic system will be a classical N-particle system governed by a Hamilton function. Statistical mechanics is most advanced for systems that are in thermodynamic equilibrium; we consider this case in this chapter and the non-equilibrium case in the next chapter.
Peter Müller
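As a generic equilibrium computation (my own sketch over a few hypothetical discrete energy levels, not the book's N-particle example), the snippet below forms Boltzmann weights, normalizes them by the partition function, and evaluates the mean energy.
```python
# Sketch: a canonical-ensemble average over a few discrete energy levels.
# Equilibrium probabilities are Boltzmann weights exp(-E/kT) normalized by the
# partition function; the mean energy is the ensemble average of E.
import math

energies = [0.0, 1.0, 2.0, 3.0]        # hypothetical energy levels (arbitrary units)
kT = 1.5                               # temperature in the same units

weights = [math.exp(-E / kT) for E in energies]
Z = sum(weights)                       # partition function
probs = [w / Z for w in weights]
mean_energy = sum(p * E for p, E in zip(probs, energies))

print("occupation probabilities:", [round(p, 3) for p in probs])
print("mean energy:", round(mean_energy, 3))
```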
Chapter 18. Non-equilibrium Statistical Mechanics
Abstract
In this chapter we consider macroscopic systems that are not in equilibrium. Examples are fluids and gases as described by fluid dynamics, deformable bodies as described by the theory of elasticity, and electromagnetic fields and charges and currents as described by the macroscopic Maxwell equations.
Peter Müller
Chapter 19. Foundational Issues of Statistical Mechanics
Abstract
Statistical mechanics relates macroscopic properties to the underlying microscopic dynamics. Though statistical mechanics is highly successful on a technical level (properties of macroscopic systems are routinely determined from Hamilton functions or Hamiltonians), it is still an open question why statistical mechanics works. The basic problem is:
Peter Müller
Chapter 20. Quantum Mechanics
Abstract
Any discussion of dynamics and probability must include a discussion of quantum mechanics where dynamics and probability are intertwined in a peculiar way. This chapter cannot, of course, cover quantum mechanics in any substantial way. Instead, we just present with a broad brush the pragmatic rules and some foundational issues of quantum mechanics. Practitioners of quantum mechanics generally agree on the pragmatic rules but disagree on many of the foundational issues. Probability is intrinsic to quantum mechanics. Pure states have only a probabilistic interpretation, which derives from a generalization of the Pythagorean theorem rather than from a probability measure. The broadened states of classical mechanics correspond to mixed states in quantum mechanics. There is an additional type of state in quantum mechanics, an entangled state, which does not have any equivalent in classical mechanics. Entangled states derive from the fact that the state space of a composite system is given by the tensor product of the subsystem state spaces rather than by the Cartesian product as in classical physics. In quantum mechanics, identical particles are also indistinguishable, which led to the non-intuitive counting rules in the chapter on equilibrium statistical mechanics. The differences between quantum and classical mechanics become most apparent when considering the logic applied to propositions in both fields.
Peter Müller
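Two of the points above, probabilities as squared amplitudes and composite states living in a tensor product, can be made concrete in a few lines (my own sketch with hypothetical qubit states, not an example from the book).
```python
# Sketch: probabilities from squared amplitudes of a normalized state (the
# "Pythagorean" structure mentioned above), and an entangled two-qubit state
# living in the tensor product of the subsystem state spaces.
import numpy as np

# Pure state of one qubit: probabilities are squared amplitudes and sum to 1.
psi = np.array([1.0, 1.0j]) / np.sqrt(2)
print("outcome probabilities:", np.abs(psi)**2)      # [0.5, 0.5]

# Product state of two qubits: a tensor (Kronecker) product of single-qubit states.
up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print("product state:", np.kron(up, down))

# Bell state: a superposition in the tensor-product space that cannot be written
# as a Kronecker product of single-qubit states, i.e. it is entangled.
bell = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)
print("Bell-state probabilities:", np.abs(bell)**2)  # [0.5, 0, 0, 0.5]
```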
Backmatter
Metadata
Title
Handbook of Dynamics and Probability
Author
Dr. Peter Müller
Copyright Year
2022
Electronic ISBN
978-3-030-88486-4
Print ISBN
978-3-030-88485-7
DOI
https://doi.org/10.1007/978-3-030-88486-4