
2002 | Book

Interactive Markov Chains

And the Quest for Quantified Quality

Author: Holger Hermanns

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

Markov Chains are widely used as stochastic models to study a broad spectrum of system performance and dependability characteristics. This monograph is devoted to compositional specification and analysis of Markov chains.
Based on principles known from process algebra, the author systematically develops an algebra of interactive Markov chains. By presenting a number of distinguishing results, of both theoretical and practical nature, the author substantiates the claim that interactive Markov chains are more than just another formalism: among others, an algebraic theory of interactive Markov chains is developed, algorithms to mechanize compositional aggregation are devised, and state spaces of several million states resulting from the study of an ordinary telephone system are analyzed.

Table of contents

Frontmatter
Introduction
Abstract
The purpose of this book is to provide a compositional methodology of modelling and analysis with Markov chains. Markov chains are widely used as simple and yet adequate models in many diverse areas, not only in mathematics and computer science but also in other disciplines such as operations research, industrial engineering, biology, demographics, and so on. Markov chains can be used to estimate performance and dependability characteristics of various kinds, for instance to quantify throughputs of manufacturing systems, to locate bottlenecks in communication systems, or to estimate reliability in aerospace systems.
Holger Hermanns
Interactive Processes
Abstract
In this chapter we introduce the basic framework that will be used throughout the whole monograph. We introduce labelled transition systems together with two composition operators, abstraction and parallel composition. We then proceed by discussing useful equivalences on this model, essentially recalling the motivation for strong and weak bisimilarity. We also sketch efficient algorithms to compute these equivalences. The content of this chapter is a collection from [145], [27], [154], [125], [86], [76], [73], [22]. Its purpose is to give an intuitive understanding of the key features of process algebra. A reader familiar with these features is invited to glance fleetingly through this chapter in order to catch a glimpse of the notations used.
Holger Hermanns
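To make the notions of labelled transition system, parallel composition and abstraction concrete, here is a minimal sketch in Python. The representation (a dict from states to sets of action/successor pairs) and the names `parallel`, `hide` and `TAU` are illustrative assumptions, not the book's definitions.

```python
# A minimal sketch of a labelled transition system (LTS) with CSP/LOTOS-style
# parallel composition (synchronisation on a set of actions) and abstraction
# (hiding). Names and representation are invented for illustration.

from itertools import product

TAU = "tau"  # internal action, produced by abstraction (hiding)

def parallel(lts1, lts2, sync):
    """Compose two LTSs, synchronising on the actions in `sync`.

    Each LTS is a dict: state -> set of (action, successor) pairs.
    The result is an LTS over pairs of states.
    """
    composed = {}
    for s, t in product(lts1, lts2):
        moves = set()
        for a, s2 in lts1[s]:
            if a not in sync:                 # lts1 moves independently
                moves.add((a, (s2, t)))
        for a, t2 in lts2[t]:
            if a not in sync:                 # lts2 moves independently
                moves.add((a, (s, t2)))
        for a, s2 in lts1[s]:
            if a in sync:                     # both must offer the action
                for b, t2 in lts2[t]:
                    if b == a:
                        moves.add((a, (s2, t2)))
        composed[(s, t)] = moves
    return composed

def hide(lts, actions):
    """Abstraction: rename the given actions to the internal action tau."""
    return {s: {(TAU if a in actions else a, s2) for a, s2 in moves}
            for s, moves in lts.items()}

# Example: a sender and a receiver synchronising on `msg` and `ack`,
# with `msg` subsequently hidden from the environment.
sender   = {"s0": {("msg", "s1")}, "s1": {("ack", "s0")}}
receiver = {"r0": {("msg", "r1")}, "r1": {("ack", "r0")}}
system = hide(parallel(sender, receiver, {"msg", "ack"}), {"msg"})
```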
Markov Chains
Abstract
This chapter deals with a particular class of stochastic models that form a cornerstone of this book. Stochastic models are widely used to describe phenomena that change randomly as time progresses. We focus on Markov chains, as simple and adequate models for many such phenomena. More precisely, we cover discrete- as well as continuous-time Markov chains. We discuss details of their analysis to set the ground for the requirements of later chapters, and introduce useful equivalence relations for both types of models. These relations are defined in the style of bisimilarity and are akin to the notion of lumpability on Markov chains. Furthermore, we present efficient algorithms to compute these relations, which, as a side result, can be used to compute the ‘best possible’ lumping of a given Markov chain.
Holger Hermanns
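As a small illustration of the kind of analysis such models support, the following sketch computes the steady-state distribution of a continuous-time Markov chain from its generator matrix. The matrix and its rates are invented for illustration; this is not the algorithmic machinery developed in the chapter.

```python
# A minimal sketch: steady-state analysis of a small CTMC given by its
# generator matrix Q. The numbers and the 3-state chain are invented.

import numpy as np

# Generator matrix: off-diagonal entries are transition rates, each diagonal
# entry is minus the sum of the off-diagonal rates in its row.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 4.0, -6.0,  2.0],
              [ 0.0,  5.0, -5.0]])

# The steady-state distribution pi solves pi Q = 0 with sum(pi) = 1.
# Replace one balance equation by the normalisation constraint.
A = np.vstack([Q.T[:-1], np.ones(len(Q))])
b = np.zeros(len(Q))
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print("steady-state distribution:", pi)
```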
Interactive Markov Chains
Abstract
This chapter introduces the central formalism of this book, Interactive Markov Chains (IMC). It arises as an integration of interactive processes and continuous-time Markov chains. There are different ways to combine both formalisms, and some of them have appeared in the literature. We therefore begin with a detailed discussion of the different integration possibilities and argue for the decisions we take. As a result, IMC combine the different ingredients as orthogonally as possible. We proceed by defining composition operators for IMC. We then focus our attention on the discussion of strong and weak bisimilarity, incorporating the notion of maximal progress into the definitions. In order to compute these relations efficiently, we develop algorithms that are more involved than the ones presented in earlier chapters. Nevertheless, we prove that their computational complexity does not increase. A small example of using IMC to compositionally specify and aggregate the leaky bucket principle concludes this chapter.
Holger Hermanns
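The following sketch illustrates, under invented names, the combination described here: states that carry both interactive (action-labelled) and Markovian (rate-labelled) transitions, with maximal progress cutting away delays whenever an internal step is possible. It is a toy illustration, not the chapter's formal semantics.

```python
# A minimal sketch of an interactive Markov chain state: it carries both
# interactive transitions (labelled with actions) and Markovian transitions
# (labelled with rates). `enabled` illustrates the maximal-progress
# assumption. Names are invented for illustration.

TAU = "tau"

class IMCState:
    def __init__(self, interactive, markovian):
        self.interactive = interactive    # list of (action, successor)
        self.markovian = markovian        # list of (rate, successor)

def enabled(state):
    """Return the transitions that maximal progress actually allows."""
    if any(action == TAU for action, _ in state.interactive):
        # An internal step takes no time, so exponential delays cannot win
        # the race: only the interactive transitions remain enabled.
        return [("interactive", a, s) for a, s in state.interactive]
    return ([("interactive", a, s) for a, s in state.interactive] +
            [("markovian", r, s) for r, s in state.markovian])

# A state offering an external action `send`, an internal step, and a delay:
s = IMCState(interactive=[("send", "s1"), (TAU, "s2")],
             markovian=[(3.0, "s3")])
print(enabled(s))   # the rate-3.0 delay is cut away by maximal progress
```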
Algebra of Interactive Markov Chains
Abstract
In this chapter we will develop an algebra of Interactive Markov Chains. What turns Interactive Markov Chains into an algebra? An algebra usually consists of a set of operations on a given carrier set together with equational laws that characterise these operations. A well-known algebra is the algebra of natural numbers, where addition and multiplication satisfy associativity and commutativity laws.
To begin with, we show how to specify IMC in a purely syntactic way by means of a language, and this language shall be the carrier set of our algebra of Interactive Markov Chains. We will investigate strong and weak bisimilarity in this context, introducing the notion of weak congruence, a slight refinement of weak bisimilarity. Then, we tackle the issue of a sound and complete equational theory for strong bisimilarity and weak congruence. Nontrivial problems will have to be solved in order to establish an equational treatment of time-divergence and maximal progress. Indeed, our solution solves an open problem for timed process calculi in general, and we highlight that it can be adapted to solve a similar open problem for process calculi with priority.
In addition, we introduce further operators that exemplify two different directions of extensions to Interactive Markov Chains. One operator is introduced to enhance specification convenience. A second operator is introduced for more pragmatic reasons, namely to diminish the infamous state space explosion problem for specifications that exhibit symmetries.
Holger Hermanns
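To give a feel for what such a syntactic carrier set might look like, here is a sketch of an abstract syntax with action prefix, delay prefix, choice, parallel composition and hiding. The constructor names and operator set are assumptions for illustration and do not reproduce the book's calculus.

```python
# A minimal sketch of an abstract syntax for a process language in the spirit
# of an algebra of interactive Markov chains. Constructor names are invented.

from dataclasses import dataclass
from typing import FrozenSet, Union

@dataclass(frozen=True)
class Stop:                    # the terminated process
    pass

@dataclass(frozen=True)
class Act:                     # a.P : perform action a, then behave as P
    action: str
    cont: "Proc"

@dataclass(frozen=True)
class Delay:                   # (lambda).P : delay exponentially, then P
    rate: float
    cont: "Proc"

@dataclass(frozen=True)
class Choice:                  # P + Q
    left: "Proc"
    right: "Proc"

@dataclass(frozen=True)
class Par:                     # P |[A]| Q : synchronise on actions in A
    left: "Proc"
    right: "Proc"
    sync: FrozenSet[str]

@dataclass(frozen=True)
class Hide:                    # hide A in P : rename actions in A to tau
    proc: "Proc"
    actions: FrozenSet[str]

Proc = Union[Stop, Act, Delay, Choice, Par, Hide]

# Two syntactically different terms that any equational theory containing a
# commutativity law for choice (P + Q = Q + P) would identify:
p = Choice(Act("send", Stop()), Delay(2.0, Stop()))
q = Choice(Delay(2.0, Stop()), Act("send", Stop()))
```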
Interactive Markov Chains in Practice
Abstract
In this chapter we use Interactive Markov Chains to investigate some illustrative examples and case studies. We first study the effect of symmetric composition by means of a simple producer-consumer example. In particular, we compare the growth of the state space with that of other methods for generating an aggregated state space. In a second case study, we use IMC to model a real-world application, namely an ordinary telephony system. The constraint-oriented specification of time dependencies will be a central issue. In order to circumvent state space sizes of more than 10 million states, we make extensive use of the theoretical properties and concepts achieved so far, and obtain a Markov chain of manageable size. These two case studies show how performance estimation can be based on a Markov chain obtained from a highly modular and hierarchical IMC specification. A third example will then be used to highlight some implicit limitations of this approach. In particular, the issue of nondeterminism will be discussed.
Holger Hermanns
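The benefit of exploiting symmetry can be illustrated with a back-of-the-envelope count: composing n identical components with k local states naively yields k^n global states, whereas treating the components as indistinguishable and merely counting how many are in each local state yields only as many states as there are multisets of size n over k elements. The numbers below are purely illustrative and unrelated to the book's case studies.

```python
# A minimal sketch of state space growth with and without exploiting the
# symmetry of n identical components with k local states. Numbers invented.

from math import comb

def naive_states(n, k):
    return k ** n                     # one global state per ordered tuple

def aggregated_states(n, k):
    return comb(n + k - 1, k - 1)     # multisets: "how many are in each state"

for n in (2, 5, 10, 20):
    print(n, naive_states(n, 3), aggregated_states(n, 3))
```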
Conclusion
Abstract
This chapter gives a retrospective view on the main contributions of this book. It summarises the major building blocks of compositional performance and dependability estimation with Interactive Markov Chains, and addresses the general question whether the challenge of compositional performance and dependability estimation has been met. In addition, we point out relevant directions for further work.
Holger Hermanns
Proofs for Chapter 3 and Chapter 4
Abstract
The algorithm of Table 3.1 computes Markovian bisimilarity on S. It can be implemented with a time complexity of O(m_M log n), where m_M is the number of Markovian transitions and n is the number of states. The space complexity of this implementation is O(m_M).
Holger Hermanns
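For intuition, the following sketch shows a naive partition-refinement computation of Markovian bisimilarity, grouping states by their cumulative rates into the blocks of the current partition. It illustrates the idea only and does not achieve the O(m_M log n) complexity stated above; names and the example rates are invented.

```python
# A minimal sketch of partition refinement for Markovian bisimilarity: two
# states belong to the same block iff, for every block of the current
# partition, they have the same cumulative rate into that block. Repeat
# splitting until the partition is stable.

from collections import defaultdict

def markovian_bisimulation(states, rate):
    """`states` is an iterable of states; `rate` maps (s, t) to a rate >= 0."""
    partition = [frozenset(states)]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Signature of a state: cumulative rate into each current block.
            groups = defaultdict(set)
            for s in block:
                sig = tuple(sum(rate.get((s, t), 0.0) for t in b)
                            for b in partition)
                groups[sig].add(s)
            if len(groups) > 1:
                changed = True
            new_partition.extend(frozenset(g) for g in groups.values())
        partition = new_partition
    return partition

# Tiny example: s1 and s2 end up in the same block (same total rate into
# every block of the resulting partition), i.e. they can be lumped.
rates = {("s0", "s1"): 2.0, ("s0", "s2"): 3.0,
         ("s1", "s3"): 4.0, ("s2", "s3"): 4.0}
print(markovian_bisimulation(["s0", "s1", "s2", "s3"], rates))
```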
Proofs for Chapter 5
Holger Hermanns
Backmatter
Metadata
Title
Interactive Markov Chains
Author
Holger Hermanns
Copyright year
2002
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-45804-3
Print ISBN
978-3-540-44261-5
DOI
https://doi.org/10.1007/3-540-45804-2