
About this book

This book is intended as an introduction to the mathematical description of information in science. The necessary mathematical theory is presented in a more vivid way than in the usual theorem-proof structure. This approach enables us to develop an idea of the connections between different information measures and to understand the trains of thought behind their derivation, which is crucial for correct applications. In the mathematical descriptions we therefore develop the important ideas of the derivations, so that we obtain not only the resulting functions but also the main lines of reasoning and the conditions under which the results are valid. This simplifies the handling of the information measures, which are sometimes hard to classify without additional background information. Although the mathematical descriptions are the exact formulations of the measures examined, we do not restrict ourselves to rigorous mathematical considerations, but also place the different measures within the structure and context of possible information measures. Nevertheless, the mathematical approach is unavoidable when we are looking for an objective description and for possible applications in optimization.

Table of contents

Frontmatter

Without Abstract
Christoph Arndt

1. Introduction

Abstract
Information is an essential component of the transmission of messages or knowledge, and it was used, consciously or unconsciously, in our everyday life long before computers made their way into it. Every organism that wants to survive in its environment has to incorporate information concerning food, energy or, more generally, information with respect to environmental conditions, which the organism needs in order to exist (independently). This starts with simple monads, becomes more important for plants and animals, and finally culminates in human beings, who are the only creatures able to communicate abstract concepts and ideas, thanks to the development of a complex language.
Christoph Arndt

2. Basic considerations

Abstract
In order to derive a function describing information, we will follow a train of thought demonstrated by Howard Resnikoff [RES87], in which we first define the requirements that the resulting information function has to fulfil.
Christoph Arndt
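As an illustration of this kind of derivation (the exact postulate set used in the chapter is not reproduced here), one classical requirement is additivity for independent events: the information of two independent outcomes with probabilities p and q should satisfy
\[
I(p \cdot q) = I(p) + I(q),
\]
and the continuous, monotonically decreasing solutions of this functional equation are \( I(p) = -k \log p \) with \( k > 0 \). Averaging over a discrete distribution then yields the familiar entropy \( H = -k \sum_i p_i \log p_i \).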

3. Historic development of information theory

Abstract
In a short overview, which cannot include all events in information theory, we want to introduce the measures of information familiar to scientists in information theory, communication theory and mathematics. In this overview our main attention is focused not on a philosophic, semantic view of information, but on the information that we are able to describe with mathematical methods and thus to apply in our theories.
Christoph Arndt

4. The concept of entropy in physics

Abstract
The concept of entropy began its development in physics in the 19th century, when scientists started to describe thermodynamic processes. In the course of these considerations, physicists first assumed some substance able to receive, store and deliver heat; Carnot spoke of a power residing in the fire. Boltzmann was the first to develop a statistical description of entropy, which met with considerable resistance in his time.
Christoph Arndt
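The statistical description introduced by Boltzmann relates the entropy of a macrostate to the number \( W \) of microstates compatible with it,
\[
S = k_B \ln W ,
\]
with \( k_B \) the Boltzmann constant; its generalization to non-equiprobable microstates, \( S = -k_B \sum_i p_i \ln p_i \), is formally identical to Shannon's entropy up to the constant factor.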

5. Extension of Shannon’s information

Without Abstract
Christoph Arndt

6. Generalized entropy measures

Without Abstract
Christoph Arndt

7. Information functions and Gaussian distributions

Without Abstract
Christoph Arndt

8. Shannon’s information of discrete probability distributions

Without Abstract
Christoph Arndt

9. Information functions for Gaussian distributions, Part II

Without Abstract
Christoph Arndt

10. Bounds of the variance

Abstract
Information depends on the uncertainty of the signals that we observe. The uncertainty in these signals is commonly described by the variance, representing the power of the error. The variance is not a function of a probability distribution function, but a moment, i.e. a quantity computed directly from this distribution density. Such moments are used to determine optimal estimators or to find limits that these estimators cannot exceed. In the literature such limits are known as bounds, and we will adopt this term, which describes the main point of this chapter.
Christoph Arndt
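The best-known bound of this kind is the Cramér-Rao bound (whether the chapter treats further bounds is not stated in the abstract): for an unbiased estimator \( \hat{\theta} \) of a parameter \( \theta \), based on an observation \( x \) with density \( p(x;\theta) \),
\[
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{J(\theta)},
\qquad
J(\theta) = \mathrm{E}\!\left[\left(\frac{\partial \ln p(x;\theta)}{\partial \theta}\right)^{2}\right],
\]
where \( J(\theta) \) is the Fisher information, so that the variance of any such estimator cannot fall below the reciprocal of the information contained in the observation.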

11. Ambiguity function

Abstract
We now introduce and study another interesting function, which is connected to the previous information measures, especially to Kullback's discrimination information. It enables us to derive certain information criteria applied in system identification, which is the main reason for including this function in our presentation. In [SCH73] the author, referring to radar engineering, presents the ambiguity function, which allows conclusions about the uniqueness of the expected estimate.
Christoph Arndt
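In radar engineering the ambiguity function of a transmitted signal \( s(t) \) is commonly defined (conventions for sign and normalization vary and may differ from those in [SCH73]) as
\[
\chi(\tau,\nu) = \int_{-\infty}^{\infty} s(t)\, s^{*}(t-\tau)\, e^{\,j 2\pi \nu t}\, dt ,
\]
where \( \tau \) is the delay and \( \nu \) the Doppler shift; a sharply peaked \( |\chi(\tau,\nu)| \) means that the delay-Doppler estimate is essentially unique, while broad ridges indicate ambiguous estimates.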

12. Akaike’s information criterion

Without Abstract
Christoph Arndt

13. Channel information

Without Abstract
Christoph Arndt

14. ‘Deterministic’ and stochastic information

Abstract
At the beginning of this chapter we want to carry out some basic considerations regarding the information in mappings with additional noise. Such mappings occur in state space models, when we model the relation between the state space variables and the observation space variables with the observation equation.
Christoph Arndt
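A minimal sketch of such a state space model, in an illustrative notation that need not match the chapter's, is
\[
x_{k+1} = A\, x_k + w_k, \qquad y_k = C\, x_k + v_k ,
\]
where \( x_k \) are the state space variables, \( y_k \) the observation space variables, and \( w_k, v_k \) the process and observation noise; the second relation is the observation equation referred to above, a mapping of the state into the observation space with additional noise.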

15. Maximum entropy estimation

Abstract
Maximum entropy estimation is a method that enables us to estimate the distribution density function of one or more random variables when our prior knowledge is restricted to a limited number of samples or other limitations. The basic idea of this method can be described with the Laplacian ‘principle of insufficient reason’. This principle states that we may only use the a priori knowledge that we obtain from measurements or other known restrictions. Any other information, for instance information implicitly introduced by an estimation algorithm, leads to a limitation of the results obtained.
Christoph Arndt
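In its generic form (the chapter's notation may differ), maximum entropy estimation selects, among all densities consistent with the known constraints, the one of maximal entropy:
\[
\max_{p} \; -\!\int p(x) \ln p(x)\, dx
\quad \text{subject to} \quad
\int p(x)\, dx = 1, \qquad \int f_i(x)\, p(x)\, dx = \mu_i ,
\]
where the \( \mu_i \) are the measured moments. The solution is the exponential family \( p(x) = \tfrac{1}{Z(\lambda)} \exp\!\big(\sum_i \lambda_i f_i(x)\big) \), with the multipliers \( \lambda_i \) and the normalization \( Z(\lambda) \) fixed by the constraints, so that no information beyond the measurements is imposed on the estimate.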

16. Concluding remarks

Abstract
Information, entropy and self-organization are closely related, which is the reason for us to carry out a more detailed examination of these three concepts in this chapter. Information and entropy have been the subject of the considerations in the previous chapters, and we noticed that we may roughly define information as organization and entropy as disorganization. This chapter now adds the concept of self-organization, applied in biology and physics, which seems to present a contradiction to the second law of thermodynamics. This contradiction stems from the following consideration: the non-living systems on our earth disintegrate and thus tend towards the equilibrium state, the so-called ‘heat death’.
Christoph Arndt

Backmatter

Further information