
About this Book

The general concept of information is here, for the first time, defined mathematically by adding one single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:

1. Information can be measured in different units, in anything from bits to dollars. We argue that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of classical information theory is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as symmetry, that make it unsuitable for some applications. The measure 'reliability' is found to be a universal information measure.

2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra.

3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. And so on. The Mathematical Theory of Information supports colligation, i.e. the property of binding facts together, making 'two plus two greater than four'. Colligation is essential when the information carries knowledge or serves as a basis for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
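The Law of Diminishing Information parallels the data-processing inequality of classical information theory: sending a signal through a further noisy channel cannot increase the information it carries about the source. A minimal numerical sketch (the channel matrices and the `mutual_information` helper are illustrative assumptions, not the book's notation):

```python
import numpy as np

# Two noisy stages in series: X -> Y -> Z. Each stage is a
# row-stochastic matrix P(out | in). All values are illustrative.
p_x = np.array([0.5, 0.5])            # source distribution
ch1 = np.array([[0.9, 0.1],           # stage 1: P(Y | X)
                [0.2, 0.8]])
ch2 = np.array([[0.85, 0.15],         # stage 2: P(Z | Y)
                [0.3, 0.7]])

def mutual_information(p_in, channel):
    """I(input; output) in bits for source p_in sent through channel."""
    joint = p_in[:, None] * channel   # P(in, out)
    p_out = joint.sum(axis=0)
    indep = p_in[:, None] * p_out[None, :]
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / indep[mask])))

i_xy = mutual_information(p_x, ch1)        # information after one stage
i_xz = mutual_information(p_x, ch1 @ ch2)  # information after both stages

# A second stage can only lose information about the source.
assert i_xz <= i_xy
```

The composed chain is just the matrix product of the stages, which is why matrix algebra suffices for the discrete, finite case.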

Table of Contents

Frontmatter

1. About Information

Abstract
Information is playing an increasing role in our industrialized society. A technical overview of the flourishing electronics industry stated in 1987: "On almost every technology front, the driving force behind new developments is the ever-rising demand for information. Huge amounts of data and information, larger than anyone ever dreamed of a generation ago, have to move faster and faster through processors and networks, then end up having to be stored" (Electronics, 1987, p. 83).
Jan Kåhre

2. The Law of Diminishing Information

Abstract
We will now set out to derive a strict mathematical formulation of the Law of Diminishing Information (1.8.1). The mathematical treatment will, to begin with, be limited to discrete finite systems. The reason for this is that the mathematical proofs are then concise and clearly correct; the line of thought will not be obscured by technicalities connected to proofs that include the infinitely small and the infinitely large.
Jan Kåhre

3. General Properties of Information

Abstract
In Chapter 1 we argued that the Law of Diminishing Information is a necessary condition for an information measure to be acceptable. But is the Law (2.7.1) also a sufficient condition, in the sense that it covers all mathematical aspects of information? Many intuitive properties have been attributed to information measures. We have systematically collected the proposed information properties, both academic desiderata from classical information theory and common-sense ones from proverbs and poetry. Of course, it is not possible to guarantee that any such collection is exhaustive, but we will show that all the acceptable intuitive properties and desiderata can be derived as theorems using the Law (2.7.1) as an axiom.
Jan Kåhre

4. Specific Information Measures

Abstract
We will now look at different functions generating specific information measures in finite discrete systems. The theorems in this chapter are mainly in the form of criteria for functions to generate measures in accordance with the Law of Diminishing Information (2.7.1). The examples form a nonexhaustive collection of members of the inf family. We have already met the von Neumann utility measures in Section 3.4.
Jan Kåhre

5. Selected Applications

Abstract
Now it is time to put the mathematical theory of information to the test on a selection of applications, ranging from the theoretical to the practical: from philosophy through mathematics to engineering.
Jan Kåhre

6. Infodynamics

Abstract
The state of a dynamic system changes from one instant of time to the next. There is a transition rule according to which the state at a given instant is generated by the state at the preceding instant. Mathematically, it is a question of handling long chains, or blocks in series. The length of the chain can grow without limit.
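The chain-of-blocks picture can be made concrete with a toy transition matrix (an illustrative assumption, not an example from the book): applying the same noisy stage again and again drives the matrix rows together, so an ever-longer chain retains ever less trace of its starting state.

```python
import numpy as np

# One dynamic stage as a row-stochastic transition matrix (toy values).
stage = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

chain = np.eye(2)
rows_gap = []                 # how strongly the output still depends on the input
for _ in range(50):
    chain = chain @ stage     # lengthen the chain by one block
    rows_gap.append(float(abs(chain[0] - chain[1]).max()))

# The gap shrinks geometrically (by the factor 0.7, the stage's second
# eigenvalue): as the chain grows without limit, the starting state is
# forgotten and the system approaches a steady state.
assert rows_gap[-1] < 1e-6
```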
Jan Kåhre

7. Statistical Information

Abstract
There are two main applications of statistical information theory: telecommunications and thermodynamics. Both areas are well covered by their own specialized textbooks. We will start the information theoretical examination from the telecommunications end with Shannon’s theory.
Jan Kåhre

8. Algorithmic Information

Abstract
Intuitively, information and complexity go together. We feel that a complex message must contain more information than a simple one. Complexity is, however, an elusive property. A Mandelbrot fractal pattern may look intricate indeed, but to a mathematician it is simple. No more information than a small piece of computer program is needed to reproduce the pattern.
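The Mandelbrot remark can be made literal: the standard escape-time rule below is only a few lines, yet it regenerates the pattern at any resolution, so the pattern's algorithmic information is no more than the length of this small program (a conventional sketch, not taken from the book).

```python
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    """Escape-time test: does the iteration z -> z*z + c stay bounded?"""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:          # escaped: c is outside the set
            return False
    return True

# Render a coarse ASCII view of the set; finer grids give finer detail
# from the very same few lines of code.
rows = []
for im10 in range(10, -11, -2):
    rows.append("".join(
        "#" if in_mandelbrot(complex(re / 20, im10 / 10)) else "."
        for re in range(-40, 11, 2)))
print("\n".join(rows))
```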
Jan Kåhre

9. Continuous Systems

Abstract
This chapter prepares the ground for the investigation of information transmitted by continuous signals in Chapter 10. Conceptually, the probability theory of continuous systems does not differ from the theory of discrete systems. The tools are, however, quite different, such as the theory of integral equations and of coordinate transformations.
Jan Kåhre

10. Continuous Information

Abstract
In Chapter 9, the continuous signal was constructed as the limiting case of a discrete signal, when the increments approach zero. The statistical properties of a continuous signal were defined in terms of probability density. The same approach can be used with the measure of information transmitted by a continuous signal, or as it is called in electronics, an analog signal.
Jan Kåhre

11. Deterministic Dynamics

Abstract
The real world presents the worst-case scenario for the Law of Diminishing Information: physical information is indestructible. This would, at the least, seem to follow from the symmetry of the physical laws. As an example, we take the pattern ORDER in Figure 6.7, built of real particles, e.g. gas molecules. They will diffuse in the course of time, until the initial pattern has completely disappeared. The physical world is, however, time-reversible: if all the velocities of the particles were reversed, the initial pattern ORDER would reappear. All the information was still there!
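The velocity-reversal thought experiment can be mimicked with a toy model (free flight on a line with integer positions and velocities is an assumption chosen so the arithmetic is exact, not the book's gas model): run forward until the pattern has dispersed, reverse every velocity, run forward again, and the initial pattern reappears unchanged.

```python
import random

random.seed(1)
n, steps = 20, 100
pos0 = list(range(n))                     # the initial ordered pattern
vel = [random.choice([-2, -1, 1, 2]) for _ in range(n)]

# Diffuse forward: after many steps the pattern has dispersed.
pos_mid = [p + v * steps for p, v in zip(pos0, vel)]

# Reverse all velocities and run forward for the same time.
pos_back = [p - v * steps for p, v in zip(pos_mid, vel)]

assert pos_mid != pos0    # the visible order really was destroyed...
assert pos_back == pos0   # ...yet all the information was still there
```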
Jan Kåhre

12. Control and Communication

Abstract
We have seen that a closed system will reach a steady state as time passes (Section 11.1). That is not the case with an open system, which is subject to a multitude of influences. If such a system reacted to all these impulses, its behaviour would be complicated beyond description. The open system must have the capability of damping an outside impulse, so that the memory of its influence is erased.
Jan Kåhre

13. Information Physics

Abstract
The two great accomplishments of contemporary physics, the relativity theory and the quantum theory, are both associated with information, the former with speed of transmission (Section 13.2), the latter with accuracy (Section 14.1). In this chapter we will concentrate on relativity. Hence, we will discuss information chains and the timing of signals, disregarding the measure used to estimate the amount of information.
Jan Kåhre

14. The Information Quantum

Abstract
In this chapter, we will study the particles moving along a line from another perspective: How well do we know where a particle is? The answer depends naturally on the accuracy of our measurement. There is, however, also a question about how well the position of a particle is defined. A high-precision measurement cannot be made on a diffuse object.
Jan Kåhre

Backmatter
