
2012 | Book

Novelty, Information and Surprise


About this book

The book offers a new approach to information theory that is more general than the classical approach of Shannon. The classical definition of information is given for an alphabet of symbols or for a set of mutually exclusive propositions (a partition of the probability space Ω) whose probabilities add up to 1. The new definition is given for an arbitrary cover of Ω, i.e. for a set of possibly overlapping propositions. The generalized information concept is called novelty, and it is accompanied by two new concepts derived from it, designated information and surprise, which describe "opposite" versions of novelty: information is related more to classical information theory, and surprise more to the classical concept of statistical significance. In the discussion of these three concepts and their interrelations, several properties or classes of covers are defined, which turn out to form lattices. The book also presents applications of these new concepts, mostly in statistics and in neuroscience.

Table of Contents

Frontmatter

Surprise and Information of Descriptions

Frontmatter
Chapter 1. Prerequisites from Logic and Probability Theory
Abstract
This chapter lays the probabilistic groundwork for the rest of the book. We introduce standard probability theory. We call the elements A of the σ-algebra “propositions” instead of “events”, which would be more common. We reserve the word “event” for the elements of the probability space Ω.
Günther Palm
Chapter 2. Improbability and Novelty of Descriptions
Abstract
In this chapter we define the information of an event \(A \in \Sigma\), or in our terminology the novelty of a proposition A, as \(-\log_2 p(A)\). We further define the important new concept of a description and extend the definition of novelty from events to descriptions. Finally we introduce the notions of completeness and directedness of descriptions, and thereby the distinction between surprise and information, which are opposite special cases of novelty.
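The novelty of a single proposition, as defined in the abstract above, is straightforward to compute. A minimal sketch (the function name `novelty` is illustrative, not from the book):

```python
import math

def novelty(p: float) -> float:
    """Novelty (self-information) of a proposition with probability p, in bits."""
    return -math.log2(p)

# A fair-coin outcome carries 1 bit of novelty; a 1-in-8 event carries 3 bits.
print(novelty(0.5))    # 1.0
print(novelty(0.125))  # 3.0
```

Rarer propositions carry more novelty; a proposition with probability 1 carries none.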
Günther Palm
Chapter 3. Conditional and Subjective Novelty and Information
Abstract
You are playing cards and are about to bet that a certain card is an ace. Your neighbor, who could have seen the value of the card, whispers to you: “Don’t do that! If this card is an ace, my uncle is the pope.” The card turns out to be an ace. Now you know that the pope is his uncle.
Günther Palm

Coding and Information Transmission

Frontmatter
Chapter 4. On Guessing and Coding
Abstract
In Chap. 2 we defined the novelty of a proposition as a special function of its probability p(A). We motivated the definition \(\mathcal{N}(A) = -\log_2 p(A)\) by the idea that \(\mathcal{N}(A)\) should measure the number of yes–no questions needed to guess A.
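The yes–no-question motivation can be sketched concretely: guessing one of n equally likely items by halving the candidate set takes about \(\log_2 n\) questions, matching the novelty \(-\log_2(1/n)\). A small illustration (the helper name is hypothetical):

```python
import math

def questions_needed(n: int) -> int:
    """Yes-no questions needed to single out one of n equally likely items,
    when each question halves the set of remaining candidates."""
    q = 0
    while n > 1:
        n = (n + 1) // 2  # one question splits the candidates (roughly) in half
        q += 1
    return q

# For n = 2^k items, each with probability 2^-k, the question count
# equals the novelty -log2(2^-k) = k bits exactly.
for n in (2, 8, 64):
    print(n, questions_needed(n), math.log2(n))
```

For n that is not a power of two, the count rounds up to the next integer, which is why novelty is generally a lower bound on the number of questions.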
Günther Palm
Chapter 5. Information Transmission
Abstract
This chapter introduces the concept of a transition probability and the problem of guessing the input of an information channel from observing its output. It gives a first idea of the classical results of Shannon, without introducing the technicalities of stationary stochastic processes and the proof of Shannon’s Theorem. This material is provided in the next three chapters. Since it is not necessary for the understanding of Parts IV, V, and VI, one can move directly to Part IV after this chapter.
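The guessing problem described in the abstract can be sketched for the simplest textbook channel, the binary symmetric channel: given a transition probability matrix and an input prior, guess the input with the largest posterior probability. This example is illustrative and not taken from the book:

```python
# A binary symmetric channel as a transition-probability table, and
# maximum-a-posteriori guessing of the input from an observed output.
p_flip = 0.1
trans = {0: {0: 1 - p_flip, 1: p_flip},   # trans[x][y] = P(output y | input x)
         1: {0: p_flip, 1: 1 - p_flip}}
prior = {0: 0.5, 1: 0.5}                  # uniform input distribution

def map_guess(y: int) -> int:
    """Guess the channel input x maximizing P(x) * P(y | x)."""
    return max(prior, key=lambda x: prior[x] * trans[x][y])

print(map_guess(0), map_guess(1))  # 0 1
```

With a uniform prior and a small flip probability, the best guess is simply the observed symbol itself.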
Günther Palm

Information Rate and Channel Capacity

Frontmatter
Chapter 6. Stationary Processes and Their Information Rate
Abstract
This chapter briefly introduces the necessary concepts from the theory of stochastic processes (see for example Lamperti 1977; Doob 1953) that are needed for a proper definition of information rate and channel capacity, following Shannon.
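The information rate of the simplest stationary processes can be sketched directly. For a stationary Markov chain it is the average, under the stationary distribution, of the entropy of each transition row; a minimal illustration with made-up transition probabilities:

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Entropy rate of a stationary two-state Markov chain:
# H = sum_i pi_i * entropy(row_i), with pi the stationary distribution.
P = [[0.9, 0.1], [0.5, 0.5]]    # transition matrix (illustrative values)
pi = [5 / 6, 1 / 6]             # solves pi P = pi for this matrix
rate = sum(pi[i] * entropy(P[i]) for i in range(2))
print(round(rate, 3))
```

For an i.i.d. process the rate reduces to the entropy of the single-symbol distribution; the Markov case shows how dependence between symbols lowers the per-symbol information.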
Günther Palm
Chapter 7. Channel Capacity
Abstract
In this chapter we extend the definitions of Chap. 5 to real information channels that handle sequences of symbols instead of single symbols. This extension is necessary in order to use the idea of taking a limit over very long sequences, employed to define the information rate (Definition 5.5), now to define the transinformation rate and the channel capacity. This leads to the proof of Shannon’s famous theorem in the next chapter.
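For the binary symmetric channel, capacity has the classical closed form C = 1 − H(p), with H the binary entropy function; a small sketch of that standard formula (not the book's general definition, which applies to arbitrary channels):

```python
import math

def h2(p: float) -> float:
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity (bits per symbol) of a binary symmetric channel with flip probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))  # 1.0  (noiseless channel)
print(bsc_capacity(0.5))  # 0.0  (output independent of input)
```

Capacity falls from one bit per symbol to zero as the flip probability approaches 1/2, at which point the output carries no information about the input.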
Günther Palm
Chapter 8. How to Transmit Information Reliably with Unreliable Elements (Shannon’s Theorem)
Abstract
The goal of our rather technical excursion into the field of stationary processes was to formulate and prove Shannon’s theorem. This is done in this last chapter of Part III.
Günther Palm

Repertoires and Covers

Frontmatter
Chapter 9. Repertoires and Descriptions
Abstract
This chapter introduces the notion of a cover or repertoire and its proper descriptions. Based on the new idea of relating covers and descriptions, some interesting properties of covers are defined.
Günther Palm
Chapter 10. Novelty, Information and Surprise of Repertoires
Abstract
This chapter finally contains the definitions of novelty, information and surprise for arbitrary covers, in particular for repertoires, together with some methods for their practical calculation. We give the broadest possible definitions of these terms for arbitrary covers, because we use them occasionally in Part VI; for practical purposes it would be sufficient to define everything just for repertoires. It turns out that the theories of novelty and of information on repertoires are both proper extensions of classical information theory (where complementary theorems hold), which coincide with each other and with classical information theory when the repertoires are partitions.
Günther Palm
Chapter 11. Conditioning, Mutual Information, and Information Gain
Abstract
In this chapter we want to discuss the extension of three concepts of classical information theory, namely conditional information, transinformation (also called mutual information), and information gain (also called Kullback–Leibler distance), from descriptions to (reasonably large classes of) covers. This extension also carries these concepts over from discrete to continuous random variables.
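The classical starting point for information gain is the discrete Kullback–Leibler divergence, D(p‖q) = Σᵢ pᵢ log₂(pᵢ/qᵢ); a minimal sketch of this standard formula (the extension to covers in the chapter goes beyond this):

```python
import math

def kl_divergence(p, q):
    """Information gain (Kullback-Leibler divergence) D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.75, 0.25]
print(round(kl_divergence(p, q), 4))  # gain from believing q when p is true
```

The divergence is zero exactly when the two distributions coincide, and is asymmetric in its arguments, which is why "distance" is used only loosely.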
Günther Palm

Information, Novelty and Surprise in Science

Frontmatter
Chapter 12. Information, Novelty, and Surprise in Brain Theory
Abstract
In biological research it is common to assume that each organ of an organism serves a definite purpose. The purpose of the brain seems to be the coordination and processing of information which the animal obtains through its sense organs about the outside world and about its own internal state (Bateson 1972). An important aspect of this is the storage of information in memory and the use of the stored information in connection with the present sensory stimuli.
Günther Palm
Chapter 13. Surprise from Repetitions and Combination of Surprises
Abstract
In this chapter we consider the surprise for a repertoire which represents the interest in several statistical tests that were performed more or less independently. Then we consider the surprise obtained from repetitions of the same low-probability event.
Günther Palm
Chapter 14. Entropy in Physics
Abstract
The term entropy was created in statistical mechanics; it is closely connected to information, and it is this connection that is the theme of this chapter. Let us first describe the historical context of classical statistical mechanics.
Günther Palm

Generalized Information Theory

Frontmatter
Chapter 15. Order- and Lattice-Structures
Abstract
In this part we want to condense the new mathematical ideas and structures that have been introduced so far into a mathematical theory, which can be put in the framework of lattice theory. In the next chapter we want to get a better understanding of the order structure (defined in Definition 10.3) on the set of all covers. For this purpose we now introduce a number of basic concepts concerning orders and lattices (Birkhoff 1967).
Günther Palm
Chapter 16. Three Orderings on Repertoires
Abstract
The set of all repertoires actually has an interesting structure, when we “look at” a repertoire α in terms of its proper descriptions D(α). This means that we should consider two repertoires to be essentially the same if they have the same proper descriptions, or we should say that α is more refined than β if the proper descriptions in α are contained in those in β.
Günther Palm
Chapter 17. Information Theory on Lattices of Covers
Abstract
Classical information theory considers the information \(\mathcal{I}\) on the lattice \((\mathfrak{P},\wedge ,\vee )\) of partitions.
Günther Palm
Backmatter
Metadata
Title
Novelty, Information and Surprise
Author
Günther Palm
Copyright year
2012
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-29075-6
Print ISBN
978-3-642-29074-9
DOI
https://doi.org/10.1007/978-3-642-29075-6