
About this Book

The volume “Storing and Transmitting Data” is based on Rudolf Ahlswede's introductory course "Information Theory I" and presents an introduction to Shannon Theory. Readers, whether familiar or unfamiliar with the technical intricacies of Information Theory, will benefit considerably from working through the book; Chapter VI in particular, with its lively comments and uncensored insider views from the world of science and research, offers informative and revealing insights. This is the first of several volumes that will serve as a collected research documentation of Rudolf Ahlswede’s lectures on information theory. Each volume includes comments from an invited well-known expert. Holger Boche contributed his insights in the supplement of the present volume.

Classical information processing concerns the main tasks of gaining knowledge and of storing, transmitting, and hiding data. The first task is the prime goal of Statistics. For the next two, Shannon presented an impressive mathematical theory called Information Theory, which he based on probabilistic models. The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels. The lectures presented in this work are suitable for graduate students in Mathematics, and also in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used as the basis for courses or to supplement courses in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find the basis of entire research programs.

Table of Contents

Frontmatter

Chapter 1. Introduction

Abstract
In the last decades great progress has been made in the technologies of information and communication (e.g. smaller and faster microchips, storage media like CDs and USB sticks, glass fibre cables, TV satellites, etc.).
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm

Storing Data

Frontmatter

Chapter 2. Data Compression

Abstract
The source model discussed throughout this chapter is the Discrete Source (DS). Such a source is a pair \((\mathcal{X},P)\), where \(\mathcal{X}\triangleq \{1,\dots ,a\}\), say, is a finite alphabet and \(P\triangleq (P(1),\dots ,P(a))\) is a probability distribution on \(\mathcal{X}\). A discrete source can also be described by a random variable \(X\) with values in \(\mathcal{X}\), where \(\text {Prob}(X=x)=P(x)\) for all \(x\in \mathcal{X}\).
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm
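
As a concrete illustration (the function name and the example distribution are mine, not the book's), the following minimal Python sketch represents such a discrete source by its probability vector and evaluates its entropy \(H(P)=-\sum_x P(x)\log_2 P(x)\), the quantity that, as the next chapter notes, measures the achievable data compression.

```python
import math

def entropy(P):
    """Shannon entropy H(P) = -sum_x P(x) * log2 P(x), in bits."""
    return -sum(p * math.log2(p) for p in P if p > 0)

# A discrete source (X, P) with alphabet {1, ..., a}, here a = 3.
P = (0.5, 0.25, 0.25)
assert abs(sum(P) - 1.0) < 1e-12   # P must be a probability distribution
print(entropy(P))                  # 1.5 bits per source letter
```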

Chapter 3. The Entropy as a Measure of Uncertainty

Abstract
Up to now we have had operational access to the entropy function, i.e., the entropy was involved in the solution of a mathematical problem. More specifically, the entropy turned out to be a measure for data compression. In this lecture we take a different approach and interpret the entropy as a measure of the uncertainty of an experiment with \(n\) possible outcomes, where each outcome takes place with a certain probability. The approach is axiomatic, i.e., some “reasonable” conditions which a measure of uncertainty should possess are postulated.
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm
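
One classical “reasonable” condition is Shannon's grouping (recursivity) axiom; the exact axiom list of the lectures may differ, but the property itself is easy to check numerically. The sketch below uses Shannon's own example from his 1948 paper: choosing among three outcomes with probabilities \(1/2, 1/3, 1/6\) carries the same uncertainty as first choosing between two groups of probability \(1/2\) each and then, half the time, resolving the second group via \((2/3, 1/3)\).

```python
import math

def H(*p):
    """Entropy in bits of a finite probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Grouping axiom: H(1/2, 1/3, 1/6) = H(1/2, 1/2) + (1/2) * H(2/3, 1/3)
lhs = H(1/2, 1/3, 1/6)
rhs = H(1/2, 1/2) + 0.5 * H(2/3, 1/3)
assert abs(lhs - rhs) < 1e-12
print(lhs)  # ~1.459 bits
```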

Chapter 4. Universal Coding

Abstract
We will deal with the source \(({\mathbb {N}}, P)\) whose set of messages is \({\mathbb {N}} = \{1,2,\dots \}\) and whose probability distribution is given by \(P = \{P(j), j \in {\mathbb {N}}\}\). We also assume that \(P(1) \ge P(2) \ge P(3) \ge \cdots \). Note that our considerations can be extended to a more general case of countable sources defined as follows.
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm
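
For such a source, a universal code assigns shorter codewords to smaller integers without knowing \(P\) exactly. One classical example, not necessarily the construction treated in this chapter, is the Elias gamma code sketched below: it encodes \(j\) with \(2\lfloor \log_2 j\rfloor +1\) bits, and since \(P(j)\le 1/j\) for decreasing probabilities, its expected length is at most about \(2H(P)+1\) for every such source.

```python
def elias_gamma(j: int) -> str:
    """Elias gamma codeword for a positive integer j: floor(log2 j) zeros
    followed by the binary expansion of j, a prefix-free code of length
    2*floor(log2 j) + 1."""
    if j < 1:
        raise ValueError("defined for j >= 1 only")
    b = bin(j)[2:]                  # binary expansion, always starts with '1'
    return "0" * (len(b) - 1) + b

for j in (1, 2, 3, 4, 9):
    print(j, elias_gamma(j))        # 1 -> '1', 2 -> '010', 9 -> '0001001'
```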

Transmitting Data

Frontmatter

Chapter 5. Coding Theorems and Converses for the DMC

Abstract
In 1948 C.E. Shannon [13] initiated research in Information Theory with his paper ‘A Mathematical Theory of Communication’, in which he investigated several problems concerning the transmission of information from a source to a destination (see Chap. 1). In this lecture we shall discuss the three fundamental results of this pioneering paper: (1) the Source Coding Theorem for the Discrete Memoryless Source (DMS), (2) the Coding Theorem for the Discrete Memoryless Channel (DMC), (3) the Rate Distortion Theorem. In each of these fundamental theorems the entropy arises as a measure for data compression.
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm
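
Result (2) identifies the capacity of a DMC with transmission matrix \(W\) as the maximum of the mutual information \(I(P,W)\) over input distributions \(P\). A minimal sketch (the function name is mine, not the book's) evaluating \(I(P,W)\) for a binary symmetric channel, where the uniform input is known to achieve capacity:

```python
import math

def mutual_information(P, W):
    """I(P, W) = H(Q) - sum_x P(x) * H(W(.|x)) in bits, where Q is the
    output distribution; rows of W are the output laws of the inputs."""
    H = lambda v: -sum(q * math.log2(q) for q in v if q > 0)
    Q = [sum(P[x] * W[x][y] for x in range(len(P))) for y in range(len(W[0]))]
    return H(Q) - sum(P[x] * H(W[x]) for x in range(len(P)))

eps = 0.1
W = [[1 - eps, eps], [eps, 1 - eps]]        # binary symmetric channel
print(mutual_information([0.5, 0.5], W))    # = 1 - h(0.1) ~ 0.531 bits
```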

Chapter 6. Towards Capacity Functions

Abstract
Among the most investigated parameters for noisy channels are code size, error probability in decoding, and block length; rate, capacity, and the reliability function; delay and complexity of coding. There are several statements about connections between these quantities. They carry names like “coding theorem”, “converse theorem” (weak, strong, ...), “direct theorem”, “capacity theorem”, “lower bound”, “upper bound”, etc. There are analogous notions for source coding.
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm

Chapter 7. Error Bounds

Abstract
A binary symmetric channel, abbreviated by BSC, is a DMC with a transmission matrix
$$W=\left( \begin{array}{cc} 1-\varepsilon & \varepsilon \\ \varepsilon & 1-\varepsilon \end{array} \right), \quad 0 \le \varepsilon \le \frac{1}{2}.$$
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm
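
As a toy illustration of the trade-off that error bounds make precise (my example, not the lectures'), consider the \(n\)-fold repetition code on a BSC(\(\varepsilon\)) with majority decoding: its error probability decays exponentially in \(n\), but its rate \(1/n\) vanishes. The bounds of this chapter quantify how fast the error probability can decay at a fixed positive rate.

```python
from math import comb

def repetition_error(eps: float, n: int) -> float:
    """Exact block-error probability of the length-n repetition code on a
    BSC(eps) under majority decoding (n odd): more than n/2 bits flipped."""
    return sum(comb(n, k) * eps**k * (1 - eps)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 3, 5, 11):
    print(n, repetition_error(0.1, n))   # 0.1, 0.028, 0.00856, ...
```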

Appendix

Frontmatter

Chapter 8. Inequalities

Abstract
A function \(f:A\rightarrow \mathbb R\), defined on a convex set \(A\subset \mathbb R^n\), is convex on \(A\) if for all \(x,y\in A\) and for all \(\alpha \in [0,1]\)
$$ f(\alpha \cdot x+(1-\alpha )\cdot y)\le \alpha \cdot f(x)+(1-\alpha )\cdot f(y). $$
Accordingly, \(f\) is said to be concave if \(f(\alpha \cdot x+(1-\alpha )\cdot y)\ge \alpha \cdot f(x)+(1-\alpha )\cdot f(y)\), and strictly convex (strictly concave) if strict inequality holds for \(x\ne y\) and \(\alpha \in (0,1)\).
Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm
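
A standard information-theoretic instance (my illustration, not the chapter's text): the binary entropy function \(h(p)=-p\log_2 p-(1-p)\log_2(1-p)\) is strictly concave on \([0,1]\). The sketch below spot-checks the defining inequality at random points.

```python
import math, random

def h(p: float) -> float:
    """Binary entropy h(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Spot-check concavity of h on [0, 1]:
#   h(a*x + (1-a)*y) >= a*h(x) + (1-a)*h(y)
random.seed(0)
for _ in range(10_000):
    x, y, a = random.random(), random.random(), random.random()
    assert h(a * x + (1 - a) * y) >= a * h(x) + (1 - a) * h(y) - 1e-12
print("concavity inequality holds at all sampled points")
```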

Backmatter
