2018 | Book

An Introduction to Single-User Information Theory


About this Book

This book presents a succinct and mathematically rigorous treatment of the main pillars of Shannon’s information theory, discussing the fundamental concepts and indispensable results of Shannon’s mathematical theory of communications. It includes five meticulously written core chapters (with accompanying problems), emphasizing the key topics of information measures; lossless and lossy data compression; channel coding; and joint source-channel coding for single-user (point-to-point) communications systems. It also features two appendices covering necessary background material in real analysis and in probability theory and stochastic processes.

The book is ideal for a one-semester foundational course on information theory for senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences. A comprehensive instructor’s solutions manual is available.

Table of Contents

Frontmatter
Chapter 1. Introduction
Abstract
Since its inception, the main role of information theory has been to provide the engineering and scientific communities with a mathematical framework for the theory of communication by establishing the fundamental limits on the performance of various communication systems.
Fady Alajaji, Po-Ning Chen
Chapter 2. Information Measures for Discrete Systems
Abstract
In this chapter, we define Shannon’s information measures for discrete-time discrete-alphabet systems from a probabilistic standpoint and develop their properties. Elucidating the operational significance of probabilistically defined information measures vis-à-vis the fundamental limits of coding constitutes a main objective of this book; this will be seen in the subsequent chapters.
Fady Alajaji, Po-Ning Chen
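To give a flavor of the probabilistically defined information measures this chapter covers, the following minimal sketch (the `entropy` helper is illustrative, not from the book) computes the Shannon entropy H(X) = −Σ p(x) log₂ p(x) of a discrete distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    # Terms with p(x) = 0 contribute nothing, by the convention 0 log 0 = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is less uncertain, so its entropy is strictly smaller.
print(entropy([0.9, 0.1]))
```

The second value is about 0.469 bits, illustrating that entropy is maximized by the uniform distribution.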
Chapter 3. Lossless Data Compression
Abstract
As mentioned in Chap. 1, data compression describes methods of representing a source by a code whose average codeword length (or code rate) is acceptably small. The representation can be lossless (or asymptotically lossless) where the reconstructed source is identical (or identical with vanishing error probability) to the original source; or lossy where the reconstructed source is allowed to deviate from the original source, usually within an acceptable threshold. We herein focus on lossless data compression.
Fady Alajaji, Po-Ning Chen
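The chapter's central quantity, the average codeword length (code rate), can be illustrated with a small sketch (the code and distribution here are illustrative examples, not taken from the book). For a dyadic source distribution, a prefix code exists whose average length exactly meets Shannon's entropy lower bound:

```python
import math

def avg_codeword_length(probs, lengths):
    """Average length (in bits/symbol) of a code with the given codeword lengths."""
    return sum(p * l for p, l in zip(probs, lengths))

# Dyadic source: probabilities are powers of 1/2.
probs = [0.5, 0.25, 0.125, 0.125]
# A prefix code for it: 0, 10, 110, 111.
lengths = [1, 2, 3, 3]

H = -sum(p * math.log2(p) for p in probs)       # source entropy
L = avg_codeword_length(probs, lengths)          # average code rate
print(H, L)   # both 1.75: the code achieves the entropy bound H(X) <= L
```

For non-dyadic sources the bound H(X) ≤ L holds with strict inequality for any uniquely decodable code, which is the kind of fundamental limit the chapter establishes.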
Chapter 4. Data Transmission and Channel Capacity
Abstract
A noisy communication channel is an input–output medium in which the output is not completely or deterministically specified by the input. The channel is indeed stochastically modeled, where given channel input x, the channel output y is governed by a transition (conditional) probability distribution denoted by \(P_{Y|X}(y|x)\). Since two different inputs may give rise to the same output, the receiver, upon receipt of an output, needs to guess the most probable sent input.
Fady Alajaji, Po-Ning Chen
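The decoding rule described above (guessing the most probable sent input given the received output) can be sketched for a single use of a binary symmetric channel. This is an illustrative example, not code from the book; `map_decode` is a hypothetical helper implementing the maximum a posteriori rule via Bayes' theorem:

```python
def map_decode(y, p_x, p_flip):
    """MAP decoding of one binary channel use.

    y:      observed channel output (0 or 1)
    p_x:    prior distribution on the input, e.g. {0: 0.5, 1: 0.5}
    p_flip: crossover probability of the binary symmetric channel
    """
    posterior = {}
    for x in (0, 1):
        # P(y|x) is 1 - p_flip if the bit came through unchanged, else p_flip.
        likelihood = (1 - p_flip) if x == y else p_flip
        # P(x|y) is proportional to P(y|x) * P(x).
        posterior[x] = likelihood * p_x[x]
    return max(posterior, key=posterior.get)

# Uniform prior: the most probable input is simply the received output.
print(map_decode(1, {0: 0.5, 1: 0.5}, 0.1))   # 1
# A strongly skewed prior can overrule the observation.
print(map_decode(1, {0: 0.95, 1: 0.05}, 0.1)) # 0
```

The second call shows why the receiver must weigh the channel statistics against the input distribution rather than trust the output alone.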
Chapter 5. Differential Entropy and Gaussian Channels
Abstract
We have so far examined information measures and their operational characterization for discrete-time discrete-alphabet systems. In this chapter, we turn our focus to continuous-alphabet (real-valued) systems. Except for a brief interlude with the continuous-time (waveform) Gaussian channel, we consider discrete-time systems, as treated throughout the book.
Fady Alajaji, Po-Ning Chen
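Two closed-form results central to this chapter can be evaluated directly: the differential entropy of a Gaussian variable, h(X) = ½ log₂(2πe σ²), and the capacity of the discrete-time additive white Gaussian noise channel, C = ½ log₂(1 + SNR). The helpers below are an illustrative sketch of these standard formulas, not code from the book:

```python
import math

def gaussian_diff_entropy(sigma2):
    """Differential entropy (bits) of a Gaussian with variance sigma2.

    Among all real-valued random variables with a given variance,
    the Gaussian maximizes differential entropy.
    """
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

def awgn_capacity(snr):
    """Capacity (bits per channel use) of the discrete-time AWGN channel."""
    return 0.5 * math.log2(1 + snr)

print(gaussian_diff_entropy(1.0))   # about 2.047 bits
print(awgn_capacity(1.0))           # 0.5 bits per channel use at 0 dB SNR
```

Note that, unlike discrete entropy, differential entropy can be negative (e.g., for variance below 1/(2πe)), one of the subtleties the chapter addresses.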
Chapter 6. Lossy Data Compression and Transmission
Abstract
In a number of situations, one may need to compress a source to a rate less than the source entropy, which as we saw in Chap. 3 is the minimum lossless data compression rate.
Fady Alajaji, Po-Ning Chen
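The trade-off the chapter quantifies, compressing below the entropy at the price of distortion, is captured by the rate-distortion function. As an illustrative sketch (not code from the book), the standard closed form for a Bernoulli(p) source under the Hamming distortion measure is R(D) = h_b(p) − h_b(D) for 0 ≤ D ≤ min(p, 1−p):

```python
import math

def hb(p):
    """Binary entropy function in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def binary_rate_distortion(p, d):
    """Rate-distortion function of a Bernoulli(p) source, Hamming distortion."""
    if d >= min(p, 1 - p):
        return 0.0            # the distortion budget is achievable at zero rate
    return hb(p) - hb(d)

# A fair binary source needs 1 bit/symbol losslessly, but tolerating 10%
# average bit errors lowers the required rate below the entropy.
print(binary_rate_distortion(0.5, 0.0))   # 1.0 (lossless)
print(binary_rate_distortion(0.5, 0.1))   # about 0.531
```

At D = 0 the function recovers the lossless limit h_b(p) of Chapter 3, and it decreases to zero as the allowed distortion grows.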
Backmatter
Metadata
Title
An Introduction to Single-User Information Theory
Written by
Fady Alajaji
Dr. Po-Ning Chen
Copyright Year
2018
Publisher
Springer Singapore
Electronic ISBN
978-981-10-8001-2
Print ISBN
978-981-10-8000-5
DOI
https://doi.org/10.1007/978-981-10-8001-2