Abstract
Since its inception, the main role of information theory has been to provide the engineering and scientific communities with a mathematical framework for the theory of communication by establishing the fundamental limits on the performance of various communication systems.
Notes
1. Shannon borrowed the term “entropy” from statistical mechanics since his quantity admits the same expression as Boltzmann’s entropy [55].
2. See [359] for accessing most of Shannon’s works, including his master’s thesis [337, 338], which made a breakthrough connection between electrical switching circuits and Boolean algebra and played a catalyst role in the digital revolution, his dissertation on an algebraic framework for population genetics [339], and his seminal paper on information-theoretic cryptography [342]. Refer also to [362] for a recent (nontechnical) biography of Shannon and to [146] for a broad discourse on the history of information and on the information age.
3. Except for a brief interlude with the continuous-time (waveform) Gaussian channel in Chap. 5, we will consider discrete-time communication systems throughout the text.
Copyright information
© 2018 Springer Nature Singapore Pte Ltd.
Cite this chapter
Alajaji, F., Chen, PN. (2018). Introduction. In: An Introduction to Single-User Information Theory. Springer Undergraduate Texts in Mathematics and Technology. Springer, Singapore. https://doi.org/10.1007/978-981-10-8001-2_1
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-8000-5
Online ISBN: 978-981-10-8001-2
eBook Packages: Mathematics and Statistics (R0)