
2014 | OriginalPaper | Chapter

3. The Entropy as a Measure of Uncertainty

Authors : Rudolf Ahlswede, Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm

Published in: Storing and Transmitting Data

Publisher: Springer International Publishing


Abstract

Up to now we have had operational access to the entropy function, i.e., the entropy was involved in the solution of a mathematical problem. More specifically, the entropy turned out to be a measure for data compression. In this lecture we take a different approach and interpret the entropy as a measure of the uncertainty of an experiment with \(n\) possible outcomes, where each outcome occurs with a certain probability. The approach is axiomatic, i.e., some “reasonable” conditions which a measure of uncertainty should possess are postulated.
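For orientation, and not quoted from the chapter itself: the entropy function referred to here is the Shannon entropy of a probability distribution \(p = (p_1, \ldots, p_n)\), where the base of the logarithm (commonly 2) only fixes the unit:

\[
H(p_1,\ldots,p_n) \;=\; -\sum_{i=1}^{n} p_i \log p_i ,
\]

with the convention \(0 \log 0 = 0\). It attains its maximum, \(\log n\), exactly for the uniform distribution \(p_i = 1/n\), i.e., when the outcome of the experiment is most uncertain.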


Metadata
Title
The Entropy as a Measure of Uncertainty
Authors
Rudolf Ahlswede
Alexander Ahlswede
Ingo Althöfer
Christian Deppe
Ulrich Tamm
Copyright Year
2014
DOI
https://doi.org/10.1007/978-3-319-05479-7_3
