
1987 | Original Paper | Book Chapter

Essential Average Mutual Information

Author: Yaser S. Abu-Mostafa

Published in: Open Problems in Communication and Computation

Publisher: Springer New York


Consider two dependent random variables (S, C) and suppose that $$\hat{C}$$ is the optimal estimate of C when only S is known. I(S; C) is a measure of how much S tells us about C, and $$I(\hat{C}; C)$$ is a measure of how much our optimal estimate $$\hat{C}$$ tells us about C. What can we say about $$I(\hat{C}; C)$$ if we know, for example, that I(S; C) = 3 bits? The optimality of $$\hat{C}$$ suggests that $$I(\hat{C}; C)$$ should also be close to 3 bits. This is what we address in this problem.

Let (S, C) be jointly distributed according to p(s, c), where S takes values in {0, …, N−1} and C takes values in {0, …, M−1}. Let $$\hat{C}:\{0, \ldots, N-1\} \to \{0, \ldots, M-1\}$$ denote an arbitrary function of the outcomes of S. The problem is to estimate the numbers α(N, M) defined by

$$\alpha(N, M) = \inf_{p \,:\, I(S;C) > 0} \; \max_{\hat{C} = \hat{C}(S)} \left( \frac{I(\hat{C};C)}{I(S;C)} \right)$$

Since $$I(\hat{C};C) \leqslant I(S;C)$$ by the data processing inequality, α(N, M) ≤ 1. In fact, α(N, M) can be strictly less than 1, as the following example for α(3, 2) shows.
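The inner maximization in the definition of α(N, M) can be carried out by brute force for small N and M, since there are only M^N deterministic maps $$\hat{C}:\{0, \ldots, N-1\} \to \{0, \ldots, M-1\}$$. The following Python sketch (not from the paper; function names and the example distribution are illustrative) computes the ratio $$\max_{\hat{C}} I(\hat{C};C)/I(S;C)$$ for one fixed joint distribution p. Because α(N, M) is the infimum of this quantity over all valid p, the value returned for any particular p is only an upper bound on α(N, M).

```python
# Minimal sketch, assuming the definitions above: for a fixed joint p(s, c),
# compute max over deterministic estimators Chat = Chat(S) of I(Chat; C) / I(S; C).
# Any single p yields an upper bound on alpha(N, M), not its exact value.

import itertools
import numpy as np

def mutual_information(joint):
    """I(X; Y) in bits for a joint probability matrix joint[x, y]."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask])))

def best_estimator_ratio(p):
    """max_{Chat = Chat(S)} I(Chat; C) / I(S; C), by exhaustive search over
    all M^N deterministic maps Chat: {0,...,N-1} -> {0,...,M-1}."""
    N, M = p.shape
    i_sc = mutual_information(p)
    assert i_sc > 0, "the definition of alpha(N, M) requires I(S; C) > 0"
    best = 0.0
    for chat in itertools.product(range(M), repeat=N):
        # Joint distribution of (Chat(S), C): merge rows of p according to chat.
        q = np.zeros((M, M))
        for s in range(N):
            q[chat[s]] += p[s]
        best = max(best, mutual_information(q) / i_sc)
    return best

# Illustrative joint p(s, c) with N = 3, M = 2 (an arbitrary choice, not the
# distribution achieving alpha(3, 2)).
p = np.array([[0.20, 0.10],
              [0.10, 0.20],
              [0.15, 0.25]])
print(best_estimator_ratio(p))
```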

Metadata
Title
Essential Average Mutual Information
Author
Yaser S. Abu-Mostafa
Copyright Year
1987
Publisher
Springer New York
DOI
https://doi.org/10.1007/978-1-4612-4808-8_19