2017 | Original Paper | Book Chapter

Entropy and Thinning of Discrete Random Variables

Author: Oliver Johnson

Published in: Convexity and Concentration

Publisher: Springer New York

Abstract

We describe five types of results concerning information and concentration of discrete random variables, and relationships between them, motivated by their counterparts in the continuous case. The results we consider are: information-theoretic approaches to Poisson approximation; the maximum entropy property of the Poisson distribution; discrete concentration (Poincaré and logarithmic Sobolev) inequalities; monotonicity of entropy; and concavity of entropy in the Shepp–Olkin regime.
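
For orientation, the thinning operation of the title is Rényi's standard one; we restate it here for the reader's convenience (a textbook definition, not an excerpt from the chapter). For a random variable $X$ taking values in the non-negative integers and a parameter $\alpha \in [0,1]$, the $\alpha$-thinning of $X$ is

\[
T_\alpha X = \sum_{i=1}^{X} B_i, \qquad B_1, B_2, \ldots \overset{\mathrm{i.i.d.}}{\sim} \mathrm{Bernoulli}(\alpha), \text{ independent of } X,
\]

so that each of the $X$ unit points is retained independently with probability $\alpha$. In this notation the law of thin numbers, one of the Poisson approximation results the abstract refers to, states that for i.i.d. non-negative integer-valued $X_1, \ldots, X_n$ with mean $\mu$, the thinned sum $T_{1/n}(X_1 + \cdots + X_n)$ converges in distribution to $\mathrm{Poisson}(\mu)$, mirroring the law of large numbers statement $\tfrac{1}{n}(X_1 + \cdots + X_n) \to \mu$.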

Metadata
Title
Entropy and Thinning of Discrete Random Variables
Author
Oliver Johnson
Copyright Year
2017
Publisher
Springer New York
DOI
https://doi.org/10.1007/978-1-4939-7005-6_2
