2024 | OriginalPaper | Chapter

Minimum Kernel Discrepancy Estimators

Abstract

For two decades, reproducing kernels and their associated discrepancies have facilitated elegant theoretical analyses in the setting of quasi-Monte Carlo. These same tools are now receiving interest in statistics and related fields, as criteria that can be used to select an appropriate statistical model for a given dataset. The focus of this article is on minimum kernel discrepancy estimators, whose use in statistical applications is reviewed, and a general theoretical framework for establishing their asymptotic properties is presented.
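A minimum kernel discrepancy estimator selects the parameter whose model output is closest, in maximum mean discrepancy (MMD), to the observed data. The following minimal sketch (not taken from the chapter; the Gaussian kernel, the V-statistic estimate of the squared MMD, the one-dimensional location model, and the grid search are all illustrative assumptions) shows the idea:

```python
import math
import random

def gaussian_kernel(x, y, lengthscale=1.0):
    # Reproducing kernel k(x, y) = exp(-(x - y)^2 / (2 * lengthscale^2))
    return math.exp(-((x - y) ** 2) / (2.0 * lengthscale ** 2))

def mmd_squared(xs, ys, kernel=gaussian_kernel):
    # V-statistic estimate of the squared maximum mean discrepancy:
    # MMD^2(P, Q) = E[k(X, X')] - 2 E[k(X, Y)] + E[k(Y, Y')]
    kxx = sum(kernel(a, b) for a in xs for b in xs) / len(xs) ** 2
    kyy = sum(kernel(a, b) for a in ys for b in ys) / len(ys) ** 2
    kxy = sum(kernel(a, b) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx - 2.0 * kxy + kyy

# Hypothetical example: estimate a location parameter theta by choosing
# the value whose model samples minimise MMD^2 to the data.
random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(100)]   # observed data
base = [random.gauss(0.0, 1.0) for _ in range(100)]   # model noise (fixed)

def objective(theta):
    model_samples = [theta + z for z in base]          # generative model
    return mmd_squared(data, model_samples)

grid = [i * 0.1 for i in range(0, 41)]                 # theta in [0, 4]
theta_hat = min(grid, key=objective)
print(theta_hat)  # should lie close to the true location 2.0
```

In practice the grid search would be replaced by gradient-based optimisation (the squared MMD is differentiable in theta when the kernel is smooth), and a U-statistic rather than a V-statistic is often preferred to remove the diagonal bias.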


Footnotes
1. i.e. additional levels of differentiability in directions that are axis-aligned.
Metadata
Title: Minimum Kernel Discrepancy Estimators
Author: Chris J. Oates
Copyright Year: 2024
DOI: https://doi.org/10.1007/978-3-031-59762-6_6
